tf.nn.elu
elu(features, name=None)

Defined in tensorflow/python/ops/gen_nn_ops.py.
See the guide: Neural Network > Activation Functions
Computes the exponential linear function: exp(features) - 1 if features < 0, features otherwise.
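The piecewise formula above can be sketched in plain NumPy (a hypothetical reimplementation for illustration, not the TensorFlow op itself):

```python
import numpy as np

def elu(features):
    """ELU: exp(x) - 1 where x < 0, x elsewhere."""
    features = np.asarray(features, dtype=np.float64)
    return np.where(features < 0, np.exp(features) - 1.0, features)

# Negative inputs saturate smoothly toward -1; non-negative inputs pass through.
print(elu([-1.0, 0.0, 2.0]))  # approximately [-0.6321, 0.0, 2.0]
```

Unlike ReLU, the output for negative inputs is bounded below by -1 and has a nonzero gradient, which is what the referenced paper argues speeds up learning.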
See Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs).
Args:
- features: A Tensor. Must be one of the following types: half, float32, float64.
- name: A name for the operation (optional).
Returns:
A Tensor. Has the same type as features.
© 2017 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/api_docs/python/tf/nn/elu