## Usage of activations
Activations can either be used through an `Activation` layer, or through the `activation` argument supported by all forward layers:
```python
from keras.layers import Activation, Dense

model.add(Dense(64))
model.add(Activation('tanh'))
```
This is equivalent to:
```python
model.add(Dense(64, activation='tanh'))
```
You can also pass an element-wise TensorFlow/Theano/CNTK function as an activation:
```python
from keras import backend as K

model.add(Dense(64, activation=K.tanh))
model.add(Activation(K.tanh))
```
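The equivalence of the two usage forms above can be sketched outside of Keras. This is a minimal numpy illustration, not the Keras implementation: the `dense` helper is hypothetical, standing in for a `Dense` layer's `x @ W + b` computation, with the activation applied element-wise either afterwards or inside the layer.

```python
import numpy as np

def dense(x, W, b, activation=None):
    """Hypothetical stand-in for a Dense layer: x @ W + b,
    with an optional fused element-wise activation."""
    z = x @ W + b
    return activation(z) if activation is not None else z

rng = np.random.default_rng(0)
x = rng.standard_normal((3, 5))   # batch of 3 samples, 5 features
W = rng.standard_normal((5, 4))   # weights for 4 output units
b = np.zeros(4)

# Form 1: linear layer, then a separate element-wise activation
# (Dense(64) followed by Activation('tanh')).
out_separate = np.tanh(dense(x, W, b))

# Form 2: activation fused into the layer (Dense(64, activation='tanh')).
out_fused = dense(x, W, b, activation=np.tanh)

assert np.allclose(out_separate, out_fused)
```

Both forms compute the same tensor; the choice is purely a matter of model-definition style.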
## Available activations
### elu

```python
elu(x, alpha=1.0)
```
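The standard ELU formula is `x` for positive inputs and `alpha * (exp(x) - 1)` for non-positive ones. A numpy sketch of that formula (the function name and use of `np.expm1` are my choices, not the Keras backend code):

```python
import numpy as np

def elu(x, alpha=1.0):
    """ELU: x for x > 0, alpha * (exp(x) - 1) for x <= 0."""
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, alpha * np.expm1(x))
```

For large negative inputs the output saturates at `-alpha`, which keeps the mean activation closer to zero than plain ReLU.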
### selu

```python
selu(x)
```

Scaled Exponential Linear Unit (Klambauer et al., 2017).

Arguments

- x: A tensor or variable to compute the activation function for.

References

- Self-Normalizing Neural Networks (Klambauer et al., 2017)
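SELU is ELU scaled by fixed constants chosen in Klambauer et al., 2017 so that activations self-normalize toward zero mean and unit variance. A numpy sketch using those published constants (the function name is mine; this is not the Keras backend code):

```python
import numpy as np

# Constants from Klambauer et al., 2017 ("Self-Normalizing Neural Networks").
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(x):
    """SELU: scale * x for x > 0, scale * alpha * (exp(x) - 1) for x <= 0."""
    x = np.asarray(x, dtype=float)
    return SCALE * np.where(x > 0, x, ALPHA * np.expm1(x))
```

The self-normalizing property holds under the assumptions in the paper (e.g. appropriately initialized weights), not for arbitrary networks.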
### softplus

```python
softplus(x)
```
### softsign

```python
softsign(x)
```
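The standard formulas are `softplus(x) = log(1 + exp(x))` and `softsign(x) = x / (1 + |x|)`. A numpy sketch of both (function names are mine, not the Keras backend code):

```python
import numpy as np

def softplus(x):
    """softplus(x) = log(1 + exp(x)): a smooth approximation of ReLU."""
    return np.log1p(np.exp(np.asarray(x, dtype=float)))

def softsign(x):
    """softsign(x) = x / (1 + |x|): squashes to (-1, 1), like a softer tanh."""
    x = np.asarray(x, dtype=float)
    return x / (1.0 + np.abs(x))
```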
### relu

```python
relu(x, alpha=0.0, max_value=None)
```
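The two extra parameters generalize plain ReLU: `alpha` gives the slope for negative inputs (a leak), and `max_value` caps the output. A numpy sketch of that behavior (the function name is mine; this is not the Keras backend code):

```python
import numpy as np

def relu(x, alpha=0.0, max_value=None):
    """ReLU with an optional negative-side slope (alpha) and an
    optional saturation ceiling (max_value)."""
    x = np.asarray(x, dtype=float)
    out = np.where(x >= 0, x, alpha * x)
    if max_value is not None:
        out = np.minimum(out, max_value)
    return out
```

With the defaults (`alpha=0.0`, `max_value=None`) this reduces to the usual `max(x, 0)`.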
### tanh

```python
tanh(x)
```

### sigmoid

```python
sigmoid(x)
```

### hard_sigmoid

```python
hard_sigmoid(x)
```
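`hard_sigmoid` is a piecewise-linear approximation of the logistic sigmoid, which is cheaper to compute. A numpy sketch of both for comparison, assuming the `0.2 * x + 0.5` segment clipped to `[0, 1]` that the Keras backends use (function names are mine):

```python
import numpy as np

def sigmoid(x):
    """Exact logistic sigmoid, for comparison."""
    return 1.0 / (1.0 + np.exp(-np.asarray(x, dtype=float)))

def hard_sigmoid(x):
    """Piecewise-linear approximation of sigmoid: 0 for x < -2.5,
    1 for x > 2.5, and 0.2 * x + 0.5 in between."""
    x = np.asarray(x, dtype=float)
    return np.clip(0.2 * x + 0.5, 0.0, 1.0)
```

The two agree at `x = 0` (both give 0.5) and stay close over the central region, while `hard_sigmoid` saturates exactly at 0 and 1 instead of approaching them asymptotically.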
### linear

```python
linear(x)
```
### softmax

```python
softmax(x, axis=-1)
```
Softmax activation function.
Arguments
- x: Input tensor.
- axis: Integer, axis along which the softmax normalization is applied.
Returns
Tensor, output of softmax transformation.
Raises

- ValueError: In case `dim(x) == 1`.
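Softmax exponentiates the input and normalizes so that values along `axis` sum to 1. A numpy sketch, including the usual max-shift for numerical stability and a check that mirrors the documented `dim(x) == 1` error case (the function name and error message are my choices, not the Keras code):

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along `axis`.

    Raises ValueError for 1-D input, mirroring the documented
    dim(x) == 1 case."""
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        raise ValueError("Cannot apply softmax to a 1-D tensor.")
    z = x - x.max(axis=axis, keepdims=True)  # shift so exp() cannot overflow
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)
```

For a batch of shape `(batch, classes)` the default `axis=-1` normalizes each row into a probability distribution.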
## On "Advanced Activations"

Activations that are more complex than a simple TensorFlow/Theano/CNTK function (e.g. learnable activations, which maintain a state) are available as Advanced Activation layers. These can be found in the module `keras.layers.advanced_activations`, and include `PReLU` and `LeakyReLU`.
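The "state" these layers maintain is typically a learnable parameter. For example, PReLU applies the leaky-ReLU formula but learns the negative-side slope during training. A numpy sketch of the formula itself, with `alpha` passed explicitly rather than learned (the function name is mine; this is not the Keras layer):

```python
import numpy as np

def prelu(x, alpha):
    """PReLU formula: x for x >= 0, alpha * x otherwise.
    In the Keras PReLU layer, alpha is a trainable weight;
    LeakyReLU uses the same formula with a fixed alpha."""
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0, x, alpha * x)
```

This is why PReLU must be a layer rather than a plain activation function: the layer owns the `alpha` weights and updates them through backpropagation.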