Description
Activation functions can either be used through layer_activation(), or through the activation argument supported by all forward layers.
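For example, both of the following build the same network; this is a minimal sketch assuming the keras package is loaded, and the layer sizes and input shape are purely illustrative:

library(keras)

# Option 1: pass the activation via the activation argument of a forward layer
model <- keras_model_sequential() %>%
  layer_dense(units = 32, input_shape = c(784), activation = "relu") %>%
  layer_dense(units = 10, activation = "softmax")

# Option 2: add the activation as a separate layer with layer_activation()
model <- keras_model_sequential() %>%
  layer_dense(units = 32, input_shape = c(784)) %>%
  layer_activation("relu") %>%
  layer_dense(units = 10) %>%
  layer_activation("softmax")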
Usage

activation_relu(x, alpha = 0, max_value = NULL, threshold = 0)
activation_elu(x, alpha = 1)
activation_selu(x)
activation_hard_sigmoid(x)
activation_linear(x)
activation_sigmoid(x)
activation_softmax(x, axis = -1)
activation_softplus(x)
activation_softsign(x)
activation_tanh(x)
activation_exponential(x)
Arguments

x          Tensor
alpha      Alpha value (for activation_relu(), the slope for values below the threshold; for activation_elu(), the scale of the negative section)
max_value  Max value (saturation threshold) for activation_relu()
threshold  Threshold value for thresholded activation
axis       Integer, axis along which the softmax normalization is applied
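The activation functions can also be applied directly to tensors. A minimal sketch, assuming a TensorFlow backend and using k_constant() only to build example inputs:

library(keras)

x <- k_constant(c(-3, -1, 0, 2, 5))

# Standard ReLU: negative values are set to 0
activation_relu(x)

# Leaky slope below the threshold via alpha, and a ceiling via max_value
activation_relu(x, alpha = 0.1, max_value = 4)

# Softmax normalization along the last axis of a 2-d tensor
m <- k_constant(matrix(c(1, 2, 3, 4, 5, 6), nrow = 2, byrow = TRUE))
activation_softmax(m, axis = -1)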
Details

activation_selu() is to be used together with the initialization "lecun_normal" and with the dropout variant "AlphaDropout".
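A minimal sketch of such a self-normalizing stack; the layer sizes, input shape, and dropout rate are illustrative:

library(keras)

# Pair selu with the "lecun_normal" initializer and alpha dropout
model <- keras_model_sequential() %>%
  layer_dense(units = 64, input_shape = c(20),
              activation = "selu",
              kernel_initializer = "lecun_normal") %>%
  layer_alpha_dropout(rate = 0.1) %>%
  layer_dense(units = 1)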
Value

Tensor with the same shape and dtype as x.
References

activation_selu(): Klambauer et al. (2017), Self-Normalizing Neural Networks