Description

A collection of activation functions and their first-order derivatives used in deep neural networks.
Arguments

x       Input of the activation function.

a       The a (or alpha) parameter of the function; for the leaky ReLU, a pre-specified numeric value less than 1.

alpha   A pre-specified numeric value less than or equal to 1.
Details

Sigmoid function: sigmoid(x) = 1 / (1 + exp(-x))

Hyperbolic tangent function: tanh(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x))

Rectified linear units: relu(x) = max(0, x)

Leaky ReLU: prelu(x, a) = max(x * a, x), with a < 1

Exponential linear units: elu(x, alpha) = max(alpha * (exp(x) - 1), x), with alpha <= 1

Continuously differentiable exponential linear units: celu(x, alpha) = max(alpha * (exp(x / alpha) - 1), x)
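
The formulas above translate directly into a few lines of base R. The following is a minimal sketch assuming vectorised numeric input; the function names (sigmoid, relu, prelu, elu, celu) are illustrative and are not necessarily the package's exported API.

# Minimal base-R sketch of the documented formulas (hypothetical names,
# not necessarily the package's exported functions).
sigmoid <- function(x) 1 / (1 + exp(-x))

tanh_fun <- function(x) (exp(x) - exp(-x)) / (exp(x) + exp(-x))   # same value as base tanh(x)

relu <- function(x) pmax(0, x)

prelu <- function(x, a) pmax(x * a, x)                            # leaky ReLU, a < 1

elu <- function(x, alpha = 1) pmax(alpha * (exp(x) - 1), x)       # alpha <= 1

celu <- function(x, alpha = 1) pmax(alpha * (exp(x / alpha) - 1), x)

# Example: sigmoid(0) is 0.5; relu(c(-1, 0, 2)) is c(0, 0, 2).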
Value

Returns the value after activation.
Methods (by generic)

Sigmoid function.
First-order derivative of the sigmoid function.
Tanh function.
First-order derivative of the tanh function.
ReLU.
First-order derivative of the ReLU.
Leaky ReLU.
First-order derivative of the leaky ReLU.
ELU.
First-order derivative of the ELU.
CELU.
First-order derivative of the CELU.
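
For orientation, here is a sketch of the corresponding first-order derivatives, derived from the standard piecewise reading of the formulas in Details (identity on the positive branch). The *_grad names are hypothetical, and the package's own derivative methods may differ in naming and in edge-case handling, e.g. at x = 0.

# Hypothetical first-order derivatives matching the activations sketched above.
sigmoid_grad <- function(x) { s <- 1 / (1 + exp(-x)); s * (1 - s) }

tanh_grad <- function(x) 1 - tanh(x)^2

relu_grad <- function(x) as.numeric(x > 0)                 # derivative taken as 0 at x = 0

prelu_grad <- function(x, a) ifelse(x > 0, 1, a)

elu_grad <- function(x, alpha = 1) ifelse(x > 0, 1, alpha * exp(x))

celu_grad <- function(x, alpha = 1) ifelse(x > 0, 1, exp(x / alpha))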