actF: Activation functions and their first-order derivatives

actF R Documentation

Activation functions and their first-order derivatives

Description

A collection of activation functions and their first-order derivatives, as used in deep neural networks.

Usage

sigmoid(x)

sigmoid_(x)

tanh(x)

tanh_(x)

relu(x)

relu_(x)

prelu(x, a = 0.2)

prelu_(x, a = 0.2)

elu(x, a = 1)

elu_(x, a = 1)

celu(x, a = 1)

celu_(x, a = 1)
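
The sketch below is illustrative only and is not taken from the package's own examples; it assumes the functions listed above are exported by the deepTL package (SkadiEye/deepTL) and uses an arbitrary numeric vector as input.

library(deepTL)                 # assumed package providing these functions

x <- c(-2, -0.5, 0, 0.5, 2)     # arbitrary demonstration inputs
sigmoid(x)                      # logistic values in (0, 1)
relu(x)                         # 0 for negative inputs, x otherwise
prelu(x, a = 0.2)               # leaky ReLU with negative-side slope 0.2
prelu(x, a = 0.01)              # smaller a flattens the negative side
elu(x, a = 1)                   # smooth exponential behaviour for x <= 0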

Arguments

x

Input to the activation function

a

The parameter a (alpha) appearing in prelu, elu, and celu; see Details

Details

Sigmoid Function: sigmoid(x) = 1/(1 + exp(-x))

Hyperbolic Tangent Function: tanh(x) = (exp(x) - exp(-x))/(exp(x) + exp(-x))

Rectified Linear Units: relu(x) = max(0, x)

Leaky ReLU: prelu(x, a) = max(a*x, x), for 0 <= a < 1

Exponential Linear Units: elu(x, alpha) = x for x > 0, and alpha*(exp(x) - 1) for x <= 0, with alpha > 0

Continuously Differentiable Exponential Linear Units: celu(x, alpha) = max(0, x) + min(0, alpha*(exp(x/alpha) - 1)), with alpha > 0
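
As a reading aid, here is a minimal base-R sketch that writes the definitions above out piecewise; it mirrors the formulas only and is not necessarily how the package implements them.

sigmoid_ref <- function(x) 1 / (1 + exp(-x))                           # logistic function
tanh_ref    <- function(x) (exp(x) - exp(-x)) / (exp(x) + exp(-x))     # hyperbolic tangent
relu_ref    <- function(x) pmax(0, x)                                  # rectified linear unit
prelu_ref   <- function(x, a = 0.2) ifelse(x > 0, x, a * x)            # leaky ReLU, 0 <= a < 1
elu_ref     <- function(x, a = 1) ifelse(x > 0, x, a * (exp(x) - 1))   # exponential linear unit
celu_ref    <- function(x, a = 1) pmax(0, x) + pmin(0, a * (exp(x / a) - 1))  # CELU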

Value

The value of the activation function evaluated at x; for the functions whose names end in an underscore (e.g. sigmoid_), the first-order derivative evaluated at x.

Methods (by generic)

sigmoid: Sigmoid function.

sigmoid_: First-order derivative of the sigmoid function.

tanh: Hyperbolic tangent function.

tanh_: First-order derivative of the tanh function.

relu: ReLU.

relu_: First-order derivative of ReLU.

prelu: Leaky ReLU.

prelu_: First-order derivative of leaky ReLU.

elu: ELU.

elu_: First-order derivative of the ELU function.

celu: CELU.

celu_: First-order derivative of the CELU function.
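
A quick, informal way to check that each underscore function matches the documented derivative is a central finite difference. The sketch below assumes the exported functions behave as documented and avoids x = 0, where relu and prelu are not differentiable.

h <- 1e-6
x <- c(-1.5, -0.3, 0.7, 2)      # test points away from 0
max(abs(sigmoid_(x) - (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)))           # ~0 if consistent
max(abs(relu_(x)    - (relu(x + h)    - relu(x - h))    / (2 * h)))           # ~0 if consistent
max(abs(elu_(x, a = 1) - (elu(x + h, a = 1) - elu(x - h, a = 1)) / (2 * h)))  # ~0 if consistent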

See Also

dnnet

