actF: Activation functions and their first order derivatives


Description

A collection of activation functions and their first order derivatives used in deep neural networks. Functions whose names end with an underscore (e.g., sigmoid_) return the first order derivative of the corresponding activation.

Usage

sigmoid(x)

sigmoid_(x)

tanh(x)

tanh_(x)

relu(x)

relu_(x)

prelu(x, a = 0.2)

prelu_(x, a = 0.2)

elu(x, a = 1)

elu_(x, a = 1)

celu(x, a = 1)

celu_(x, a = 1)
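
A quick illustration of calling these functions (this example assumes the dnnet package is installed and exports the functions listed above):

library(dnnet)

x <- seq(-2, 2, by = 0.5)

sigmoid(x)         # activation values in (0, 1)
sigmoid_(x)        # first order derivative of sigmoid at x
relu(x)            # max(0, x), elementwise
prelu(x, a = 0.2)  # leaky ReLU with slope a = 0.2 for negative inputs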

Arguments

x

Input of the activation function.

a

The a (alpha) parameter in prelu, elu, and celu (and their derivatives): a pre-specified numeric value less than or equal to 1.

Details

Sigmoid Function: sigmoid(x) = 1/(1+exp(-x))

Hyperbolic Tangent Function: tanh(x) = (exp(x) - exp(-x))/(exp(x) + exp(-x))

Rectified Linear Units: relu(x) = max(0, x)

Leaky ReLU: prelu(x, a) = max(x*a, x), (a<1)

Exponential Linear Units: elu(x, alpha) = max(alpha*(exp(x)-1), x), (alpha<=1)

Continuously Differentiable Exponential Linear Units (CELU): celu(x, alpha) = max(alpha*(exp(x/alpha)-1), x)
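
For reference, a minimal sketch of the formulas above and their first order derivatives, written as standalone R functions (the *_ref names are illustrative and are not the package's internal definitions):

sigmoid_ref  <- function(x) 1 / (1 + exp(-x))
sigmoid_ref_ <- function(x) sigmoid_ref(x) * (1 - sigmoid_ref(x))

relu_ref  <- function(x) pmax(0, x)
relu_ref_ <- function(x) as.numeric(x > 0)

prelu_ref  <- function(x, a = 0.2) ifelse(x > 0, x, a * x)
prelu_ref_ <- function(x, a = 0.2) ifelse(x > 0, 1, a)

elu_ref  <- function(x, a = 1) ifelse(x > 0, x, a * (exp(x) - 1))
elu_ref_ <- function(x, a = 1) ifelse(x > 0, 1, a * exp(x))

celu_ref  <- function(x, a = 1) ifelse(x > 0, x, a * (exp(x / a) - 1))
celu_ref_ <- function(x, a = 1) ifelse(x > 0, 1, exp(x / a))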

Value

Returns the value of the activation function (or of its first order derivative, for the underscore-suffixed functions) evaluated elementwise at the input x.

Methods (by generic)

sigmoid: Sigmoid function.

sigmoid_: First order derivative of the sigmoid function.

tanh: Hyperbolic tangent function.

tanh_: First order derivative of the tanh function.

relu: ReLU.

relu_: First order derivative of ReLU.

prelu: Leaky ReLU.

prelu_: First order derivative of leaky ReLU.

elu: ELU.

elu_: First order derivative of the ELU function.

celu: CELU.

celu_: First order derivative of the CELU function.
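
The derivative functions can be checked against a central finite-difference approximation; the sketch below assumes dnnet is loaded and compares sigmoid_ with the numerical derivative of sigmoid:

x <- seq(-3, 3, by = 0.25)
h <- 1e-6
numeric_grad <- (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
max(abs(sigmoid_(x) - numeric_grad))  # should be close to zero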

See Also

nn.regresser
nn.classifier

