Description
Activator objects and nonlinear activation functions
Usage

elu_activator
exp_activator
identity_activator
relu_activator
sigmoid_activator
Format

An object of class activator of length 4, containing the name of the activation function (a character vector) and three functions relating to neural network nonlinearities:

f: the activation function (nonlinearity) itself

grad: the activation function's gradient with respect to x

initialize_activator_biases: a function used internally to initialize the biases of a network's output layer
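As an illustration of this four-element structure, an activator-like object for the identity function might look like the sketch below. The element names follow the Format description above, but the hand-rolled constructor and the bias initializer's behavior are assumptions for illustration, not the package's internals.

sketch_activator <- structure(
  list(
    name = "identity",
    f = function(x) x,                                      # the nonlinearity itself
    grad = function(x) array(1, dim(x)),                    # df/dx = 1 everywhere
    initialize_activator_biases = function(y) colMeans(y)   # assumed behavior
  ),
  class = "activator"
)

x <- matrix(-2:2, ncol = 1)
sketch_activator$f(x)     # returns x unchanged
sketch_activator$grad(x)  # a matrix of ones with the same shape as x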
Details

The following activators/activation functions are currently included (illustrative plain-R versions of the formulas follow the list):

elu: "exponential linear unit", f(x) = x when x > 0 and f(x) = exp(x) - 1 otherwise. See Clevert et al. (2015).

exp: "exponential unit", f(x) = exp(x); inverts a log link.

identity: the identity function, f(x) = x.

relu: "rectified linear unit", f(x) = x when x > 0 and f(x) = 0 otherwise. See Nair and Hinton (2010).

sigmoid: the sigmoid (logistic) function, f(x) = 1/(1 + exp(-x)).
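The formulas above are straightforward to reproduce in plain R. The definitions below are illustrative re-implementations of the listed nonlinearities, not the package's own (partly C++-backed) versions:

elu_f      <- function(x) ifelse(x > 0, x, exp(x) - 1)  # Clevert et al. (2015)
exp_f      <- function(x) exp(x)
identity_f <- function(x) x
relu_f     <- function(x) pmax(x, 0)                    # Nair and Hinton (2010)
sigmoid_f  <- function(x) 1/(1 + exp(-x))

curve(elu_f(x), from = -4, to = 4)  # should match the elu panel in the Examples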
The sigmoid activation function used to be the most common in neural networks, and it acts as a logit link for binomial error functions. Most modern networks use relus instead, although Clevert et al. (2015) have shown that elus can work better, in part because they can produce negative values.
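Concretely, the sigmoid inverts the logit, mapping a linear predictor back to a probability; R's built-in plogis computes the same function, so the identity is easy to check:

eta <- c(-2, 0, 2)            # linear predictor on the logit scale
p   <- 1/(1 + exp(-eta))      # sigmoid
all.equal(p, plogis(eta))     # TRUE: the sigmoid is the inverse logit
qlogis(p)                     # the logit recovers eta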
Note

The C++ code used to speed up some activation functions (relu and elu) assumes that they are passed matrix objects rather than vectors.
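In practice this means a plain vector should be wrapped in matrix() before being passed to those activators. A sketch, using the relu_activator object from the Examples below:

v <- seq(-1, 1, by = 0.5)
## relu_activator$f(v)                  # v is a vector; the C++ code expects a matrix
relu_activator$f(matrix(v, ncol = 1))  # pass a one-column matrix instead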
References

Clevert, Djork-Arné, Thomas Unterthiner, and Sepp Hochreiter. "Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)." arXiv preprint arXiv:1511.07289 (2015).

Nair, Vinod, and Geoffrey E. Hinton. "Rectified linear units improve restricted Boltzmann machines." In Proceedings of the 27th International Conference on Machine Learning (ICML-10), pp. 807-814. 2010.
Examples

x = matrix(seq(-4, 4, .01), ncol = 1)
par(mfrow = c(2, 3))
plot(x = x, y = elu_activator$f(x), type = "l", main = "elu")
plot(x = x, y = exp_activator$f(x), type = "l", main = "exp")
plot(x = x, y = identity_activator$f(x), type = "l", main = "identity")
plot(x = x, y = relu_activator$f(x), type = "l", main = "sigmoid")
abline(h = 0, col = "gray")