activator: Activator objects and nonlinear activation functions


Description

Activator objects and nonlinear activation functions

Usage

elu_activator
exp_activator
identity_activator
relu_activator
sigmoid_activator

Format

An object of class activator of length 4, containing the name of the activation function (a character vector) and three functions relating to neural network nonlinearities. The activation function itself is accessed as $f (see the Examples below).
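One quick way to see an activator's components is to print its structure. The sketch below assumes the package is attached as mistnet2; only the $f accessor appears in this page's Examples, and the remaining component names will show up in the str() output.

## Inspect the components of an activator object
library(mistnet2)
str(sigmoid_activator)

## Apply the stored activation function to a small matrix of inputs
sigmoid_activator$f(matrix(c(-2, 0, 2), ncol = 1))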

Details

The following activators/activation functions are currently included: elu, exp, identity, relu, and sigmoid (available as the elu_activator, exp_activator, identity_activator, relu_activator, and sigmoid_activator objects used in the Examples below).

The sigmoid activation function used to be the most common in neural networks, and corresponds to a logit link for binomial error functions. Most modern networks use relus (Nair and Hinton 2010), although Clevert et al. (2015) have shown that elus can work better because they can produce negative values.
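For reference, stand-alone R definitions of these nonlinearities might look like the sketch below. These are illustrative only, not the package's own implementations (relu and elu are implemented in C++; see the Note), and the elu is written with its scale parameter alpha fixed at 1, following Clevert et al. (2015).

## Illustrative definitions of the nonlinearities discussed above
sigmoid <- function(x) 1 / (1 + exp(-x))            # inverse logit
relu    <- function(x) pmax(x, 0)                    # rectified linear unit
elu     <- function(x) ifelse(x > 0, x, exp(x) - 1)  # exponential linear unit (alpha = 1)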

Note

The C++ code used to speed up some of the activation functions (relu and elu) assumes that its inputs are matrix objects rather than vectors.
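In practice, this means a plain numeric vector should be wrapped in matrix() before being passed to these functions, as in the short sketch below (the single-column layout is an arbitrary choice for illustration).

## relu and elu expect matrix inputs, so wrap vectors with matrix() first
x <- matrix(c(-1.5, 0, 2.5), ncol = 1)
relu_activator$f(x)
elu_activator$f(x)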

References

Clevert, Djork-Arné, Thomas Unterthiner, and Sepp Hochreiter. "Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)." arXiv preprint arXiv:1511.07289 (2015).

Nair, Vinod, and Geoffrey E. Hinton. "Rectified Linear Units Improve Restricted Boltzmann Machines." In Proceedings of the 27th International Conference on Machine Learning (ICML-10), pp. 807-814. 2010.

Examples

library(mistnet2)

## Plot each activation function over a grid of input values
x = matrix(seq(-4, 4, .01), ncol = 1)
par(mfrow = c(2, 3))
plot(x = x, y = elu_activator$f(x), type = "l", main = "elu")
plot(x = x, y = exp_activator$f(x), type = "l", main = "exp")
plot(x = x, y = identity_activator$f(x), type = "l", main = "identity")
plot(x = x, y = relu_activator$f(x), type = "l", main = "relu")
plot(x = x, y = sigmoid_activator$f(x), type = "l", main = "sigmoid")
abline(h = 0, col = "gray")
