activation    R Documentation
Description

Different types of activation functions and their corresponding derivatives.
Usage

sigmoid(x)
elu(x)
relu(x)
lrelu(x)
idu(x)
dsigmoid(y)
delu(y)
drelu(y)
dlrelu(y)
dtanh(y) # the activation function tanh(x) is already available in base R
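These are ordinary R functions, so they can be called directly on numeric vectors. The following spot checks (a minimal sketch, not part of the package usage) follow from the formulas listed under Details:

sigmoid(0)       # 0.5, since 1/(1 + exp(0)) = 1/2
relu(c(-1, 2))   # 0 2
lrelu(-1)        # -0.1, i.e. 0.1 * x for negative x
elu(-1)          # exp(-1) - 1, about -0.632
idu(3)           # 3, the identity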
Arguments

x: input of the activation function
y: input of the derivative of the activation function
Value

Each function returns the value of the activation function (e.g. sigmoid, relu) or of its derivative (e.g. dsigmoid, drelu), evaluated elementwise at its input.
Details

An activation function is applied elementwise to x and returns a matrix of the same size as x. The formula for each activation function is listed below (a usage sketch follows the list):
sigmoid: returns 1/(1 + exp(-x))
elu: returns x for x > 0 and exp(x) - 1 for x < 0
relu: returns x for x > 0 and 0 for x < 0
lrelu: returns x for x > 0 and 0.1*x for x < 0
tanh: returns tanh(x)
idu: returns x (the identity function)
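The sketch below (an illustration, not taken from the package documentation) applies an activation to a matrix and compares a derivative function against a central-difference estimate; it assumes the derivative functions take the activation output y as input, as the argument name suggests.

library(dnn)                          # assumption: these functions come from the dnn package
x <- matrix(rnorm(6), nrow = 2)
identical(dim(sigmoid(x)), dim(x))    # TRUE: the output has the same size as x
y   <- sigmoid(x)
num <- (sigmoid(x + 1e-6) - sigmoid(x - 1e-6)) / 2e-6   # numerical derivative in x
max(abs(dsigmoid(y) - num))           # small only if dsigmoid(y) equals d/dx sigmoid(x)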
Author(s)

Bingshu E. Chen
See Also

bwdNN, fwdNN, dNNmodel, optimizerSGD, optimizerNAG
Examples

# Specify a dnn model with a user-defined activation function in layer 2.
softmax  = function(x) { log(1 + exp(x)) }   # y = log(1 + exp(x))
dsoftmax = function(y) { sigmoid(y) }        # d/dx log(1 + exp(x)) = exp(x)/(1 + exp(x)) = sigmoid(x)
model = dNNmodel(units = c(8, 6, 1), activation = c('relu', 'softmax', 'sigmoid'),
                 input_shape = c(3))
print(model)
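As a quick sanity check before using a custom pair in a model (a sketch, not part of the package examples), the user-defined derivative can be compared with a numerical one; here the check evaluates dsoftmax at the pre-activation input x, where sigmoid(x) is exactly d/dx log(1 + exp(x)):

x   <- seq(-2, 2, by = 0.5)
num <- (softmax(x + 1e-6) - softmax(x - 1e-6)) / 2e-6   # central-difference derivative
max(abs(dsoftmax(x) - num))   # near 0: sigmoid(x) = exp(x)/(1 + exp(x)) is the exact derivative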