Description
Each activation module has a forward method that takes in a batch of pre-activations Z and returns a batch of activations A.
Each activation module has a backward method that takes in dLdA and returns dLdZ, with the exception of SoftMax, whose backward method assumes dLdZ is passed in directly.
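The SoftMax exception above can be illustrated with a minimal sketch: because the gradient dLdZ is typically computed jointly with a cross-entropy loss, backward simply passes it through. This is an illustrative assumption about the design, not neuralnetr's actual implementation; the internal caching of A is also assumed.

```r
library(R6)

# Minimal sketch of a SoftMax activation module (assumed design).
SoftMax <- R6Class("SoftMax",
  public = list(
    A = NULL,  # the activation vector, cached by forward()

    # forward: softmax with max-subtraction for numerical stability
    forward = function(Z) {
      expZ <- exp(Z - max(Z))
      self$A <- expZ / sum(expZ)
      self$A
    },

    # backward: dLdZ is assumed to be passed in precomputed,
    # so it is returned unchanged
    backward = function(dLdZ) {
      dLdZ
    }
  )
)
```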
Super class

neuralnetr::ClassModule -> ReLU
Public fields

A: the activation vector.
Methods

Method forward():

Usage: ReLU$forward(Z)

Arguments:

Z: a vector of pre-activations.

Returns: a vector of activations.
Method backward():

Usage: ReLU$backward(dLdA)

Arguments:

dLdA: a vector of gradients.

Returns: a vector of gradients (dLdZ).
Method clone():

The objects of this class are cloneable with this method.

Usage: ReLU$clone(deep = FALSE)

Arguments:

deep: whether to make a deep clone.
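The methods above can be sketched as a small R6 class. This is a minimal illustration of the documented interface, assuming forward caches the pre-activations Z (an internal detail not stated on this page) so that backward can gate the incoming gradient; it is not neuralnetr's actual source.

```r
library(R6)

# Minimal sketch of the ReLU activation module's interface.
ReLU <- R6Class("ReLU",
  public = list(
    A = NULL,  # the activation vector, cached by forward()
    Z = NULL,  # cached pre-activations (assumed internal state)

    # forward: A = max(0, Z), elementwise
    forward = function(Z) {
      self$Z <- Z
      self$A <- pmax(Z, 0)
      self$A
    },

    # backward: dLdZ = dLdA * 1[Z > 0]
    backward = function(dLdA) {
      dLdA * (self$Z > 0)
    }
  )
)
```

For example, `ReLU$new()$forward(c(-1, 0, 2))` zeroes the negative entry, and a subsequent backward call zeroes the gradient at every position where Z was not positive.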
See Also

Other activation: Sigmoid, SoftMax, Tanh