Description
Each activation module has a forward method that takes in a batch of pre-activations Z and returns a batch of activations A.
Each activation module has a backward method that takes in dLdA and returns dLdZ, with the exception of SoftMax, where we assume dLdZ is passed in.
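A minimal sketch of the standard sigmoid computation this contract implies (the function names sigmoid_forward and sigmoid_backward are illustrative only and are not part of the package; the package's actual implementation may differ):

# forward: A = 1 / (1 + exp(-Z)), element-wise over the batch
sigmoid_forward <- function(Z) {
  1 / (1 + exp(-Z))
}

# backward: dLdZ = dLdA * A * (1 - A), using the cached activations A
sigmoid_backward <- function(dLdA, A) {
  dLdA * A * (1 - A)
}

Z <- c(-1, 0, 2)
A <- sigmoid_forward(Z)
dLdZ <- sigmoid_backward(rep(1, length(Z)), A)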
Super class

neuralnetr::ClassModule -> Sigmoid
Public fields

A
    the activation vector
Methods

Method forward()

Usage:
Sigmoid$forward(Z)

Arguments:
Z
    a vector of pre-activations

Returns:
A vector of activations.
Method backward()

Usage:
Sigmoid$backward(dLdA)

Arguments:
dLdA
    a vector of gradients.

Returns:
A vector of gradients.
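A hedged usage sketch of the two methods above, assuming Sigmoid is constructed with a no-argument $new() (the constructor is not documented on this page):

s <- Sigmoid$new()        # assumed constructor; check the class definition
A <- s$forward(Z)         # activations for a batch of pre-activations Z
dLdZ <- s$backward(dLdA)  # gradient w.r.t. Z given the gradient w.r.t. A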
Method clone()

The objects of this class are cloneable with this method.

Usage:
Sigmoid$clone(deep = FALSE)

Arguments:
deep
    Whether to make a deep clone.
See Also

Other activation: ReLU, SoftMax, Tanh