Description
Each loss module has a forward method that takes a batch of predictions Ypred (from the previous layer) and labels Y and returns a scalar loss value.
The NLL module also has a backward method that returns dLdZ, the gradient of the loss with respect to the pre-activation input to SoftMax (note: not the activation!), since the SoftMax activation is always paired with the NLL loss.
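Because SoftMax and NLL are paired, the backward pass can return a simple closed form. The following is a minimal sketch of the underlying math, not the package's actual code (in particular, whether the loss is summed or averaged over the batch is an assumption); Ypred and Y are (classes x batch) matrices with one-hot Y:

  # Hypothetical helpers illustrating the standard SoftMax + NLL math;
  # not part of neuralnetr.
  nll_forward <- function(Ypred, Y) {
    -sum(Y * log(Ypred))  # negative log-likelihood summed over the batch
  }
  nll_backward <- function(Ypred, Y) {
    Ypred - Y             # gradient w.r.t. the SoftMax pre-activation Z
  }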
Super class

neuralnetr::ClassModule -> NLL
Public fields

Ypred
The predicted Y vector.

Y
The actual Y vector.
Methods

Method forward()

Calculate the loss.

Usage
NLL$forward(Ypred, Y)

Arguments
Ypred
The predicted Y vector.

Y
The actual Y vector.

Returns
The loss as a scalar.
Method backward()

Calculate the gradient of the loss with respect to the SoftMax pre-activation; see the usage sketch below.

Usage
NLL$backward()

Returns
dLdZ (? x b), the gradient with respect to the SoftMax pre-activation, where b is the batch size.
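A hypothetical end-to-end call, assuming NLL$new() takes no arguments and that columns index batch elements (both assumptions; check the package source):

  library(neuralnetr)

  Ypred <- matrix(c(0.7, 0.2, 0.1,   # SoftMax outputs, one column per example
                    0.1, 0.8, 0.1), nrow = 3)
  Y <- matrix(c(1, 0, 0,             # one-hot labels, same shape as Ypred
                0, 1, 0), nrow = 3)

  loss <- NLL$new()
  loss$forward(Ypred, Y)  # scalar loss for the batch
  loss$backward()         # dLdZ, same shape as Ypred (3 x 2 here)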
Method clone()

The objects of this class are cloneable with this method.

Usage
NLL$clone(deep = FALSE)

Arguments
deep
Whether to make a deep clone.
See Also

Other loss: SquaredLoss