Description
Each loss module has a forward method that takes in a batch of predictions Ypred (from the previous layer) and labels Y and returns a scalar loss value.
The NLL module has a backward method that returns dLdZ, the gradient with respect to the preactivation of SoftMax (note: not the activation!), since we always pair the SoftMax activation with the NLL loss.
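The SoftMax/NLL pairing is what makes backward so simple: the gradient of the negative-log-likelihood loss with respect to the SoftMax preactivation collapses to Ypred - Y. A minimal standalone sketch of such a module follows; the field and method names are taken from this page, but the method bodies, the plain R6Class parent (the real class inherits from neuralnetr::ClassModule, see Super class below), and the choice to sum rather than average the loss over the batch are assumptions.

library(R6)

# Sketch of an NLL loss module. Ypred holds SoftMax outputs and Y
# one-hot labels, both of shape (classes x batch).
NLL <- R6Class("NLL",
  public = list(
    Ypred = NULL,  # cached predictions, set by forward()
    Y = NULL,      # cached labels, set by forward()

    # Cross-entropy summed over the batch: -sum(Y * log(Ypred)).
    forward = function(Ypred, Y) {
      self$Ypred <- Ypred
      self$Y <- Y
      -sum(Y * log(Ypred))
    },

    # Gradient of the loss w.r.t. the SoftMax preactivation Z;
    # for the SoftMax/NLL pairing this is simply Ypred - Y.
    backward = function() {
      self$Ypred - self$Y
    }
  )
)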
Super class

neuralnetr::ClassModule -> NLL
Public fields

Ypred: the predicted Y vector
Y: the actual Y vector
Methods

Method forward(): Calculate the loss.

Usage:
NLL$forward(Ypred, Y)

Arguments:
Ypred: the predicted Y vector.
Y: the actual Y vector.

Returns: the loss as a scalar.
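A hypothetical call, using the sketch above (constructing the module with a no-argument NLL$new() is an assumption): three classes, batch of two.

nll <- NLL$new()
Ypred <- matrix(c(0.7, 0.2, 0.1,   # SoftMax outputs, one column per sample
                  0.1, 0.8, 0.1), nrow = 3)
Y <- matrix(c(1, 0, 0,             # one-hot labels for the same two samples
              0, 1, 0), nrow = 3)
loss <- nll$forward(Ypred, Y)      # -(log(0.7) + log(0.8)), about 0.58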
Method backward(): Calculate the gradient of the loss with respect to the SoftMax preactivation.

Usage:
NLL$backward()

Returns: dLdZ (? x b).
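Continuing the example above: backward() takes no arguments because forward() cached Ypred and Y, and the result has the same (classes x batch) shape as Ypred.

dLdZ <- nll$backward()   # equals Ypred - Y, here a (3 x 2) matrix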
Method clone(): The objects of this class are cloneable with this method.

Usage:
NLL$clone(deep = FALSE)

Arguments:
deep: Whether to make a deep clone.
See Also

Other loss: SquaredLoss