Description

This function computes Laurae's Kullback-Leibler Error loss per value for the provided x, y (preds, labels) values.
Usage

loss_LKL_math(x, y)
Arguments

x	The predictions.

y	The labels.
Details

This loss function is strictly positive and is therefore defined on ]0, +Inf[. It penalizes lower values more heavily, and as such is a good fit for problems that require fine tuning when undercommitting on the predictions. Compared with Laurae's Poisson loss function, Laurae's Kullback-Leibler loss yields much larger loss values. This loss function is experimental.
Loss Formula : (y_true - y_pred) * log(y_true / y_pred)
Gradient Formula : -((y_true - y_pred)/y_pred + log(y_true) - log(y_pred))
Hessian Formula : ((y_true - y_pred)/y_pred + 2)/y_pred
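A minimal R sketch of the three formulas above, assuming vectorized preds/labels inputs; the helper names lkl_loss, lkl_grad and lkl_hess are illustrative only and are not exported by the package.

# Illustrative helpers implementing the documented formulas.
lkl_loss <- function(preds, labels) {
  # (y_true - y_pred) * log(y_true / y_pred)
  (labels - preds) * log(labels / preds)
}

lkl_grad <- function(preds, labels) {
  # -((y_true - y_pred)/y_pred + log(y_true) - log(y_pred))
  -((labels - preds) / preds + log(labels) - log(preds))
}

lkl_hess <- function(preds, labels) {
  # ((y_true - y_pred)/y_pred + 2)/y_pred
  ((labels - preds) / preds + 2) / preds
}

# The loss is asymmetric: undershooting the label by 10 costs more than
# overshooting it by 10.
lkl_loss(preds = 20, labels = 30)  # ~4.05
lkl_loss(preds = 40, labels = 30)  # ~2.88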
Value

Laurae's Kullback-Leibler Error per value.
Examples

## Not run:
SymbolicLoss(fc = loss_LKL_math, fc_ref = loss_MSE_math, xmin = 1, xmax = 100, y = rep(30, 21))
SymbolicLoss(fc = loss_LKL_math, fc_ref = loss_Poisson_math, xmin = 1, xmax = 100, y = rep(30, 21))
## End(Not run)