View source: R/kl_divergence.R
kl_divergence | R Documentation |
Compute the Kullback–Leibler Divergence (KLD) using a confusion matrix. KL divergence measures the difference between the entropies of the two distributions 'P(y|f)' and 'P(y)'. The inputs are assumed to be expressed in probabilistic terms.
kl_divergence(y_real, y_predicted)
y_real |
Observed values (integers) to compare with (in matrix format for multiclass classification). |
y_predicted |
Predicted values (probabilities by class). |
Numeric value of the Kullback–Leibler Divergence (KLD)
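To illustrate the underlying computation (this is a sketch of the standard discrete formula, not the package's R implementation), the KL divergence between distributions P and Q is D_KL(P || Q) = sum_i p_i * log(p_i / q_i):

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """Discrete KL divergence D_KL(P || Q) = sum_i p_i * log(p_i / q_i).

    p: observed class probabilities; q: predicted class probabilities.
    eps guards against log(0); both vectors are assumed to sum to 1.
    The eps smoothing is an assumption of this sketch, not necessarily
    what the R package does.
    """
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

# Example: a confident but wrong prediction diverges more from the truth.
observed = [0.5, 0.5]
predicted = [0.9, 0.1]
print(kl_divergence(observed, predicted))
```

Note that KL divergence is asymmetric: `kl_divergence(p, q)` generally differs from `kl_divergence(q, p)`, and it is zero only when the two distributions match.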