kl_divergence: Kullback–Leibler Divergence (KLD)

View source: R/kl_divergence.R

kl_divergence    R Documentation

Kullback–Leibler Divergence (KLD)

Description

Compute the Kullback–Leibler Divergence (KLD) from a confusion matrix. KL divergence quantifies how the predicted distribution 'P(y|f)' differs from the observed distribution 'p(y)'; it can be read as the difference between the cross-entropy of the two distributions and the entropy of the first. The inputs are assumed to be expressed in probabilistic terms.
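In symbols, the standard discrete definition (stated here for reference; not copied from the package source) is:

D_KL(P || Q) = sum_i P(i) * log( P(i) / Q(i) ) = H(P, Q) - H(P)

where H(P, Q) is the cross-entropy between P and Q and H(P) is the entropy of P, with the convention that terms where P(i) = 0 contribute zero.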

Usage

kl_divergence(y_real, y_predicted)

Arguments

y_real

Observed values (integers) to compare against (in matrix format for multiclass classification).

y_predicted

Predicted values (probabilities by class).

Value

Numeric value of the Kullback–Leibler Divergence (KLD).
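A minimal sketch of the underlying computation. The helper `kl_sketch` below is illustrative only and is not the package implementation; it applies the standard discrete KL formula to two probability vectors, whereas `kl_divergence(y_real, y_predicted)` works from the observed/predicted inputs described above.

```r
# Illustrative sketch (not the package implementation) of discrete KL divergence:
# D_KL(P || Q) = sum_i P(i) * log(P(i) / Q(i)),
# where terms with P(i) = 0 contribute zero by convention.
kl_sketch <- function(p, q) {
  nz <- p > 0                       # drop zero-probability terms of P
  sum(p[nz] * log(p[nz] / q[nz]))
}

# Example: observed class distribution vs. predicted class probabilities
p <- c(0.5, 0.3, 0.2)
q <- c(0.4, 0.4, 0.2)
kl_sketch(p, q)
```

Note that the result is 0 when the two distributions coincide, and strictly positive otherwise.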


nikitagusarov/performancer documentation built on Jan. 12, 2023, 12:19 a.m.