KullbackLeibler: Kullback-Leibler Divergence


Description

Compute the Kullback-Leibler divergence of one probability distribution from another. This divergence is also known as the relative entropy, the information deviation, and the information gain.

Usage

## S4 method for signature 'Distribution,Distribution'
KullbackLeibler(p1, p2)

Arguments

p1, p2

Distribution objects between which the divergence is computed.

Details

Let p1 and p2 denote the vectors of probability mass assigned by two distributions defined on the same state space, and let both distributions be strictly positive. The Kullback-Leibler divergence of p2 from p1 is then given by sum(p1 * log(p1 / p2)).

Note that the terminology "divergence of p2 from p1" indicates that p1 is the reference distribution against which the distribution p2 is evaluated.

The Kullback-Leibler divergence is not symmetric; that is, it is not generally true that KullbackLeibler(p1, p2) = KullbackLeibler(p2, p1). Symmetric functions based on the Kullback-Leibler divergence are available through the Jeffrey and Topsoe distances. The sketch below illustrates both the formula and this asymmetry.
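
As an illustration, the following base-R sketch computes the divergence directly from two probability vectors. It bypasses the S4 method, since the constructor for Distribution objects is not documented on this page, and the vectors p1 and p2 are arbitrary examples.

## Minimal sketch using plain probability vectors (not Distribution objects).
p1 <- c(0.5, 0.3, 0.2)
p2 <- c(0.4, 0.4, 0.2)

kl <- function(p, q) sum(p * log(p / q))

kl(p1, p2)  # divergence of p2 from p1 (p1 is the reference distribution)
kl(p2, p1)  # generally a different value, since the divergence is not symmetric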

Value

The Kullback-Leibler divergence of p2 from p1.

See Also

Jensen-Shannon, Jeffrey, Topsoe

