Description
Compute the Kullback-Leibler divergence of one probability distribution from another. This divergence is also known as the relative entropy, the information deviation, and the information gain.
Usage

## S4 method for signature 'Distribution,Distribution'
KullbackLeibler(p1, p2)
Arguments

p1, p2    Objects of class Distribution, defined on the same state space. p1 is the reference distribution; p2 is the distribution whose divergence from p1 is computed.
Details

Let p1 and p2 denote the vectors of probability mass assigned by two distributions defined on the same state space, and let both distributions assign strictly positive mass to every state. The Kullback-Leibler divergence of p2 from p1 is then given by sum(p1 * log(p1 / p2)).
Note that the terminology "divergence of p2 from p1" indicates
that p1 is the reference distribution against which the distribution
p2 is evaluated.
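For illustration, the following is a minimal sketch of this computation on plain probability vectors rather than Distribution objects; the helper name kl_div and the example values are assumptions made here, not part of the package.

kl_div <- function(p1, p2) {
  # Kullback-Leibler divergence of p2 from p1, with p1 as the reference;
  # both vectors must be strictly positive and of equal length.
  stopifnot(length(p1) == length(p2), all(p1 > 0), all(p2 > 0))
  sum(p1 * log(p1 / p2))
}

p1 <- c(0.5, 0.3, 0.2)  # reference distribution
p2 <- c(0.4, 0.4, 0.2)  # distribution evaluated against p1
kl_div(p1, p2)          # approximately 0.0253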
Kullback-Leibler divergence is not a symmetric function. That is, it is not
generally true that KullbackLeibler(p1, p2) = KullbackLeibler(p2, p1).
Symmetric functions based on the Kullback-Leibler divergence are available
through the Jeffrey and Topsoe distances.
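A quick numeric check of this asymmetry, again using plain probability vectors with assumed example values:

p1 <- c(0.5, 0.3, 0.2)
p2 <- c(0.4, 0.4, 0.2)
sum(p1 * log(p1 / p2))  # divergence of p2 from p1, approximately 0.0253
sum(p2 * log(p2 / p1))  # divergence of p1 from p2, approximately 0.0258

The two results differ, so the order of the arguments matters.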
Value

The Kullback-Leibler divergence of p2 from p1.
See Also

Jensen-Shannon, Jeffrey, Topsoe