KLPQ | R Documentation
Description

Given two probability distribution vectors of the same length, this function computes the Kullback-Leibler divergence (relative entropy) between them. If either distribution contains zero entries, the JITTER argument can be used to add a tiny offset so that the computation remains defined.
Usage

KLPQ(P = c(0.36, 0.48, 0.16), Q = c(0.333, 0.333, 0.333), JITTER = 0.0)
Arguments

P	The first probability distribution vector.
Q	The second probability distribution vector.
JITTER	An optional tiny value added to each probability to avoid zero entries (e.g., 0.000000001).
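The role of JITTER can be sketched with the textbook definition of the divergence, sum(P * log(P/Q)); the helper kl_raw below is hypothetical and KLPQ's internals may differ, but the zero-entry problem it illustrates is the same:

```r
# Hypothetical sketch of the divergence, assuming the standard
# definition KL(P||Q) = sum(P * log(P/Q)).
kl_raw <- function(P, Q) sum(P * log(P / Q))

P <- c(0.5, 0.5, 0.0)  # contains a zero entry
Q <- c(0.4, 0.4, 0.2)

kl_raw(P, Q)           # 0 * log(0) produces NaN in R

jitter <- 1e-9
kl_raw(P + jitter, Q + jitter)  # a small offset keeps the result finite
```

Adding the same tiny offset to both vectors perturbs the result negligibly while preventing undefined logarithms.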
Value

The function returns a numeric value: the Kullback-Leibler divergence. A value of 0 (zero) indicates that the two distributions are identical.
Note

Used to provide a uni-directional (asymmetric) divergence measure. Note that KL(P||Q) does not necessarily equal KL(Q||P).
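The asymmetry can be demonstrated with the usage defaults, assuming KLPQ follows the standard definition sum(P * log(P/Q)); the helper kl below is a hypothetical stand-in, not the package implementation:

```r
# Hypothetical sketch, assuming the textbook formula
# KL(P||Q) = sum(P * log(P/Q)).
kl <- function(P, Q, JITTER = 0) {
  P <- P + JITTER
  Q <- Q + JITTER
  sum(P * log(P / Q))
}

P <- c(0.36, 0.48, 0.16)
Q <- c(0.333, 0.333, 0.333)

kl(P, Q)  # divergence from Q to P
kl(Q, P)  # reversing the arguments gives a different value
```

Because the weighting distribution changes when the arguments are swapped, the two directions generally disagree, which is why KL divergence is not a metric.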
Author(s)

Tarmo K. Remmel
References

None currently.
See Also

patternbits
Examples

KLPQ(P = c(0.36, 0.48, 0.16), Q = c(0.333, 0.333, 0.333), JITTER = 0.0)