View source: R/kl.divergence.R
kl.divergence: Kullback-Leibler Divergence (Relative Entropy)

Description

Calculates the Kullback-Leibler divergence (relative entropy).

Usage

kl.divergence(object, eps = 10^-4, overlap = TRUE)
Arguments

object
    Matrix or data frame object with >= 2 columns.
eps
    Probabilities below this threshold are replaced by the threshold for numerical stability (see the sketch after this list).
overlap
    Logical; if TRUE, do not compute the KL divergence for pairs where, at each point, at least one of the densities has a value smaller than eps (i.e., the two densities never overlap above the threshold).
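As a rough illustration of the eps argument, the floor might behave like the following sketch (assumed behavior for intuition only, not the package's verified internals):

# Hypothetical illustration of the eps floor: values below the
# threshold are replaced by the threshold before taking logs.
p   <- c(0.52, 1e-7, 0.35)
eps <- 10^-4
p.stable <- pmax(p, eps)   # 1e-7 becomes 1e-4
log(p.stable)              # finite logs, no large negative spikes near zero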
Details

Calculates the Kullback-Leibler divergence (relative entropy) between unweighted theoretical component distributions. For distributions with densities f() and g(), the divergence is calculated as:

KL(f, g) = \int f(x) \left( \log f(x) - \log g(x) \right) \, dx
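Because the densities are supplied on a finite grid in practice, the integral can be approximated by a Riemann sum. The following is a sketch of that approximation (for intuition only; not necessarily the package's exact computation):

x  <- seq(-3, 3, length = 200)
f  <- dnorm(x)          # density of N(0, 1)
g  <- dt(x, df = 10)    # density of t with 10 df
dx <- x[2] - x[1]       # grid spacing
sum(f * (log(f) - log(g))) * dx   # discrete approximation of KL(f, g)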
Value

A matrix of pairwise Kullback-Leibler divergence indices.
Author(s)

Jeffrey S. Evans <jeffrey_evans@tnc.org>
References

Kullback, S., and R. A. Leibler (1951) On information and sufficiency. The Annals of Mathematical Statistics 22(1):79-86.
Examples

x <- seq(-3, 3, length = 200)
y <- cbind(n=dnorm(x), t=dt(x, df=10))
matplot(x, y, type='l')
kl.divergence(y)
# extract a single pairwise divergence value (element [1,2] of the 2 x 2 result matrix)
kl.divergence(y[, 1:2])[3]
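With more than two columns the result grows to a full pairwise matrix. The following sketch extends the example above with a third density column (dcauchy is an arbitrary illustrative choice):

# three distributions -> 3 x 3 pairwise divergence matrix
z <- cbind(y, c = dcauchy(x))
kl.divergence(z)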