kl.divergence: Kullback-Leibler divergence (relative entropy)

View source: R/kl.divergence.R

kl.divergence        R Documentation

Kullback-Leibler divergence (relative entropy)

Description

Calculates the Kullback-Leibler divergence (relative entropy).

Usage

kl.divergence(object, eps = 10^-4, overlap = TRUE)

Arguments

object

A matrix or data.frame object with >= 2 columns

eps

Probabilities below this threshold are replaced by this threshold for numerical stability.

overlap

Logical; if TRUE, the KL divergence is not computed for pairs where, at each point, at least one of the densities has a value smaller than eps.

Details

Calculates the Kullback-Leibler divergence (relative entropy) between unweighted theoretical component distributions. For distributions with densities f() and g(), the divergence is calculated as:

D(f || g) = ∫ f(x) (log f(x) - log g(x)) dx
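This integral can be approximated numerically when the densities are evaluated on a common grid, as with the matrix of density values in the Examples. A minimal sketch of this discrete form follows; it is not the package's implementation, and the grid x and the eps flooring (mirroring the eps argument above) are illustrative assumptions:

x   <- seq(-6, 6, length.out = 400)    # common evaluation grid (assumed)
f   <- dnorm(x)                        # density f()
g   <- dt(x, df = 10)                  # density g()
eps <- 1e-4
f <- pmax(f, eps)                      # floor small values at eps for stability
g <- pmax(g, eps)
dx <- x[2] - x[1]                      # grid spacing
sum(f * (log(f) - log(g))) * dx        # Riemann-sum approximation of the integral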

Value

A matrix of pairwise Kullback-Leibler divergence indices

Author(s)

Jeffrey S. Evans <jeffrey_evans@tnc.org>

References

Kullback, S., and R.A. Leibler (1951) On information and sufficiency. The Annals of Mathematical Statistics 22(1):79-86.

Examples

x <- seq(-3, 3, length.out = 200)
y <- cbind(n = dnorm(x), t = dt(x, df = 10))
matplot(x, y, type = "l")

kl.divergence(y)

# extract the value for the last column pair from the result matrix
kl.divergence(y[, 1:2])[3]
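The documented arguments can also be passed explicitly. An illustrative call using the signature shown in Usage (the argument values are arbitrary, not recommendations):

# relax the probability floor and disable the overlap check described above
kl.divergence(y, eps = 10^-6, overlap = FALSE)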

