KL.plugin: Plug-In Estimator of the Kullback-Leibler Divergence and of the Chi-Squared Divergence


View source: R/KL.plugin.R

Description

KL.plugin computes the Kullback-Leibler (KL) divergence between two discrete random variables x_1 and x_2, whose probability mass functions are given by freqs1 and freqs2. Note that the expectation is taken with respect to x_1, i.e. using freqs1.

chi2.plugin computes the chi-squared divergence between two discrete random variables x_1 and x_2, again with freqs1 and freqs2 as the corresponding probability mass functions. Note that freqs2 appears in the denominator.
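
Because the expectation is taken with respect to the first argument, neither function is symmetric in its arguments. A minimal sketch (probability values chosen arbitrarily for illustration):

library("entropy")
p = c(1/5, 1/5, 3/5)
q = c(1/10, 4/10, 1/2)
KL.plugin(p, q)  # expectation taken over p
KL.plugin(q, p)  # generally a different value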

Usage

KL.plugin(freqs1, freqs2, unit=c("log", "log2", "log10"))
chi2.plugin(freqs1, freqs2, unit=c("log", "log2", "log10"))

Arguments

freqs1

frequencies (probability mass function) for variable x_1.

freqs2

frequencies (probability mass function) for variable x_2.

unit

the unit in which the divergence is measured. The default "log" uses the natural logarithm, so the result is in nats (natural units). For a result in bits set unit="log2".
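
The unit only rescales the result by a change of logarithm base; for instance, a value in bits is the natural-log value divided by log(2). A quick sketch (probabilities chosen arbitrarily):

library("entropy")
p = c(1/5, 1/5, 3/5)
q = c(1/10, 4/10, 1/2)
KL.plugin(p, q, unit="log2")          # divergence in bits
KL.plugin(p, q, unit="log") / log(2)  # same value via change of base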

Details

The Kullback-Leibler divergence from x_1 to x_2 is \sum_k p_1(k) \log( p_1(k)/p_2(k) ), where p_1 and p_2 are the probability mass functions of x_1 and x_2, respectively, and k indexes the classes.
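
The plug-in estimator evaluates this sum directly, so it can be reproduced by hand. A minimal sketch, assuming all probabilities are strictly positive (so no 0 log 0 terms arise):

library("entropy")
p = c(1/5, 1/5, 3/5)
q = c(1/10, 4/10, 1/2)
sum(p * log(p/q))  # the formula evaluated directly
KL.plugin(p, q)    # should agree up to floating-point error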

The chi-squared divergence is given by \sum_k (p_1(k) - p_2(k))^2 / p_2(k).
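
The same by-hand check works for the chi-squared divergence:

library("entropy")
p = c(1/5, 1/5, 3/5)
q = c(1/10, 4/10, 1/2)
sum((p - q)^2 / q)  # the formula evaluated directly
chi2.plugin(p, q)   # should agree up to floating-point error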

Note that neither the KL divergence nor the chi-squared divergence is symmetric in x_1 and x_2. The chi-squared divergence can be derived as a quadratic approximation of twice the KL divergence.
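
A quick numerical check of this approximation, using two nearby distributions (values chosen only for illustration):

library("entropy")
p = c(0.30, 0.30, 0.40)
q = c(0.31, 0.29, 0.40)  # small perturbation of p
2 * KL.plugin(p, q)      # twice the KL divergence
chi2.plugin(p, q)        # close to 2*KL for nearby distributions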

Value

KL.plugin returns the KL divergence.

chi2.plugin returns the chi-squared divergence.

Author(s)

Korbinian Strimmer (https://strimmerlab.github.io).

See Also

KL.Dirichlet, KL.shrink, KL.empirical, mi.plugin, discretize2d.

Examples

# load entropy library 
library("entropy")

# probabilities for two random variables
freqs1 = c(1/5, 1/5, 3/5)
freqs2 = c(1/10, 4/10, 1/2) 

# KL divergence from x1 to x2
KL.plugin(freqs1, freqs2)

# and corresponding (half) chi-squared divergence
0.5*chi2.plugin(freqs1, freqs2)

## relationship to Pearson chi-squared statistic

# Pearson chi-squared statistic and p-value
n = 30 # sample size (observed counts)
chisq.test(n*freqs1, p = freqs2) # built-in function

# Pearson chi-squared statistic from Pearson divergence
pcs.stat = n*chi2.plugin(freqs1, freqs2) # note factor n
pcs.stat

# and p-value
df = length(freqs1)-1 # degrees of freedom
pcs.pval = 1-pchisq(pcs.stat, df)
pcs.pval
