KLD: Kullback-Leibler Divergence (KLD)


View source: R/KLD.R

Description

This function calculates the Kullback-Leibler divergence (KLD) between two probability distributions, and has many uses, such as in lowest posterior loss probability intervals, posterior predictive checks, prior elicitation, reference priors, and Variational Bayes.

Usage

KLD(px, py, base)

Arguments

px

This is a required vector of probability densities, considered as p(x). Log-densities are also accepted, in which case both px and py must be log-densities.

py

This is a required vector of probability densities, considered as p(y). Log-densities are also accepted, in which case both px and py must be log-densities.

base

This optional argument specifies the logarithmic base, which defaults to base=exp(1) (or e) and represents information in natural units (nats), where base=2 represents information in binary units (bits).
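
For instance, the same divergence can be reported in either unit. A minimal sketch (the evaluation grid here is made up for illustration):

library(LaplacesDemon)
x <- seq(-3, 3, length.out=100)            ## illustrative evaluation grid
px <- dnorm(x, 0, 1)
py <- dnorm(x, 0.1, 0.9)
nats <- KLD(px, py)$sum.KLD.px.py          ## natural logarithm (default)
bits <- KLD(px, py, base=2)$sum.KLD.px.py  ## base-2 logarithm
nats / bits                                ## log(2), about 0.693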

Details

The Kullback-Leibler divergence (KLD) is known by many names, some of which are the Kullback-Leibler distance, K-L divergence, and logarithmic divergence. KLD is an asymmetric measure of the difference, or directed divergence, between two probability distributions p(y) and p(x) (Kullback and Leibler, 1951). Despite the name, KLD is not mathematically a distance, because it is asymmetric: KLD[p(y)||p(x)] generally differs from KLD[p(x)||p(y)].

Here, p(y) represents the “true” distribution of the data, observations, or a theoretical distribution, and p(x) represents a theory, model, or approximation of p(y).

For probability distributions p(y) and p(x) that are discrete (even when the underlying distributions are continuous, the densities are evaluated at a discrete set of points i=1,...,N), the directed divergence of p(x) from p(y) is

KLD[p(y) || p(x)] = sum over i of p(y[i]) log(p(y[i]) / p(x[i]))
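
A minimal sketch of this formula, with made-up probability vectors that each sum to 1; the result should agree with the sum.KLD.py.px component returned by KLD(px, py):

py <- c(0.2, 0.5, 0.3)    ## "true" distribution p(y)
px <- c(0.3, 0.4, 0.3)    ## model or approximation p(x)
sum(py * log(py / px))    ## KLD[p(y)||p(x)] in nats, about 0.0305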

In Bayesian inference, KLD can be used as a measure of the information gain in moving from a prior distribution, p(theta), to a posterior distribution, p(theta | y). As such, KLD is the basis of reference priors and lowest posterior loss intervals (LPL.interval), such as in Berger, Bernardo, and Sun (2009) and Bernardo (2005). The intrinsic discrepancy was introduced by Bernardo and Rueda (2002). For more information on the intrinsic discrepancy, see LPL.interval.
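
As an illustration of this information gain (a hedged sketch with made-up data: a flat Beta(1,1) prior updated after observing 7 successes in 10 trials, with both densities evaluated on a grid of points):

library(LaplacesDemon)
theta <- seq(0.001, 0.999, length.out=1000)  ## grid over the parameter space
prior <- dbeta(theta, 1, 1)                  ## p(theta): Beta(1,1)
posterior <- dbeta(theta, 8, 4)              ## p(theta|y): Beta(1+7, 1+3)
KLD(prior, posterior)$sum.KLD.py.px          ## information gain in nats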

Value

KLD returns a list with the following components:

KLD.px.py

This is the vector of pointwise divergences, KLD[i](p(x[i]) || p(y[i])), whose sum is sum.KLD.px.py.

KLD.py.px

This is the vector of pointwise divergences, KLD[i](p(y[i]) || p(x[i])), whose sum is sum.KLD.py.px.

mean.KLD

This is the elementwise mean of the two vectors above, and is the expected posterior loss in LPL.interval.

sum.KLD.px.py

This is KLD(p(x) || p(y)). This is a directed divergence.

sum.KLD.py.px

This is KLD(p(y) || p(x)). This is a directed divergence.

mean.sum.KLD

This is the mean of the two components above.

intrinsic.discrepancy

This is the minimum of the two directed divergences above.
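
The relationships among these components can be checked directly. A minimal sketch (the call to set.seed is added here only for reproducibility):

library(LaplacesDemon)
set.seed(1)                        ## hypothetical seed, for reproducibility
px <- dnorm(runif(100), 0, 1)
py <- dnorm(runif(100), 0.1, 0.9)
out <- KLD(px, py)
all.equal(out$sum.KLD.px.py, sum(out$KLD.px.py))  ## directed divergence = sum of pointwise terms
all.equal(out$intrinsic.discrepancy, min(out$sum.KLD.px.py, out$sum.KLD.py.px))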

Author(s)

Statisticat, LLC. software@bayesian-inference.com

References

Berger, J.O., Bernardo, J.M., and Sun, D. (2009). "The Formal Definition of Reference Priors". The Annals of Statistics, 37(2), p. 905–938.

Bernardo, J.M. and Rueda, R. (2002). "Bayesian Hypothesis Testing: A Reference Approach". International Statistical Review, 70, p. 351–372.

Bernardo, J.M. (2005). "Intrinsic Credible Regions: An Objective Bayesian Approach to Interval Estimation". Sociedad de Estadistica e Investigacion Operativa, 14(2), p. 317–384.

Kullback, S. and Leibler, R.A. (1951). "On Information and Sufficiency". The Annals of Mathematical Statistics, 22(1), p. 79–86.

See Also

LPL.interval and VariationalBayes.

Examples

library(LaplacesDemon)
px <- dnorm(runif(100), 0, 1)      ## N(0,1) density at 100 uniform random points
py <- dnorm(runif(100), 0.1, 0.9)  ## N(0.1,0.9) density at 100 uniform random points
KLD(px, py)

Example output (values vary from run to run, because the example does not set a random seed):

$KLD.px.py
  [1]  1.173079e-06  2.667700e-03 -7.189659e-04 -9.808287e-04  1.218113e-03
  [6]  2.745839e-04 -2.351353e-03 -2.498095e-03 -8.363308e-05  1.945838e-04
 [11]  4.729535e-04  2.813940e-03  2.123236e-03 -1.225901e-03  5.492767e-04
 [16] -1.843822e-03  1.233274e-03 -1.863111e-03 -2.107463e-03  9.980740e-04
 [21]  2.707725e-03 -1.950553e-03  3.961373e-03 -1.741292e-03  1.834610e-03
 [26]  2.414998e-03  2.541411e-03  3.677960e-03 -1.732709e-03  5.333363e-03
 [31]  2.923150e-03  2.197267e-03  2.238618e-04 -3.131543e-03 -2.404365e-03
 [36]  1.910454e-04 -2.422138e-03 -3.209586e-03  1.144621e-03  4.914130e-03
 [41] -2.642636e-03 -1.555469e-03  2.723377e-03  3.609389e-04 -1.150526e-03
 [46] -1.570546e-03  1.361625e-03  3.655326e-03 -8.727801e-04 -3.099100e-03
 [51]  3.568574e-04  4.821919e-03 -1.762866e-03 -2.285807e-03 -1.563409e-03
 [56]  2.962205e-04  4.462611e-03  5.815962e-04 -6.304698e-04 -2.222763e-03
 [61] -2.286767e-03 -1.119776e-03  3.551249e-03  1.207850e-03  1.548341e-04
 [66] -1.491419e-03 -2.588657e-03  2.352858e-04  5.869114e-05  1.753953e-03
 [71] -5.621959e-04  7.000351e-04 -1.534569e-03 -2.166771e-03  1.342009e-04
 [76]  5.252146e-04  5.574687e-03 -2.598276e-03 -7.456120e-04  1.657037e-03
 [81]  1.829339e-03 -2.574500e-03 -1.311804e-03 -2.788234e-03  1.212273e-04
 [86]  2.267438e-03 -1.616358e-03  3.089140e-03  3.286109e-03  3.425525e-03
 [91] -5.902490e-04  2.541719e-04 -4.734020e-04  1.984498e-03  4.367985e-04
 [96]  1.193753e-03 -6.486365e-04  3.517265e-03 -3.258833e-03  1.581841e-04

$KLD.py.px
  [1] -1.172959e-06 -2.004489e-03  7.707837e-04  1.077571e-03 -1.078595e-03
  [6] -2.682640e-04  3.072008e-03  3.370960e-03  8.424691e-05 -1.913706e-04
 [11] -4.541421e-04 -2.213720e-03 -1.773134e-03  1.426482e-03 -5.241920e-04
 [16]  2.332447e-03 -1.089005e-03  2.317169e-03  2.696153e-03 -8.902025e-04
 [21] -2.095931e-03  2.402947e-03 -2.775090e-03  2.191378e-03 -1.497456e-03
 [26] -1.942428e-03 -2.029485e-03 -2.685602e-03  2.132266e-03 -3.381553e-03
 [31] -2.280856e-03 -1.823291e-03 -2.196247e-04  4.731356e-03  3.165031e-03
 [36] -1.879444e-04  3.202178e-03  4.956637e-03 -1.038445e-03 -3.145541e-03
 [41]  3.651957e-03  1.823781e-03 -2.102210e-03 -3.482701e-04  1.286620e-03
 [46]  1.916475e-03 -1.203981e-03 -2.661452e-03  9.568161e-04  4.642454e-03
 [51] -3.462189e-04 -3.195837e-03  2.186827e-03  2.955265e-03  1.942985e-03
 [56] -2.888298e-04 -3.011846e-03 -5.534799e-04  6.681713e-04  3.018480e-03
 [61]  3.013677e-03  1.255623e-03 -2.608986e-03 -1.043144e-03 -1.527924e-04
 [66]  1.796168e-03  3.584158e-03 -2.306147e-04 -5.839436e-05 -1.454767e-03
 [71]  6.031333e-04 -6.590780e-04  1.792960e-03  2.752760e-03 -1.326263e-04
 [76] -5.022037e-04 -3.474565e-03  3.671894e-03  7.992694e-04 -1.368868e-03
 [81] -1.516088e-03  3.502139e-03  1.563483e-03  3.918444e-03 -1.196669e-04
 [86] -1.869099e-03  2.017437e-03 -2.301432e-03 -2.478183e-03 -2.560229e-03
 [91]  6.249310e-04 -2.477251e-04  4.941480e-04 -1.632565e-03 -4.201019e-04
 [96] -1.078732e-03  6.994736e-04 -2.477264e-03  5.052743e-03 -1.560415e-04

$mean.KLD
  [1] 5.979303e-11 3.316057e-04 2.590889e-05 4.837090e-05 6.975912e-05
  [6] 3.159944e-06 3.603277e-04 4.364327e-04 3.069135e-07 1.606606e-06
 [11] 9.405706e-06 3.001102e-04 1.750514e-04 1.002905e-04 1.254235e-05
 [16] 2.443124e-04 7.213445e-05 2.270288e-04 2.943453e-04 5.393576e-05
 [21] 3.058973e-04 2.261968e-04 5.931413e-04 2.250430e-04 1.685768e-04
 [26] 2.362853e-04 2.559631e-04 4.961788e-04 1.997785e-04 9.759049e-04
 [31] 3.211472e-04 1.869876e-04 2.118584e-06 7.999064e-04 3.803329e-04
 [36] 1.550503e-06 3.900199e-04 8.735254e-04 5.308770e-05 8.842943e-04
 [41] 5.046606e-04 1.341561e-04 3.105835e-04 6.334381e-06 6.804688e-05
 [46] 1.729644e-04 7.882159e-05 4.969369e-04 4.201802e-05 7.716770e-04
 [51] 5.319263e-06 8.130412e-04 2.119802e-04 3.347290e-04 1.897876e-04
 [56] 3.695349e-06 7.253822e-04 1.405813e-05 1.885074e-05 3.978586e-04
 [61] 3.634549e-04 6.792340e-05 4.711315e-04 8.235305e-05 1.020840e-06
 [66] 1.523748e-04 4.977504e-04 2.335581e-06 1.483859e-07 1.495931e-04
 [71] 2.046870e-05 2.047857e-05 1.291954e-04 2.929945e-04 7.872955e-07
 [76] 1.150543e-05 1.050061e-03 5.368091e-04 2.682868e-05 1.440845e-04
 [81] 1.566252e-04 4.638194e-04 1.258395e-04 5.651046e-04 7.801914e-07
 [86] 1.991691e-04 2.005395e-04 3.938541e-04 4.039629e-04 4.326482e-04
 [91] 1.734101e-05 3.223392e-06 1.037298e-05 1.759663e-04 8.348291e-06
 [96] 5.751007e-05 2.541857e-05 5.200007e-04 8.969550e-04 1.071295e-06

$sum.KLD.px.py
[1] 0.02437159

$sum.KLD.py.px
[1] 0.02427506

$mean.sum.KLD
[1] 0.02432332

$intrinsic.discrepancy
[1] 0.02427506
