kldggd: Kullback-Leibler Divergence between Centered Multivariate Generalized Gaussian Distributions

View source: R/kldggd.R

Kullback-Leibler Divergence between Centered Multivariate generalized Gaussian Distributions

Description

Computes the Kullback-Leibler divergence between two random variables distributed according to multivariate generalized Gaussian distributions (MGGD) with zero means.

Usage

kldggd(Sigma1, beta1, Sigma2, beta2, eps = 1e-06)

Arguments

Sigma1

symmetric, positive-definite matrix. The dispersion matrix of the first distribution.

beta1

positive real number. The shape parameter of the first distribution.

Sigma2

symmetric, positive-definite matrix. The dispersion matrix of the second distribution.

beta2

positive real number. The shape parameter of the second distribution.

eps

numeric. Precision for the computation of the Lauricella function (see lauricella). Default: 1e-06.

Details

Let \mathbf{X}_1 be a random vector of \mathbb{R}^p (p > 1) distributed according to the MGGD with parameters (\mathbf{0}, \Sigma_1, \beta_1), and let \mathbf{X}_2 be a random vector of \mathbb{R}^p distributed according to the MGGD with parameters (\mathbf{0}, \Sigma_2, \beta_2).

The Kullback-Leibler divergence between \mathbf{X}_1 and \mathbf{X}_2 is given by:

\displaystyle{ KL(\mathbf{X}_1||\mathbf{X}_2) = \ln{\left(\frac{\beta_1 |\Sigma_1|^{-1/2} \Gamma\left(\frac{p}{2\beta_2}\right)}{\beta_2 |\Sigma_2|^{-1/2} \Gamma\left(\frac{p}{2\beta_1}\right)}\right)} + \frac{p}{2} \left(\frac{1}{\beta_2} - \frac{1}{\beta_1}\right) \ln{2} - \frac{p}{2\beta_1} + 2^{\frac{\beta_2}{\beta_1}-1} \frac{\Gamma\left(\frac{\beta_2}{\beta_1} + \frac{p}{2\beta_1}\right)}{\Gamma\left(\frac{p}{2\beta_1}\right)} \lambda_p^{\beta_2} }

\displaystyle{ \times F_D^{(p-1)}\left(-\beta_2; \underbrace{\frac{1}{2},\dots,\frac{1}{2}}_{p-1}; \frac{p}{2}; 1-\frac{\lambda_{p-1}}{\lambda_p},\dots,1-\frac{\lambda_{1}}{\lambda_p}\right) }

where \lambda_1 < \dots < \lambda_{p-1} < \lambda_p are the eigenvalues of the matrix \Sigma_1 \Sigma_2^{-1}
and F_D^{(p-1)} is the Lauricella D-hypergeometric function.

This computation uses the lauricella function.
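
For illustration, the arguments passed to F_D^{(p-1)} can be obtained directly from the dispersion matrices. A minimal sketch in base R (the variable names are illustrative; it only computes the eigenvalues \lambda_1, \dots, \lambda_p and the points at which the Lauricella function is evaluated):

# Eigenvalues of Sigma1 %*% solve(Sigma2), sorted in increasing order
Sigma1 <- matrix(c(0.8, 0.3, 0.2, 0.3, 0.2, 0.1, 0.2, 0.1, 0.2), nrow = 3)
Sigma2 <- matrix(c(1, 0.3, 0.2, 0.3, 0.5, 0.1, 0.2, 0.1, 0.7), nrow = 3)
lambda <- sort(eigen(Sigma1 %*% solve(Sigma2), only.values = TRUE)$values)
p <- nrow(Sigma1)
# Points 1 - lambda_{p-1}/lambda_p, ..., 1 - lambda_1/lambda_p of F_D^{(p-1)}
x <- 1 - lambda[(p - 1):1] / lambda[p]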

When p = 1 (univariate case): let X_1 be a random variable distributed according to the generalized Gaussian distribution with parameters (0, \sigma_1, \beta_1) and X_2 a random variable distributed according to the generalized Gaussian distribution with parameters (0, \sigma_2, \beta_2). Then:

KL(X_1||X_2) = \displaystyle{ \ln{\left(\frac{\frac{\beta_1}{\sqrt{\sigma_1}} \Gamma\left(\frac{1}{2\beta_2}\right)}{\frac{\beta_2}{\sqrt{\sigma_2}} \Gamma\left(\frac{1}{2\beta_1}\right)}\right)} + \frac{1}{2} \left(\frac{1}{\beta_2} - \frac{1}{\beta_1}\right) \ln{2} - \frac{1}{2\beta_1} + 2^{\frac{\beta_2}{\beta_1}-1} \frac{\Gamma\left(\frac{\beta_2}{\beta_1} + \frac{1}{2\beta_1}\right)}{\Gamma\left(\frac{1}{2\beta_1}\right)} \left(\frac{\sigma_1}{\sigma_2}\right)^{\beta_2} }
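
This univariate formula can be transcribed directly in base R, which gives a convenient cross-check of kldggd in the p = 1 case. A hedged sketch (kl_uggd is an illustrative name, not part of the package):

# Sketch: direct transcription of the univariate formula above
kl_uggd <- function(sigma1, beta1, sigma2, beta2) {
  log((beta1 / sqrt(sigma1) * gamma(1 / (2 * beta2))) /
        (beta2 / sqrt(sigma2) * gamma(1 / (2 * beta1)))) +
    (1 / 2) * (1 / beta2 - 1 / beta1) * log(2) -
    1 / (2 * beta1) +
    2^(beta2 / beta1 - 1) *
      gamma(beta2 / beta1 + 1 / (2 * beta1)) / gamma(1 / (2 * beta1)) *
      (sigma1 / sigma2)^beta2
}
# beta1 = beta2 = 1 is the centered Gaussian case, where
# KL = 0.5 * log(sigma2/sigma1) - 1/2 + sigma1/(2*sigma2)
kl_uggd(0.5, 1, 1, 1)  # 0.5 * log(2) - 0.25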

Value

A numeric value: the Kullback-Leibler divergence between the two distributions, with two attributes: attr(, "epsilon") (the precision reached in the computation of the Lauricella function; 0 if the distributions are univariate) and attr(, "k") (the number of iterations used).
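
For instance, the attributes can be inspected as follows (a usage sketch, assuming the package is attached):

kl <- kldggd(diag(c(1, 2)), 0.8, diag(2), 1)
attr(kl, "epsilon")  # precision reached by the Lauricella computation
attr(kl, "k")        # number of iterations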

Author(s)

Pierre Santagostini, Nizar Bouhlel

References

N. Bouhlel, A. Dziri, Kullback-Leibler Divergence Between Multivariate Generalized Gaussian Distributions. IEEE Signal Processing Letters, vol. 26, no. 7, July 2019. doi:10.1109/LSP.2019.2915000

See Also

mvdggd: probability density of an MGGD.

Examples

beta1 <- 0.74
beta2 <- 0.55
Sigma1 <- matrix(c(0.8, 0.3, 0.2, 0.3, 0.2, 0.1, 0.2, 0.1, 0.2), nrow = 3)
Sigma2 <- matrix(c(1, 0.3, 0.2, 0.3, 0.5, 0.1, 0.2, 0.1, 0.7), nrow = 3)

# Kullback-Leibler divergence
kl12 <- kldggd(Sigma1, beta1, Sigma2, beta2)
kl21 <- kldggd(Sigma2, beta2, Sigma1, beta1)
print(kl12)
print(kl21)

# Distance (symmetrized Kullback-Leibler divergence)
kldist <- as.numeric(kl12) + as.numeric(kl21)
print(kldist)
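
# Sanity check (a sketch, under the assumption stated in the comments):
# with beta1 = beta2 = 1 the MGGD reduces to the centered multivariate
# Gaussian distribution, so kldggd can be compared with the closed-form
# Gaussian Kullback-Leibler divergence
# KL = 0.5 * (trace(solve(Sigma2) %*% Sigma1) - p + log(det(Sigma2)/det(Sigma1)))
klgauss <- kldggd(Sigma1, 1, Sigma2, 1)
p <- nrow(Sigma1)
0.5 * (sum(diag(solve(Sigma2) %*% Sigma1)) - p + log(det(Sigma2) / det(Sigma1)))
as.numeric(klgauss)  # should agree with the line above up to the precision eps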

