kld: Kullback-Leibler Divergence Between Centered Multivariate Distributions
Computes the Kullback-Leibler divergence between two random vectors distributed according to one of the following centered multivariate distributions:

- the multivariate generalized Gaussian distribution (MGGD) with zero mean vector, using the kldggd function;
- the multivariate Cauchy distribution (MCD) with zero location vector, using the kldcauchy function;
- the multivariate t distribution (MTD) with zero mean vector, using the kldstudent function.

One can also call one of the kldggd, kldcauchy or kldstudent functions directly, depending on the probability distribution, as in the sketch below.
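For instance, the generic kld front end and the distribution-specific function return the same value. A minimal sketch (the package name multvardiv is an assumption; use whichever package provides these functions on your system):

# library(multvardiv)  # assumed package name; adjust to your installation
Sigma1 <- diag(c(0.8, 0.5, 0.2))
Sigma2 <- diag(1, 3)
# Generic front end...
kld(Sigma1, Sigma2, "mggd", beta1 = 0.74, beta2 = 0.55)
# ...and the distribution-specific function it dispatches to
kldggd(Sigma1, Sigma2, beta1 = 0.74, beta2 = 0.55)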
Usage:

kld(Sigma1, Sigma2, distribution = c("mggd", "mcd", "mtd"),
    beta1 = NULL, beta2 = NULL, nu1 = NULL, nu2 = NULL, eps = 1e-06)
Arguments:

Sigma1: symmetric, positive-definite matrix. The scatter matrix of the first distribution.

Sigma2: symmetric, positive-definite matrix. The scatter matrix of the second distribution.

distribution: the probability distribution. It can be "mggd" (multivariate generalized Gaussian distribution), "mcd" (multivariate Cauchy distribution) or "mtd" (multivariate t distribution).

beta1, beta2: numeric. If distribution = "mggd", the shape parameters of the first and second distributions; NULL otherwise.

nu1, nu2: numeric. If distribution = "mtd", the degrees of freedom of the first and second distributions; NULL otherwise.

eps: numeric. Precision for the computation of the Lauricella D-hypergeometric function or of its partial derivative (default: 1e-06).
Value:

A numeric value: the Kullback-Leibler divergence between the two distributions, with two attributes: attr(, "epsilon") (the precision attained in computing the Lauricella D-hypergeometric function or its partial derivative) and attr(, "k") (the number of iterations).
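These attributes can be read back with attr(); a quick illustration (the matrices here are arbitrary):

kl <- kld(diag(c(0.5, 0.4, 0.3)), diag(1, 3), "mcd")
attr(kl, "epsilon")  # precision attained in the computation
attr(kl, "k")        # number of iterations performed
as.numeric(kl)       # the divergence stripped of its attributes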
Author(s):

Pierre Santagostini, Nizar Bouhlel
References:

N. Bouhlel and A. Dziri, Kullback-Leibler Divergence Between Multivariate Generalized Gaussian Distributions. IEEE Signal Processing Letters, vol. 26, no. 7, July 2019. doi:10.1109/LSP.2019.2915000

N. Bouhlel and D. Rousseau, A Generic Formula and Some Special Cases for the Kullback-Leibler Divergence between Central Multivariate Cauchy Distributions. Entropy, 24, 838, 2022. doi:10.3390/e24060838

N. Bouhlel and D. Rousseau, Exact Rényi and Kullback-Leibler Divergences Between Multivariate t-Distributions. IEEE Signal Processing Letters, 2023. doi:10.1109/LSP.2023.3324594
Examples:

# Generalized Gaussian distributions
beta1 <- 0.74
beta2 <- 0.55
Sigma1 <- matrix(c(0.8, 0.3, 0.2, 0.3, 0.2, 0.1, 0.2, 0.1, 0.2), nrow = 3)
Sigma2 <- matrix(c(1, 0.3, 0.2, 0.3, 0.5, 0.1, 0.2, 0.1, 0.7), nrow = 3)
# Kullback-Leibler divergence
kl12 <- kld(Sigma1, Sigma2, "mggd", beta1 = beta1, beta2 = beta2)
kl21 <- kld(Sigma2, Sigma1, "mggd", beta1 = beta2, beta2 = beta1)
print(kl12)
print(kl21)
# Distance (symmetrized Kullback-Leibler divergence)
kldist <- as.numeric(kl12) + as.numeric(kl21)
print(kldist)
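The symmetrized divergence above can be wrapped in a small helper (kl_sym is a hypothetical convenience function, not part of the package; note that the distribution parameters must be swapped along with the scatter matrices):

# Hypothetical helper: symmetrized Kullback-Leibler divergence.
# Parameters are swapped together with the scatter matrices.
kl_sym <- function(S1, S2, distribution, beta1 = NULL, beta2 = NULL,
                   nu1 = NULL, nu2 = NULL, eps = 1e-06) {
  as.numeric(kld(S1, S2, distribution, beta1 = beta1, beta2 = beta2,
                 nu1 = nu1, nu2 = nu2, eps = eps)) +
    as.numeric(kld(S2, S1, distribution, beta1 = beta2, beta2 = beta1,
                   nu1 = nu2, nu2 = nu1, eps = eps))
}
kl_sym(Sigma1, Sigma2, "mggd", beta1 = beta1, beta2 = beta2)  # same value as kldist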
# Cauchy distributions
Sigma1 <- matrix(c(1, 0.6, 0.2, 0.6, 1, 0.3, 0.2, 0.3, 1), nrow = 3)
Sigma2 <- matrix(c(1, 0.3, 0.1, 0.3, 1, 0.4, 0.1, 0.4, 1), nrow = 3)
kld(Sigma1, Sigma2, "mcd")
kld(Sigma2, Sigma1, "mcd")
Sigma1 <- matrix(c(0.5, 0, 0, 0, 0.4, 0, 0, 0, 0.3), nrow = 3)
Sigma2 <- diag(1, 3)
# Case when all eigenvalues of Sigma1 %*% solve(Sigma2) are < 1
kld(Sigma1, Sigma2, "mcd")
# Case when all eigenvalues of Sigma1 %*% solve(Sigma2) are > 1
kld(Sigma2, Sigma1, "mcd")
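The eigenvalues that distinguish these two cases can be checked directly with base R:

# All eigenvalues of Sigma1 %*% solve(Sigma2) are < 1 here (0.5, 0.4, 0.3);
# swapping the matrices makes them all > 1.
eigen(Sigma1 %*% solve(Sigma2), only.values = TRUE)$values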
# Student (multivariate t) distributions
nu1 <- 2
Sigma1 <- matrix(c(2, 1.2, 0.4, 1.2, 2, 0.6, 0.4, 0.6, 2), nrow = 3)
nu2 <- 4
Sigma2 <- matrix(c(1, 0.3, 0.1, 0.3, 1, 0.4, 0.1, 0.4, 1), nrow = 3)
# Kullback-Leibler divergence
kld(Sigma1, Sigma2, "mtd", nu1 = nu1, nu2 = nu2)
kld(Sigma2, Sigma1, "mtd", nu1 = nu2, nu2 = nu1)
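The hypothetical kl_sym helper sketched above applies here as well; the degrees of freedom are swapped along with the scatter matrices:

kl_sym(Sigma1, Sigma2, "mtd", nu1 = nu1, nu2 = nu2)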