kullback: Kullback-Leibler divergence

View source: R/test.R

kullback        R Documentation

Kullback-Leibler divergence

Description

This function computes the Kullback-Leibler divergence between two mixtures of multidimensional ISR distributions.
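
For reference, the quantity computed is the standard Kullback-Leibler divergence between discrete distributions (textbook definition, not taken from the package sources), which in LaTeX notation reads

KL(f_1 \,\|\, f_2) = \sum_x f_1(x) \, \log \frac{f_1(x)}{f_2(x)},

where the sum runs over all (multivariate) ranks and f_1, f_2 denote the two mixtures of ISR distributions specified by the arguments below.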

Usage

kullback(proportion1, pi1, mu1, proportion2, pi2, mu2, m)

Arguments

proportion1, proportion2

vectors (each summing to 1) containing the K mixture proportions.

pi1, pi2

matrices of size K*p, where K is the number of clusters and p the number of dimensions, containing, for each cluster and dimension, the probability of a good comparison in the model (dispersion parameters).

mu1, mu2

matrices of size K*sum(m) containing the modal ranks. Each row contains the modal rank of one cluster. In the case of multivariate ranks, the modal ranks of the successive dimensions are placed one after another on the same row (see the sketch after this argument list).

m

a vector containing the size of the ranks for each dimension.
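
For multivariate ranks, the layout of the arguments can be sketched as follows (the shapes follow the descriptions above; the numeric values are purely illustrative):

# Hypothetical case: K = 2 clusters and p = 2 rank dimensions of sizes 4 and 3
m <- c(4, 3)                          # sizes of the ranks for each dimension
proportion1 <- c(0.5, 0.5)            # K mixture proportions, summing to 1
pi1 <- matrix(c(0.9, 0.8,
                0.7, 0.85),
              nrow = 2, byrow = TRUE) # K x p dispersion parameters
mu1 <- matrix(c(1, 2, 3, 4,  1, 2, 3, # cluster 1: modal rank of dimension 1, then dimension 2
                4, 3, 2, 1,  3, 1, 2),# cluster 2
              nrow = 2, byrow = TRUE) # K x sum(m) modal ranks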

Value

The Kullback-Leibler divergence between the two mixtures.

Author(s)

Quentin Grimonprez

References

http://en.wikipedia.org/wiki/Kullback-Leibler_divergence

Examples

library(Rankcluster)

# Mixture 1: two clusters on a single rank dimension of size 4 (m = 4)
proportion1 <- c(0.4, 0.6)
pi1 <- matrix(c(0.8, 0.75), nrow = 2)
mu1 <- matrix(c(1, 2, 3, 4, 4, 2, 1, 3), nrow = 2, byrow = TRUE)

# Mixture 2: same modal ranks, slightly different proportions and dispersions
proportion2 <- c(0.43, 0.57)
pi2 <- matrix(c(0.82, 0.7), nrow = 2)
mu2 <- matrix(c(1, 2, 3, 4, 4, 2, 1, 3), nrow = 2, byrow = TRUE)

# Kullback-Leibler divergence between the two mixtures
dK <- kullback(proportion1, pi1, mu1, proportion2, pi2, mu2, 4)
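# The Kullback-Leibler divergence is not symmetric in its two arguments.
# If a symmetric measure is wanted, a common convention (illustrative only,
# not part of the package API) is to average the two directions:
dKsym <- 0.5 * (kullback(proportion1, pi1, mu1, proportion2, pi2, mu2, 4) +
                kullback(proportion2, pi2, mu2, proportion1, pi1, mu1, 4))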

