KLPQ: Computes and returns the Kullback-Leibler divergence between two probability distributions

View source: R/KLPQ.R

KLPQ  R Documentation

Computes and returns the Kullback-Leibler divergence between two probability distributions

Description

Given two probability distributions (vectors) of the same length, this function computes the Kullback-Leibler divergence (relative entropy) between them. If any entries in either distribution are 0, the argument JITTER can be used to add a tiny offset so that the logarithm and division remain defined.
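For vectors P and Q of length n, the divergence is D(P||Q) = sum over i of P[i] * log(P[i] / Q[i]). The following is a minimal sketch of that computation; it assumes natural logarithms and does no input validation, and the name klpq_sketch is illustrative rather than the packaged implementation:

klpq_sketch <- function(P, Q, JITTER = 0) {
  # Optionally offset both vectors so that log() and division stay defined
  # when entries are 0 (whether the package also renormalizes is not stated)
  P <- P + JITTER
  Q <- Q + JITTER
  # Sum the elementwise contributions P[i] * log(P[i] / Q[i])
  sum(P * log(P / Q))
}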

Usage

KLPQ(P = c(0.36, 0.48, 0.16), Q = c(0.333, 0.333, 0.333), JITTER = 0.0)

Arguments

P

A first probability distribution vector.

Q

A second probability distribution vector.

JITTER

An optional tiny value added to the probabilities to avoid zero entries (e.g., 0.000000001).

Value

The function returns a single numeric value: the Kullback-Leibler divergence. A value of 0 (zero) indicates that the two distributions are identical.
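For example, passing the same vector as both arguments gives log(P/Q) = log(1) = 0 for every entry, so the divergence is exactly 0:

KLPQ(P = c(0.25, 0.25, 0.5), Q = c(0.25, 0.25, 0.5))  # expected: 0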

Note

Used to provide a uni-directional divergence measure. Note that the divergence is asymmetric: KL(P||Q) does not necessarily equal KL(Q||P).
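For instance, swapping the two default arguments generally yields a different value:

KLPQ(P = c(0.36, 0.48, 0.16), Q = c(0.333, 0.333, 0.333))
KLPQ(P = c(0.333, 0.333, 0.333), Q = c(0.36, 0.48, 0.16))
# The two calls return different divergences in general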

Author(s)

Tarmo K. Remmel

References

None currently.

See Also

patternbits

Examples

KLPQ(P = c(0.36, 0.48, 0.16), Q = c(0.333, 0.333, 0.333), JITTER = 0.0)
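
A further example (the vectors here are illustrative): when a distribution contains a zero entry, a small JITTER keeps the computation finite.

KLPQ(P = c(0.5, 0.5, 0.0), Q = c(0.333, 0.333, 0.333), JITTER = 0.000000001)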
