KLD: Kullback-Leibler divergence (KLD)

Description Usage Arguments Value Examples

View source: R/KLD.R

Description

Calculate the Kullback-Leibler divergence between two probability distributions.
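For reference, on a common discrete support the forward divergence between distributions P and Q is conventionally defined as (P, Q, and i here are generic notation, not arguments of this function):

KLD(P || Q) = sum_i P(i) * log(P(i) / Q(i))

with the logarithm taken in the chosen base; the backward divergence, KLD(Q || P), swaps the roles of P and Q. The divergence is not symmetric, so the forward and backward values generally differ.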

Usage

KLD(px, py, base = exp(1))

Arguments

px

a discrete probability distribution (the first distribution, P)

py

a discrete probability distribution (the second distribution, Q)

base

the logarithmic base; defaults to e = exp(1) (natural logarithm)

Value

the forward Kullback-Leibler divergence, KLD(P || Q), and the backward Kullback-Leibler divergence, KLD(Q || P), of the discrete probability distributions px and py
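The returned quantities can be sketched in plain R. This is a minimal illustration of the forward/backward computation, not the package's actual source (that lives in R/KLD.R); in particular, the step normalizing raw weights to probabilities is an assumption:

```r
# Minimal sketch of forward/backward KL divergence (not the package source).
# Assumes px and py are nonnegative weight vectors on the same support,
# which we normalize to probabilities before computing the divergences.
kld_sketch <- function(px, py, base = exp(1)) {
  px <- px / sum(px)   # normalize to a probability vector (assumption)
  py <- py / sum(py)
  forward  <- sum(px * log(px / py, base = base))  # KLD(P || Q)
  backward <- sum(py * log(py / px, base = base))  # KLD(Q || P)
  list(forward = forward, backward = backward)
}

kld_sketch(c(105, 24, 10, 2, 120, 56), c(1, 4, 8, 15, 200, 78))
```

Note the two returned values differ because KL divergence is not symmetric; a real implementation would also need to guard against zero probabilities, where log(px / py) is undefined.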

Examples

## Not run: 
p <- c(105, 24, 10, 2, 120, 56)  # first distribution (unnormalized weights)
q <- c(1, 4, 8, 15, 200, 78)     # second distribution (unnormalized weights)
KLD(p, q)                  # natural-log base (default)
KLD(p, q, base = exp(1))   # same as the default
KLD(p, q, base = 2)        # divergence measured in bits

## End(Not run)

YimengSun21/StatComp21062 documentation built on Dec. 23, 2021, 10:18 p.m.