KL.z: Z Estimator of Kullback-Leibler Divergence


View source: R/EntropEst.r

Description

Returns the Z estimator of Kullback-Leibler Divergence, which has exponentially decaying bias. See Zhang and Grabchak (2014b) for details.

Usage

KL.z(x, y)

Arguments

x

Vector of counts from the first distribution. Must be integer valued. Each entry represents the number of observations of a distinct letter.

y

Vector of counts from the second distribution. Must be integer valued. Each entry represents the number of observations of a distinct letter.
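For context on how these count vectors are used, the sketch below shows a naive plug-in estimate of Kullback-Leibler divergence, which normalizes each count vector to an empirical distribution and computes sum(p * log(p/q)). This is only an illustration of the inputs, not the Z estimator of Zhang and Grabchak (2014b); the function name `kl_plugin` is hypothetical, and the plug-in estimator (unlike the Z estimator) has slowly decaying bias.

```r
# Hypothetical plug-in KL estimator for comparison (NOT the Z estimator).
# Assumes every letter with a positive count in x also has a positive
# count in y, so that log(p/q) is finite.
kl_plugin <- function(x, y) {
  p <- x / sum(x)  # empirical distribution of the first sample
  q <- y / sum(y)  # empirical distribution of the second sample
  sum(p * log(p / q))
}

x <- c(1, 3, 7, 4, 8)
y <- c(2, 5, 1, 3, 6)
kl_plugin(x, y)
```

Note that KL divergence is asymmetric, so kl_plugin(x, y) and kl_plugin(y, x) generally differ, just as KL.z(x, y) and KL.z(y, x) do in the Examples section.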

Author(s)

Lijuan Cao and Michael Grabchak

References

Z. Zhang and M. Grabchak (2014b). Nonparametric Estimation of Kullback-Leibler Divergence. Neural Computation, 26(11): 2570-2593.

Examples

 x <- c(1, 3, 7, 4, 8)
 y <- c(2, 5, 1, 3, 6)
 KL.z(x, y)
 KL.z(y, x)

EntropyEstimation documentation built on May 29, 2017, 9:08 a.m.
