SymKL.sd: Asymptotic Standard Deviation for the Z Estimator of Symmetrized Kullback-Leibler Divergence


View source: R/EntropEst.r

Description

Returns the estimated asymptotic standard deviation for the Z estimator of the symmetrized Kullback-Leibler divergence. Note that this is also the asymptotic standard deviation of the plug-in estimator. See Zhang and Grabchak (2014b) for details.

Usage

SymKL.sd(x, y)

Arguments

x

Vector of counts from first distribution. Must be integer valued. Each entry represents the number of observations of a distinct letter.

y

Vector of counts from second distribution. Must be integer valued. Each entry represents the number of observations of a distinct letter.

Author(s)

Lijuan Cao and Michael Grabchak

References

Z. Zhang and M. Grabchak (2014b). Nonparametric Estimation of Kullback-Leibler Divergence. Neural Computation, DOI 10.1162/NECO_a_00646.

Examples

 library(EntropyEstimation)

 x = c(1,3,7,4,8) # counts from the first distribution
 y = c(2,5,1,3,6) # counts from the second distribution
 SymKL.sd(x,y)    # estimated asymptotic standard deviation
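
As a further, hedged illustration (not part of the package documentation): the estimated standard deviation can be combined with a point estimate to form an approximate normal confidence interval. The sketch below assumes that SymKL.z is the package's Z estimator of the symmetrized Kullback-Leibler divergence, and that dividing the returned standard deviation by the square root of the first sample's size gives the standard error; consult Zhang and Grabchak (2014b) for the exact asymptotic normalization before relying on this scaling.

 library(EntropyEstimation)

 x <- c(1,3,7,4,8)   # counts from the first distribution
 y <- c(2,5,1,3,6)   # counts from the second distribution

 est  <- SymKL.z(x, y)   # point estimate (SymKL.z assumed to be the package's Z estimator)
 sdev <- SymKL.sd(x, y)  # estimated asymptotic standard deviation

 ## Assumed normalization: standard error = sdev / sqrt(sample size of x).
 ## Check Zhang and Grabchak (2014b) for the exact scaling.
 n  <- sum(x)
 ci <- est + c(-1, 1) * qnorm(0.975) * sdev / sqrt(n)

 cat("Estimate:", est, "\n")
 cat("Approximate 95% CI: [", ci[1], ",", ci[2], "]\n")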
