EntropyEstimation-package: Estimation of Entropy and Related Quantities


Estimation of Entropy and Related Quantities

Description

Contains methods for the estimation of Shannon's entropy, variants of Renyi's entropy, mutual information, Kullback-Leibler divergence, and generalized Simpson's indices. These estimators have a bias that decays exponentially fast. For more information see Zhang and Zhou (2010), Zhang (2012), Zhang (2013), Zhang and Grabchak (2013), Zhang and Grabchak (2014a), Zhang and Grabchak (2014b), and Zhang and Zheng (2014).
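The exponentially decaying bias comes from the entropy series representation H = Σ_{v≥1} ζ_v / v with ζ_v = Σ_k p_k (1 − p_k)^v, where each ζ_v up to order n − 1 is estimated without bias from the sample counts (Zhang, 2012; the package exposes this estimator in R as Entropy.z). A minimal Python sketch of that construction, written from the cited paper rather than from the package's R source, taking a list of category counts y_1, …, y_K from a sample of size n:

```python
import math

def entropy_plugin(counts):
    """Naive plug-in (maximum likelihood) entropy estimate, for comparison."""
    n = sum(counts)
    return -sum((y / n) * math.log(y / n) for y in counts if y > 0)

def entropy_z(counts):
    """Sketch of Zhang's (2012) entropy estimator.

    Each category with count y contributes
        (y/n) * sum_{v=1}^{n-y} (1/v) * prod_{j=1}^{v} (n - y + 1 - j)/(n - j),
    which is an algebraically simplified form of the unbiased estimates of
    the series terms zeta_v truncated at v = n - 1 (our reading of the paper).
    """
    n = sum(counts)
    total = 0.0
    for y in counts:
        if y == 0:
            continue
        prod = 1.0  # running product of (n - y + 1 - j)/(n - j), each factor <= 1
        s = 0.0
        for v in range(1, n - y + 1):
            prod *= (n - y + 1 - v) / (n - v)
            s += prod / v
        total += (y / n) * s
    return total
```

For the counts [3, 2, 1] this gives 47/120 + 77/180 + 137/360 = 1.2, somewhat above the plug-in value of about 1.011, whose negative bias the construction is designed to reduce. This is only an illustrative sketch; for actual use the package's R functions should be preferred.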

Details

Package: EntropyEstimation
Type: Package
Version: 1.2.1
Date: 2024-09-14
License: GPL-3

Author(s)

Lijuan Cao <lcao2@charlotte.edu> and Michael Grabchak <mgrabcha@charlotte.edu>

References

Z. Zhang (2012). Entropy estimation in Turing's perspective. Neural Computation 24(5), 1368–1389.

Z. Zhang (2013). Asymptotic normality of an entropy estimator with asymptotically decaying bias. IEEE Transactions on Information Theory 59(1), 504–508.

Z. Zhang and M. Grabchak (2013). Bias adjustment for a nonparametric entropy estimator. Entropy 15(6), 1999–2011.

Z. Zhang and M. Grabchak (2014a). Entropic representation and estimation of diversity indices. http://arxiv.org/abs/1403.3031.

Z. Zhang and M. Grabchak (2014b). Nonparametric estimation of Kullback-Leibler divergence. Neural Computation 26(11), 2570–2593.

Z. Zhang and L. Zheng (2014). A Mutual Information Estimator with Exponentially Decaying Bias.

Z. Zhang and J. Zhou (2010). Re-parameterization of multinomial distributions and diversity indices. Journal of Statistical Planning and Inference 140(7), 1731–1738.


EntropyEstimation documentation built on Sept. 30, 2024, 9:34 a.m.