View source: R/entropy.MillerMadow.R
entropy.MillerMadow estimates the Shannon entropy H of the random variable Y from the corresponding observed counts y, using the Miller-Madow correction to the empirical entropy.
entropy.MillerMadow(y, unit=c("log", "log2", "log10"))
y: vector of counts.
unit: the unit in which entropy is measured. The default is "nats" (natural units). For computing entropy in "bits" set unit="log2".
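For instance, using the counts from the Examples section below, the same estimate can be reported in bits rather than nats by switching the unit argument (no output is shown here; the value in bits is simply the value in nats divided by log(2)):

library("entropy")
y = c(4, 2, 3, 0, 2, 4, 0, 0, 2, 1, 1)
entropy.MillerMadow(y)                 # estimate in nats (default unit)
entropy.MillerMadow(y, unit="log2")    # same estimate expressed in bits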
The Miller-Madow entropy estimator (Miller, 1955) is the bias-corrected empirical entropy estimate: it adds the correction (m-1)/(2n) to the empirical (maximum likelihood) estimate, where m is the number of non-empty bins and n is the total number of counts.
Note that the Miller-Madow estimator is not a plug-in estimator, hence there are no explicit underlying bin frequencies.
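To make the correction concrete, here is a minimal sketch of the Miller-Madow estimate computed by hand. This is not the package's source code, and the helper name millermadow_sketch is made up for illustration; it simply adds (m-1)/(2n) to the empirical entropy and reproduces the value 2.152593 obtained in the Examples section below.

# minimal sketch (not the package implementation): empirical entropy plus (m-1)/(2n)
millermadow_sketch = function(y, unit=c("log", "log2", "log10")) {
  unit = match.arg(unit)
  n = sum(y)          # total number of counts
  m = sum(y > 0)      # number of non-empty bins
  p = y[y > 0]/n      # empirical bin frequencies
  H = -sum(p*log(p)) + (m - 1)/(2*n)   # bias-corrected entropy in nats
  if (unit == "log2")  H = H/log(2)    # convert nats to bits
  if (unit == "log10") H = H/log(10)   # convert nats to dits
  H
}
millermadow_sketch(c(4, 2, 3, 0, 2, 4, 0, 0, 2, 1, 1))   # 2.152593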
entropy.MillerMadow returns an estimate of the Shannon entropy.
Korbinian Strimmer (https://strimmerlab.github.io).
Miller, G. 1955. Note on the bias of information estimates. Info. Theory Psychol. Prob. Methods II-B:95-100.
# load entropy library
library("entropy")
# observed counts for each bin
y = c(4, 2, 3, 0, 2, 4, 0, 0, 2, 1, 1)
# estimate entropy using Miller-Madow method
entropy.MillerMadow(y)
# compare to empirical estimate
entropy.empirical(y)
[1] 2.152593
[1] 1.968382
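As a quick check on the Details section, the difference between the two estimates above is exactly the Miller-Madow correction (m-1)/(2n): with m = 8 non-empty bins and n = 19 total counts this is 7/38, or about 0.184211.

# y as defined in the example above
entropy.MillerMadow(y) - entropy.empirical(y)   # difference between the two estimates
(8 - 1)/(2*19)                                  # Miller-Madow correction, about 0.184211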