entropy.MillerMadow: Miller-Madow Entropy Estimator

View source: R/entropy.MillerMadow.R

Description

entropy.MillerMadow estimates the Shannon entropy H of the random variable Y from the corresponding observed counts y, using the Miller-Madow bias correction to the empirical entropy estimate.

Usage

entropy.MillerMadow(y, unit=c("log", "log2", "log10"))

Arguments

y

vector of counts.

unit

the unit in which entropy is measured. The default unit="log" computes entropy in nats (natural units). For computing entropy in bits, set unit="log2".

Details

The Miller-Madow entropy estimator (Miller, 1955) is the bias-corrected empirical entropy estimate. It adds the correction (m-1)/(2n) to the empirical entropy, where m is the number of bins with nonzero counts and n is the sample size.

Note that the Miller-Madow estimator is not a plug-in estimator, hence there are no explicit underlying bin frequencies.
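The correction described above can be sketched by hand in a few lines of base R (assuming the standard bias correction H_MM = H_ML + (m-1)/(2n); the variable names here are illustrative, not part of the package):

```r
# Miller-Madow correction computed by hand.
# Assumes the standard bias correction H_MM = H_ML + (m - 1) / (2 n),
# where m is the number of occupied bins and n the total count.
y <- c(4, 2, 3, 0, 2, 4, 0, 0, 2, 1, 1)

n <- sum(y)               # sample size
m <- sum(y > 0)           # number of bins with nonzero counts
p <- y[y > 0] / n         # empirical bin frequencies

H.ML <- -sum(p * log(p))            # empirical (maximum-likelihood) entropy
H.MM <- H.ML + (m - 1) / (2 * n)    # Miller-Madow bias correction

H.MM  # 2.152593, matching entropy.MillerMadow(y) below
```

For these counts, n = 19 and m = 8, so the correction is 7/38 ≈ 0.184, which accounts for the difference between the empirical and Miller-Madow estimates in the Examples section.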

Value

entropy.MillerMadow returns an estimate of the Shannon entropy.

Author(s)

Korbinian Strimmer (https://strimmerlab.github.io).

References

Miller, G. 1955. Note on the bias of information estimates. Info. Theory Psychol. Prob. Methods II-B:95-100.

See Also

entropy.empirical

Examples

# load entropy library 
library("entropy")

# observed counts for each bin
y = c(4, 2, 3, 0, 2, 4, 0, 0, 2, 1, 1)  

# estimate entropy using Miller-Madow method
entropy.MillerMadow(y)

# compare to empirical estimate
entropy.empirical(y)

Example output

[1] 2.152593
[1] 1.968382

entropy documentation built on Oct. 3, 2021, 1:06 a.m.