entropy.plugin: Plug-In Entropy Estimator


View source: R/entropy.plugin.R

Description

entropy.plugin computes the Shannon entropy H of a discrete random variable with the specified frequencies (probability mass function).

Usage

entropy.plugin(freqs, unit=c("log", "log2", "log10"))

Arguments

freqs

frequencies (probability mass function).

unit

the unit in which entropy is measured. The default unit="log" gives entropy in nats (natural units). For computing entropy in bits set unit="log2", and for base-10 units set unit="log10".

Details

The Shannon entropy of a discrete random variable is defined as H = -∑_k p(k) log(p(k)), where p is its probability mass function.
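
A minimal sketch of this plug-in computation (illustrative only, not the package's internal implementation; it assumes the usual convention 0 log(0) = 0):

# Sketch: normalize the frequencies and apply H = -sum(p * log(p)),
# treating zero entries as contributing 0 (the 0*log(0) = 0 convention).
plugin.sketch = function(freqs)
{
  p = freqs/sum(freqs)   # normalize to a probability mass function
  p = p[p > 0]           # drop zeros so that 0*log(0) counts as 0
  -sum(p*log(p))         # entropy in nats
}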

Value

entropy.plugin returns the Shannon entropy.

Author(s)

Korbinian Strimmer (https://strimmerlab.github.io).

See Also

entropy, entropy.empirical, entropy.shrink, mi.plugin, KL.plugin, discretize.

Examples

# load entropy library 
library("entropy")

# some frequencies
freqs = c(0.2, 0.1, 0.15, 0.05, 0, 0.3, 0.2)  

# and corresponding entropy
entropy.plugin(freqs)

Example output

[1] 1.66958
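
To obtain the entropy in bits rather than nats, set unit="log2" (illustrative, assuming the same freqs vector as above); mathematically this equals the natural-log value divided by log(2):

# entropy in bits: equals 1.66958/log(2), approximately 2.41
entropy.plugin(freqs, unit="log2")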
