entropy: Information entropy

View source: R/entropy.R

entropy {cooltools}    R Documentation

Information entropy

Description

Computes the information entropy H = -sum(p*log_b(p)), also known as Shannon entropy, of a probability vector p.
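
For example, a fair coin, p = c(0.5, 0.5), has entropy H = log(2) ≈ 0.693 nats, which is exactly 1 bit for b = 2.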

Usage

entropy(p, b = exp(1), normalize = TRUE)

Arguments

p

vector of probabilities; typically normalized such that sum(p) = 1.

b

base of the logarithm; the default exp(1) gives natural logarithms.

normalize

logical flag. If TRUE (default), the vector p is automatically normalized to sum to 1.

Value

Returns the information entropy in units that depend on b. If b=2, the units are bits; if b=exp(1), the units are nats; if b=10, the units are dits.

Author(s)

Danail Obreschkow
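
Examples

A minimal usage sketch (not taken from the package source), assuming entropy() returns exactly the quantity documented above:

## fair coin: two equally likely outcomes
p <- c(0.5, 0.5)

entropy(p)           # log(2) = 0.6931... nats (default base b = exp(1))
entropy(p, b = 2)    # exactly 1 bit
entropy(p, b = 2) - entropy(p)/log(2)   # change-of-base identity, ~ 0

## unnormalized input: normalize = TRUE (default) rescales to c(0.5, 0.5)
entropy(c(1, 1), b = 2)                     # 1 bit
## with normalize = FALSE the documented formula is applied to p as given,
## so -sum(1*log2(1) + 1*log2(1)) = 0
entropy(c(1, 1), b = 2, normalize = FALSE)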

