entropy | R Documentation
Computes the information entropy H = -sum(p*log_b(p)), also known as Shannon entropy, of a probability vector p.
entropy(p, b = exp(1), normalize = TRUE)
p | vector of probabilities; typically normalized, such that sum(p)=1.
b | base of the logarithm (default is e).
normalize | logical flag. If TRUE (default), the vector p is automatically normalized such that sum(p)=1.
Returns the information entropy in units that depend on b. If b=2, the units are bits; if b=exp(1), the units are nats; if b=10, the units are dits.
Danail Obreschkow
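The behavior described above can be sketched in a few lines of base R; this is a minimal illustration of the documented interface, not the package's actual implementation, which may handle edge cases differently.

```r
# Sketch of entropy() following the documented signature;
# the package's real implementation may differ.
entropy <- function(p, b = exp(1), normalize = TRUE) {
  if (normalize) p <- p / sum(p)   # rescale so that sum(p) = 1
  p <- p[p > 0]                    # treat 0*log(0) as 0
  -sum(p * log(p, base = b))       # H = -sum(p*log_b(p))
}

# A fair coin carries exactly 1 bit of entropy:
entropy(c(0.5, 0.5), b = 2)  # 1
```

With b = 2 a uniform distribution over n outcomes gives log2(n) bits, e.g. entropy(rep(1, 4), b = 2) returns 2.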