View source: R/shannon_entropy.R
Description

In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent in the variable's possible outcomes. The concept of information entropy was introduced by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication".
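The body of R/shannon_entropy.R is referenced above but not reproduced on this page. As a minimal sketch of the standard definition H(X) = -sum(p_i * log2(p_i)), an implementation could look like the following; the log base (bits vs. nats) and the handling of zero probabilities are assumptions here and may differ from the packaged version:

shannon_entropy <- function(probs) {
  probs <- probs[probs > 0]    # drop zeros: 0 * log(0) contributes 0 by convention
  -sum(probs * log2(probs))    # base-2 logs give entropy in bits (sketch only)
}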
Usage

shannon_entropy(probs)
Arguments

probs    numeric vector of probabilities.
Value

numeric
Examples

p <- c(6 / 26, 20 / 26)
shannon_entropy(p)
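Assuming the implementation uses base-2 logarithms, this example evaluates to -(6/26) * log2(6/26) - (20/26) * log2(20/26), roughly 0.779 bits; with natural logarithms the result would instead be about 0.540 nats.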