shannon_entropy: Calculate Shannon Entropy


View source: R/shannon_entropy.R

Description

In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent in the variable's possible outcomes. The concept of information entropy was introduced by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication".
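
For a discrete distribution with probabilities p_1, ..., p_n, the entropy (in bits, i.e. using the base-2 logarithm) is defined as:

H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i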

Usage

shannon_entropy(probs)

Arguments

probs

A numeric vector of probabilities.

Value

A numeric scalar: the Shannon entropy of probs.

Examples

p <- c(6 / 26, 20 / 26)
shannon_entropy(p)
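
The exact implementation lives in R/shannon_entropy.R; a minimal sketch of the calculation, assuming base-2 logarithms and that zero probabilities contribute nothing (shannon_entropy_sketch is a hypothetical name used here for illustration):

shannon_entropy_sketch <- function(probs) {
  probs <- probs[probs > 0]   # drop zeros: 0 * log(0) is treated as 0
  -sum(probs * log2(probs))   # H = -sum(p_i * log2(p_i))
}

p <- c(6 / 26, 20 / 26)
shannon_entropy_sketch(p)     # ~0.779

With the example vector above, this yields roughly 0.779 bits; the packaged function may use a different logarithm base.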
