CE: Shannon's Conditional-Entropy H(X | Y)

View source: R/CE.R

R Documentation

Shannon's Conditional-Entropy H(X | Y)

Description

Computes Shannon's Conditional-Entropy via the chain rule H(X | Y) = H(X,Y) - H(Y), given a joint-probability vector P(X,Y) and a probability vector P(Y).

Usage

CE(xy, y, unit = "log2")

Arguments

xy

a numeric joint-probability vector P(X,Y) for which Shannon's Joint-Entropy H(X,Y) shall be computed.

y

a numeric probability vector P(Y) for which Shannon's Entropy H(Y) (the subtracted term of the chain rule) shall be computed. Note that this vector must be the marginal probability distribution of the random variable Y.

unit

a character string specifying the logarithm unit that shall be used in the underlying entropy computations; defaults to "log2" (bit).

Details

This function is useful for quickly computing Shannon's Conditional-Entropy from any given joint-probability vector and probability vector.
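The chain rule behind the computation can be sketched in base R. This is an illustrative sketch, not the package's implementation; the 2x2 joint distribution and the helper entropy function are assumptions for the example, with entropies in bit to match the default unit = "log2".

```r
# Sketch of the chain rule H(X|Y) = H(X,Y) - H(Y) for a hypothetical
# 2x2 uniform joint distribution, with entropies computed in bit (log2).
xy <- c(0.25, 0.25, 0.25, 0.25)  # joint P(X,Y), flattened to a vector
y  <- c(0.5, 0.5)                # marginal P(Y)

# Shannon entropy in bit, skipping zero-probability entries
H <- function(p) -sum(p[p > 0] * log2(p[p > 0]))

ce <- H(xy) - H(y)  # chain rule: H(X,Y) - H(Y)
ce                  # 1 bit of uncertainty remains in X after observing Y
```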

Value

Shannon's Conditional-Entropy H(X | Y) (in bit for the default unit = "log2").

Note

The probability vector P(Y) must be the marginal probability distribution of the random variable Y, since its entropy H(Y) is the term subtracted in the chain rule H(X | Y) = H(X,Y) - H(Y).
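When the joint distribution is given as a table, the required marginal P(Y) can be obtained by summing over X. The following is a sketch with a hypothetical 2x2 joint table, assuming rows index X and columns index Y:

```r
# Hypothetical joint distribution P(X,Y): rows index X, columns index Y
pxy <- matrix(c(0.1, 0.2, 0.3, 0.4), nrow = 2)

py <- colSums(pxy)    # marginal P(Y): sum the joint over X
xy <- as.vector(pxy)  # joint flattened to a vector, as CE(xy, y) expects

py  # 0.3 0.7
```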

Author(s)

Hajk-Georg Drost

References

Shannon, Claude E. 1948. "A Mathematical Theory of Communication". Bell System Technical Journal 27 (3): 379-423.

See Also

H, JE

Examples

CE(1:10 / sum(1:10), 1:10 / sum(1:10))


philentropy documentation built on Nov. 10, 2022, 6:18 p.m.