condentropy: conditional entropy computation


condentropy    R Documentation

conditional entropy computation

Description

condentropy takes two random vectors, X and Y, as input and returns the conditional entropy, H(X|Y), in nats (base e), estimated with the entropy estimator selected by method. If Y is not supplied, the function returns the entropy of X; see entropy.
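
For the empirical estimator the result obeys the chain rule H(X|Y) = H(X,Y) - H(Y). A minimal sketch of this check, assuming the package is loaded and the data are discretized as in the Examples below:

  # Chain-rule check (a sketch, not the package internals)
  library(infotheo)
  dat <- discretize(USArrests)
  h1 <- condentropy(dat[, 1], dat[, 2])              # H(X|Y), "emp"
  h2 <- entropy(dat[, c(1, 2)]) - entropy(dat[, 2])  # H(X,Y) - H(Y)
  all.equal(h1, h2)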

Usage

condentropy(X, Y=NULL, method="emp")

Arguments

X

A data.frame representing a random variable or random vector, with variables/features in columns and outcomes/samples in rows.

Y

A data.frame representing the conditioning random variable or random vector, with variables/features in columns and outcomes/samples in rows.

method

The name of the entropy estimator. The package implements four estimators: "emp", "mm", "shrink", and "sg" (default: "emp"); see Details. All of these estimators require discrete data values; see discretize.

Details

  • "emp" : This estimator computes the entropy of the empirical probability distribution.

  • "mm" : This is the Miller-Madow asymptotic bias corrected empirical estimator.

  • "shrink" : This is a shrinkage estimate of the entropy of a Dirichlet probability distribution.

  • "sg" : This is the Schurmann-Grassberger estimate of the entropy of a Dirichlet probability distribution.

Value

condentropy returns the conditional entropy, H(X|Y), of X given Y in nats.
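
Because the value is in nats, it can be converted to bits with natstobits (division by log(2)); for instance, for discretized data X and Y:

  natstobits(condentropy(X, Y))   # conditional entropy in bits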

Author(s)

Patrick E. Meyer

References

Meyer, P. E. (2008). Information-Theoretic Variable Selection and Network Inference from Microarray Data. PhD thesis, Université Libre de Bruxelles.

Cover, T. M. and Thomas, J. A. (1991). Elements of Information Theory. John Wiley & Sons, New York.

See Also

entropy, mutinformation, natstobits

Examples

  library(infotheo)
  data(USArrests)
  # Discretize the continuous variables into bins
  dat <- discretize(USArrests)
  # Conditional entropy H(Murder | Assault) with the Miller-Madow estimator
  H <- condentropy(dat[, 1], dat[, 2], method = "mm")
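
  # The four estimators can be compared on the same pair of
  # variables (a usage sketch; all values are in nats)
  sapply(c("emp", "mm", "shrink", "sg"),
         function(m) condentropy(dat[, 1], dat[, 2], method = m))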
