mutinformation: mutual information computation

mutual information computation

Description

mutinformation takes two random variables as input and computes their mutual information in nats, according to the chosen entropy estimator (the method argument). If Y is not supplied and X is a matrix-like object such as a data.frame, the function returns the matrix of mutual information between all pairs of variables in the dataset X.

Usage

mutinformation(X, Y, method="emp")

Arguments

X

a vector/factor denoting a random variable, or a data.frame denoting a random vector whose columns contain variables/features and whose rows contain outcomes/samples.

Y

another random variable or random vector (vector/factor or data.frame).

method

The name of the entropy estimator. The package implements four estimators: "emp", "mm", "shrink", and "sg" (default: "emp"); see Details. These estimators require discrete data values; see discretize.
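
Since every estimator works on discrete values, continuous columns are typically binned first with discretize from this package. A minimal sketch, assuming the USArrests dataset shipped with R; the equal-frequency binning and bin count are illustrative choices, not required settings:

  library(infotheo)

  data(USArrests)
  # Bin each continuous column into 5 equal-frequency intervals
  # (illustrative settings; discretize() also has sensible defaults).
  dat <- discretize(USArrests, disc = "equalfreq", nbins = 5)
  # The discrete columns can now be passed to mutinformation().
  mutinformation(dat$Murder, dat$Assault, method = "emp")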

Details

  • "emp" : This estimator computes the entropy of the empirical probability distribution.

  • "mm" : This is the Miller-Madow asymptotic bias corrected empirical estimator.

  • "shrink" : This is a shrinkage estimate of the entropy of a Dirichlet probability distribution.

  • "sg" : This is the Schurmann-Grassberger estimate of the entropy of a Dirichlet probability distribution.

Value

mutinformation returns the mutual information I(X;Y) in nats. When Y is omitted and X is a data.frame, it returns the mutual information matrix (MIM) over all pairs of variables, also in nats.
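
With the default empirical estimator, the returned value obeys the standard identity I(X;Y) = H(X) + H(Y) - H(X,Y). A minimal sketch checking this with entropy and natstobits from this package (see See Also); the dataset is illustrative:

  library(infotheo)

  data(USArrests)
  dat <- discretize(USArrests)
  X <- dat$Murder
  Y <- dat$Assault
  I <- mutinformation(X, Y)
  # H(X) + H(Y) - H(X,Y), where entropy(data.frame(X, Y)) is the joint entropy.
  H <- entropy(X) + entropy(Y) - entropy(data.frame(X, Y))
  all.equal(I, H)   # TRUE up to floating-point error
  natstobits(I)     # the same quantity expressed in bits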

Author(s)

Patrick E. Meyer

References

Meyer, P. E. (2008). Information-Theoretic Variable Selection and Network Inference from Microarray Data. PhD thesis, Université Libre de Bruxelles.

Cover, T. M. and Thomas, J. A. (1990). Elements of Information Theory. John Wiley, New York.

See Also

condinformation, multiinformation, interinformation, natstobits

Examples

  library(infotheo)

  data(USArrests)
  # Discretize the continuous variables before estimating mutual information.
  dat <- discretize(USArrests)
  # Compute the MIM (mutual information matrix) over all pairs of variables.
  I <- mutinformation(dat, method = "emp")
  # Mutual information between the first two variables only.
  I2 <- mutinformation(dat[, 1], dat[, 2])
