condinformation: conditional mutual information computation


Description

condinformation takes three random variables as input and computes the conditional mutual information in nats according to the chosen entropy estimator (see method). If S is not supplied, the function returns the mutual information between X and Y; see mutinformation.
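
A minimal sketch of that fallback behaviour when S is omitted, reusing the discretized USArrests data from the Examples below (illustration only; it assumes the infotheo package is attached):

  library(infotheo)
  dat <- discretize(USArrests)
  # with S left at its default NULL, the unconditional mutual information is returned
  condinformation(dat[,1], dat[,2])
  mutinformation(dat[,1], dat[,2])   # same value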

Usage

condinformation(X, Y, S=NULL, method="emp")

Arguments

X

vector/factor denoting a random variable or a data.frame denoting a random vector where columns contain variables/features and rows contain outcomes/samples.

Y

another random variable or random vector (vector/factor or data.frame).

S

the conditioning random variable or random vector (vector/factor or data.frame).

method

The name of the entropy estimator. The package implements four estimators: "emp", "mm", "shrink" and "sg" (default: "emp"); see Details. These estimators require discrete data values; see discretize.

Details

"emp" : This estimator computes the entropy of the empirical probability distribution.

"mm" : This is the Miller-Madow asymptotic bias-corrected empirical estimator.

"shrink" : This is a shrinkage estimate of the entropy of a Dirichlet probability distribution.

"sg" : This is the Schurmann-Grassberger estimate of the entropy of a Dirichlet probability distribution.

Value

condinformation returns the conditional mutual information, I(X;Y|S), in nats.
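
For reference, the estimated quantity can be written in terms of joint entropies (a standard identity, not necessarily the exact sequence of operations carried out internally):

I(X;Y|S) = H(X,S) + H(Y,S) - H(S) - H(X,Y,S)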

Author(s)

Patrick E. Meyer

References

Meyer, P. E. (2008). Information-Theoretic Variable Selection and Network Inference from Microarray Data. PhD thesis of the Universite Libre de Bruxelles.

Cover, T. M. and Thomas, J. A. (1990). Elements of Information Theory. John Wiley, New York.

See Also

mutinformation, multiinformation, interinformation, natstobits

Examples

  data(USArrests)
  dat <- discretize(USArrests)
  I <- condinformation(dat[,1], dat[,2], dat[,3], method="emp")
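
Continuing the example above, a hedged sketch that cross-checks the result against the joint-entropy identity given under Value and converts it from nats to bits; entropy and natstobits are companion functions of the package, and the check is only an illustration of the identity, not part of the documented interface:

  # cross-check: I(X;Y|S) = H(X,S) + H(Y,S) - H(S) - H(X,Y,S)
  I.check <- entropy(dat[,c(1,3)]) + entropy(dat[,c(2,3)]) -
             entropy(dat[,3]) - entropy(dat[,1:3])
  all.equal(I, I.check)

  # express the conditional mutual information in bits instead of nats
  natstobits(I)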
