condinformation | R Documentation
Description

condinformation takes three random variables as input and computes the conditional mutual information in nats according to the entropy estimator method. If S is not supplied, the function returns the mutual information between X and Y - see mutinformation.
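The S = NULL behaviour can be checked directly: the sketch below (assuming the infotheo package is installed) compares the two-argument call against mutinformation on the same discretized data.

```r
# Sketch: with S omitted (S = NULL), condinformation reduces to the
# plain mutual information I(X;Y), matching mutinformation.
library(infotheo)

data(USArrests)
dat <- discretize(USArrests)

i_cond <- condinformation(dat[, 1], dat[, 2])  # S not supplied
i_mut  <- mutinformation(dat[, 1], dat[, 2])

all.equal(i_cond, i_mut)  # the two values agree
```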
Usage

condinformation(X, Y, S=NULL, method="emp")
Arguments

X: a vector/factor denoting a random variable, or a data.frame denoting a random vector where columns contain variables/features and rows contain outcomes/samples.

Y: another random variable or random vector (vector/factor or data.frame).

S: the conditioning random variable or random vector (vector/factor or data.frame).

method: the name of the entropy estimator. The package implements four estimators: "emp", "mm", "shrink", "sg" (default: "emp") - see details. These estimators require discrete data values - see discretize.
"emp" : This estimator computes the entropy of the empirical probability distribution.
"mm" : This is the Miller-Madow asymptotic bias corrected empirical estimator.
"shrink" : This is a shrinkage estimate of the entropy of a Dirichlet probability distribution.
"sg" : This is the Schurmann-Grassberger estimate of the entropy of a Dirichlet probability distribution.
Value

condinformation returns the conditional mutual information, I(X;Y|S), in nats.
Author(s)

Patrick E. Meyer
References

Meyer, P. E. (2008). Information-Theoretic Variable Selection and Network Inference from Microarray Data. PhD thesis, Universite Libre de Bruxelles.

Cover, T. M. and Thomas, J. A. (1990). Elements of Information Theory. John Wiley, New York.
See Also

mutinformation, multiinformation, interinformation, natstobits
Examples

data(USArrests)
dat <- discretize(USArrests)
I <- condinformation(dat[,1], dat[,2], dat[,3], method="emp")
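Because the result is in nats, it can be converted to bits with natstobits (listed above under See Also). A short sketch, again assuming the infotheo package is installed:

```r
# Sketch: convert the nat-valued conditional mutual information to bits.
library(infotheo)

data(USArrests)
dat <- discretize(USArrests)

I_nats <- condinformation(dat[, 1], dat[, 2], dat[, 3], method = "emp")
I_bits <- natstobits(I_nats)

# natstobits divides by log(2): bits = nats / ln 2
all.equal(I_bits, I_nats / log(2))
```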