interinformation: interaction information computation


Description

interinformation takes a dataset as input and computes the interaction information among the random variables in the dataset, using the entropy estimator specified by method. This measure is also called synergy or complementarity.

Usage

interinformation(X, method="emp")

Arguments

X

data.frame denoting a random vector where columns contain variables/features and rows contain outcomes/samples.

method

The name of the entropy estimator. The package implements four estimators: "emp", "mm", "shrink" and "sg" (default: "emp") - see Details. These estimators require discrete data values - see discretize.

Details

"emp" : This estimator computes the entropy of the empirical probability distribution.

"mm" : This is the Miller-Madow asymptotic bias-corrected empirical estimator.

"shrink" : This is a shrinkage estimate of the entropy of a Dirichlet probability distribution.

"sg" : This is the Schurmann-Grassberger estimate of the entropy of a Dirichlet probability distribution.
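
For intuition, with three variables the interaction information can be read (in McGill's convention) as the change in mutual information between two variables when conditioning on the third: II(X;Y;Z) = I(X;Y|Z) - I(X;Y). Below is a minimal sketch of this identity using condinformation and mutinformation (see See Also), assuming the package is attached; sign conventions differ across references, so the value returned by interinformation may differ in sign from this quantity.

  ## three-variable illustration on discretized USArrests columns
  data(USArrests)
  d <- discretize(USArrests)
  x <- d$Murder; y <- d$Assault; z <- d$UrbanPop
  ## McGill's convention: II(X;Y;Z) = I(X;Y|Z) - I(X;Y)
  ii3 <- condinformation(x, y, z, method = "emp") - mutinformation(x, y, method = "emp")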

Value

interinformation returns the interaction information (also called synergy or complementarity), in nats, among the random variables (columns of the data.frame).

Author(s)

Patrick E. Meyer

References

Meyer, P. E. (2008). Information-Theoretic Variable Selection and Network Inference from Microarray Data. PhD thesis, Université Libre de Bruxelles.

Jakulin, A. and Bratko, I. (2004). Testing the significance of attribute interactions. In Proceedings of the 21st International Conference on Machine Learning (ICML).

McGill, W. J. (1954). Multivariate information transmission. Psychometrika, 19.

See Also

condinformation, multiinformation, mutinformation, natstobits

Examples

  data(USArrests)
  ## discretize the continuous variables before estimating entropies
  dat <- discretize(USArrests)
  ## interaction information (in nats) among the four discretized variables
  ii <- interinformation(dat, method = "sg")
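
The returned value is in nats; as a small follow-up sketch (assuming the example above has been run), it can be converted to bits with natstobits from See Also:

  ## convert the interaction information from nats to bits
  ii_bits <- natstobits(ii)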
