mi-data: Empirical Estimate of the Mutual Information from a Table of Counts

mi.data    R Documentation

Empirical Estimate of the Mutual Information from a Table of Counts

Description

This function estimates the mutual information from observed data.

Usage

mi.data(X, Y, discretization.method=NULL, k=NULL)

Arguments

X

a data frame containing only numeric or continuous variables.

Y

a data frame containing only numeric or continuous variables.

discretization.method

a character vector giving the discretization method to use. See discretization.

k

the number of nearest neighbours to use; in the case of a purely continuous dataset, the mutual information can be computed with the k-nearest neighbours estimator.

Details

The mutual information is estimated from the observed frequencies through a plug-in estimator based on entropy, or, when the data frame is made exclusively of continuous variables, using the estimator described in Kraskov, Stögbauer and Grassberger (2004).

The plug-in estimator is I(X, Y) = H(X) + H(Y) - H(X, Y), where H() is the entropy computed with entropy.data.
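For illustration, a minimal sketch of such a plug-in estimator in plain R. The helper name plugin.mi and the binning via hist are illustrative assumptions, not the varrank implementation (which discretizes with discretization and computes entropies with entropy.data):

# Illustrative plug-in estimator: bin two numeric vectors, tabulate joint counts,
# and apply I(X, Y) = H(X) + H(Y) - H(X, Y) with empirical entropies (in nats).
plugin.mi <- function(x, y, breaks = "Sturges") {
  xb <- cut(x, breaks = hist(x, breaks = breaks, plot = FALSE)$breaks, include.lowest = TRUE)
  yb <- cut(y, breaks = hist(y, breaks = breaks, plot = FALSE)$breaks, include.lowest = TRUE)
  joint <- table(xb, yb) / length(x)                    # empirical joint distribution
  H <- function(p) { p <- p[p > 0]; -sum(p * log(p)) }  # empirical (maximum-likelihood) entropy
  H(rowSums(joint)) + H(colSums(joint)) - H(joint)      # I(X, Y) = H(X) + H(Y) - H(X, Y)
}

A call such as plugin.mi(X, Y) should give a value of the same order as mi.data with discretization.method = "sturges", although the exact binning, and hence the estimate, may differ.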

Value

Mutual information estimate.

Author(s)

Gilles Kratzer

References

Kraskov, A., Stögbauer, H. and Grassberger, P. (2004) Estimating mutual information. Physical Review E, 69:066138, 1–16.

Examples

library(varrank)

Y <- rnorm(n = 100, mean = 0, sd = 2)
X <- rnorm(n = 100, mean = 5, sd = 2)

mi.data(X = Y, Y = X, discretization.method = "sturges")
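When the data are treated as purely continuous, the k-nearest neighbours estimator of Kraskov et al. (2004) can be requested instead of a discretization method; a hedged sketch, assuming the same vector inputs are accepted when k is supplied in place of discretization.method:

mi.data(X = Y, Y = X, k = 5)  # k = 5 is an illustrative choice of nearest neighbours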
