mi: kNN Mutual Information Estimators

View source: R/mi.R

Description

Estimate mutual information based on the distribution of nearest neighbor distances. The kNN method is described by Kraskov et al. (2004).
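
The sketch below, in base R only, illustrates the idea behind Kraskov et al.'s "estimator 1" for two one-dimensional samples. It is not the package's internal implementation; the function name ksg_mi, the use of the maximum norm on unscaled data, and the simple tie handling are illustrative assumptions.

# Illustrative sketch of the Kraskov et al. (2004) kNN estimator ("estimator 1")
# for two 1-D samples; mi() itself may differ in implementation details.
ksg_mi <- function(x, y, k = 5) {
  n <- length(x)
  dx <- abs(outer(x, x, "-"))                   # pairwise distances in the x marginal
  dy <- abs(outer(y, y, "-"))                   # pairwise distances in the y marginal
  dz <- pmax(dx, dy)                            # max-norm distances in the joint space
  diag(dz) <- Inf                               # exclude self-distances
  eps <- apply(dz, 1, function(d) sort(d)[k])   # distance to the k-th joint neighbor
  nx <- rowSums(sweep(dx, 1, eps, "<")) - 1     # x-neighbors strictly within eps
  ny <- rowSums(sweep(dy, 1, eps, "<")) - 1     # y-neighbors strictly within eps
  digamma(k) + digamma(n) - mean(digamma(nx + 1) + digamma(ny + 1))
}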

Usage

mi(x, y, k = 5, distance = FALSE)

Arguments

x

A numeric vector, matrix, data.frame or dist object.

y

A numeric vector, matrix, data.frame or dist object.

k

Order of the neighborhood to be used in the kNN method, i.e., the number of nearest neighbors k.

distance

Logical flag indicating whether x and y should be treated as distance matrices. If distance = TRUE, x and y are interpreted as distance matrices; otherwise, they are treated as data and the Euclidean distance is computed for the samples in x and y. Default: distance = FALSE.

Details

If two samples are passed to the arguments x and y, their sample sizes (i.e., the number of rows of a matrix or the length of a vector) must agree. Moreover, the data passed to x and y must not contain missing or infinite values.
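
As a minimal pre-check sketch, one could verify these requirements before calling mi; the helper name check_mi_input is hypothetical and not part of the package.

# Hypothetical input check, not part of fastmit: equal sample sizes and no
# missing or infinite values (covers the vector / matrix / data.frame case).
check_mi_input <- function(x, y) {
  stopifnot(NROW(x) == NROW(y))
  stopifnot(all(is.finite(as.matrix(x))), all(is.finite(as.matrix(y))))
  invisible(TRUE)
}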

Value

mi

The estimated mutual information.

References

Kraskov, A., Stögbauer, H., & Grassberger, P. (2004). Estimating mutual information. Physical Review E, 69(6), 066138.

Examples

library(fastmit)

## Estimate mutual information from the raw samples; Euclidean distances
## are computed internally.
set.seed(1)
x <- rnorm(100)
y <- x + rnorm(100)
mi(x, y, k = 5, distance = FALSE)

## Estimate mutual information from pre-computed distance matrices.
set.seed(1)
x <- rnorm(100)
y <- 100 * x + rnorm(100)
distx <- dist(x)
disty <- dist(y)
mi(distx, disty, k = 5, distance = TRUE)
