mutual: Mutual information of two random variables

View source: R/mutual.R

Description

Computes the mutual information of two random variables X and Y, given either their joint distribution as a 2D probability mass function (a matrix) or two vectors of sampled (x, y) values.
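
For reference, the mutual information in base b of a normalized joint probability mass function p(i,j), with marginals p_x(i) and p_y(j), is I(X,Y) = sum_{i,j} p(i,j)*log_b(p(i,j)/(p_x(i)*p_y(j))), with the convention 0*log(0) = 0.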

Usage

mutual(x, y = NULL, b = exp(1), n = NULL, xlim = NULL, ylim = NULL)

Arguments

x

one of the following: (1) an m-by-n matrix representing the 2D probability mass function of the two random variables X and Y; all elements must be non-negative, and the overall normalization is irrelevant; or (2) a vector of sampled x-values, in which case y must also be specified.

y

optional vector of sampled y-values (only used if x is a vector of x-values).

b

base of the logarithm used in the mutual information I(X,Y). The default is exp(1), i.e. natural logarithms.

n

scalar or 2-element vector specifying the number of equally spaced grid cells. Only used if x and y are vectors (see the example after this list). If not provided, the default is n = 0.2*sqrt(length(x)), bounded between 2 and 1000. Note that the scaling n ~ sqrt(length(x)) keeps the mutual information approximately constant for random data sets of different size.

xlim

2-element vector specifying the x-range (data cropped if necessary). Only used if x and y are vectors. If not given, xlim is set to the range of x.

ylim

2-element vector specifying the y-range (data cropped if necessary). Only used if x and y are vectors. If not given, ylim is set to the range of y.
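
A minimal usage sketch of the two input modes, assuming the cooltools package is installed; the small PMF matrix and the sampled data below are purely illustrative:

library(cooltools)

# Mode 1: joint PMF given as a matrix (overall normalization is irrelevant)
p = matrix(c(4, 1,
             1, 4), nrow = 2, byrow = TRUE)
mutual(p)

# Mode 2: paired sample vectors, gridded onto cells internally
set.seed(1)
x = rnorm(1e4)
y = x + rnorm(1e4)   # correlated with x
mutual(x, y)         # grid size defaults to about 0.2*sqrt(length(x))
mutual(x, y, n = 20) # or fix the grid size explicitly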

Value

Returns a list of items:

I

standard mutual information I(X,Y).

N

normalized mutual information I(X,Y)/sqrt(H(X)*H(Y)), where H is the Shannon information entropy.
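
As a sanity check of these return values (expected from the definitions above, not verified output): a perfectly correlated PMF such as diag(2) gives I = H(X) = H(Y), hence N = 1, while any product PMF factorizes and gives I = N = 0.

library(cooltools)

# Perfectly correlated: X determines Y, so I = H(X) = H(Y) and N = 1
mutual(diag(2))   # expect I = log(2) ~ 0.693 (base e) and N = 1

# Independent: joint PMF factorizes, so I = 0 and N = 0
mutual(outer(c(1, 2), c(3, 4)))   # expect I = 0 and N = 0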

Author(s)

Danail Obreschkow

