hdMI: Mutual information estimation.


View source: R/entropyFunctions.r

Description

The mutual information between two high-dimensional multivariate random variables is estimated from two (high-dimensional) data matrices, under either a normality or a k-nearest-neighbor (k-NN) distributional assumption.
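Under joint normality, mutual information has a closed form in terms of covariance determinants: I(Y; X) = 0.5 * log(det(S_Y) * det(S_X) / det(S)), where S is the joint covariance matrix. The sketch below (the function name `gaussMI` is hypothetical, not part of sigaR; the actual implementation lives in R/entropyFunctions.r and handles the high-dimensional, rank-deficient case) illustrates this textbook estimator for the low-dimensional setting:

```r
# Hypothetical illustration of Gaussian mutual information estimation;
# valid only when the covariance matrices are well-conditioned (n > p).
gaussMI <- function(Y, X) {
  S <- cov(cbind(Y, X))   # joint covariance of (Y, X)
  0.5 * log(det(cov(Y)) * det(cov(X)) / det(S))
}

set.seed(1)
Y <- matrix(rnorm(100), nrow = 50)               # 50 samples, 2 traits
X <- Y + matrix(rnorm(100, sd = 0.5), nrow = 50) # dependent on Y
gaussMI(Y, X)   # positive: Y and X share information
```

By Fischer's inequality det(S) <= det(S_Y) * det(S_X), so this estimate is always non-negative, and it grows as the dependence between Y and X strengthens.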

Usage

hdMI(Y, X, method = "normal", k = 1, center = TRUE, rescale = TRUE)

Arguments

Y

(High-dimensional) matrix. Rows are assumed to represent the samples, and columns represent the samples' genes or traits.

X

(High-dimensional) matrix. Rows are assumed to represent the samples, and columns represent the samples' genes or traits. The number of rows of X must be identical to that of Y.

method

Distributional assumption under which the mutual information is estimated: either "normal" (multivariate normality) or "knn" (k-nearest neighbors).

k

The k-nearest-neighbor parameter. Applied only under the k-NN assumption.

center

Logical indicator: should the columns (traits) of Y and X be centered at zero? Applied only under the normality assumption.

rescale

Logical indicator: should Y and X be rescaled to have the same scale? Applied only under the k-NN assumption.
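The two preprocessing options roughly correspond to the base-R operations below (a sketch only; the exact preprocessing is defined in R/entropyFunctions.r and may differ, e.g. in how the common scale is computed):

```r
set.seed(1)
Y <- matrix(rnorm(60, mean = 5), nrow = 20)          # 20 samples, 3 traits
X <- matrix(rnorm(60, mean = 2, sd = 4), nrow = 20)

# center = TRUE (normal method): center each column (trait) at zero
Yc <- scale(Y, center = TRUE, scale = FALSE)
Xc <- scale(X, center = TRUE, scale = FALSE)

# rescale = TRUE (knn method): put Y and X on a common scale,
# here by dividing each matrix by its overall standard deviation
Yr <- Y / sd(as.vector(Y))
Xr <- X / sd(as.vector(X))
```

Rescaling matters for the k-NN estimator because nearest neighbors are found via distances, which would otherwise be dominated by whichever matrix has the larger spread.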

Value

The mutual information estimate is returned as a numeric.

Author(s)

Wessel N. van Wieringen: w.vanwieringen@vumc.nl

References

Van Wieringen, W.N., Van der Vaart, A.W. (2011), "Statistical analysis of the cancer cell's molecular entropy using high-throughput data", Bioinformatics, 27(4), 556-563.

See Also

mutInfTest.

Examples

data(pollackCN16)
data(pollackGE16)
hdMI(t(exprs(pollackGE16)), t(copynumber(pollackCN16)), method="knn")

sigaR documentation built on April 28, 2020, 6:05 p.m.