View source: R/entropyFunctions.r
The mutual information between two high-dimensional multivariate random variables is estimated from two (high-dimensional) data matrices under a normality or k-NN distributional assumption.
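Under the normality assumption, the mutual information is a function of the joint covariance matrix of the two variables. The following sketch illustrates that estimator in the low-dimensional case; it is an illustration only, not the package's implementation, which must handle the high-dimensional setting (where the sample covariance matrix is singular), e.g. via regularized covariance estimates.

```r
# Illustration only: Gaussian mutual information (in nats) between the
# columns of Y and the columns of X. Assumes nrow(Y) > ncol(Y) + ncol(X)
# so the joint sample covariance matrix is non-singular; the
# high-dimensional case would need a regularized covariance estimate.
gaussMI <- function(Y, X) {
  S  <- cov(cbind(Y, X))                     # joint sample covariance
  iY <- seq_len(ncol(Y))                     # indices of the Y-block
  detYY <- det(S[iY, iY, drop = FALSE])      # det of cov(Y)
  detXX <- det(S[-iY, -iY, drop = FALSE])    # det of cov(X)
  0.5 * log(detYY * detXX / det(S))
}

# Toy call: for independent Y and X the estimate is close to zero,
# and it grows with the dependence between the two matrices.
gaussMI(matrix(rnorm(200), nrow = 50), matrix(rnorm(100), nrow = 50))
```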
Usage

hdMI(Y, X, method, k, center, rescale)
Arguments

Y: (High-dimensional) matrix. Rows are assumed to represent the samples; columns represent the samples' genes or traits.

X: (High-dimensional) matrix. Rows are assumed to represent the samples; columns represent the samples' genes or traits. The number of rows of X must equal the number of rows of Y.

method: Distributional assumption under which the mutual information is to be estimated, either "normal" or "knn".

k: k-nearest-neighbor parameter, used when method="knn".

center: Logical indicator: should the columns (traits) of Y and X be centered around zero?

rescale: Logical indicator: should Y and X be rescaled to unit scale?
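The effect of the center and rescale options can be mimicked with base R's scale(); this is a hypothetical illustration of the preprocessing they describe, not code taken from the package source.

```r
# Hypothetical illustration of the 'center' and 'rescale' arguments;
# the package's actual preprocessing may differ in detail.
Y  <- matrix(rnorm(20), nrow = 5)             # toy 5-sample, 4-trait matrix
Yc <- scale(Y, center = TRUE, scale = FALSE)  # center = TRUE: zero-mean columns
Ys <- scale(Y, center = TRUE, scale = TRUE)   # rescale = TRUE: unit-sd columns
```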
Value

The mutual information estimate is returned as a numeric.
Author(s)

Wessel N. van Wieringen: w.vanwieringen@vumc.nl
References

Van Wieringen, W.N., Van der Vaart, A.W. (2011), "Statistical analysis of the cancer cell's molecular entropy using high-throughput data", Bioinformatics, 27(4), 556-563.
Examples

data(pollackCN16)
data(pollackGE16)
hdMI(t(exprs(pollackGE16)), t(copynumber(pollackCN16)), method="knn")