mutinfo    R Documentation
KNN Mutual Information Estimators.
Usage

mutinfo(X, Y, k = 10, direct = TRUE)
Arguments

X       an input data matrix.

Y       an input data matrix.

k       the maximum number of nearest neighbors to search. The default value is 10.

direct  whether to compute the mutual information directly (direct = TRUE) or indirectly via entropy estimates (direct = FALSE). The default is TRUE.
Details

The direct computation is based on the first estimator of A. Kraskov, H. Stögbauer and P. Grassberger (2004); the indirect computation is done via entropy estimates, i.e., I(X, Y) = H(X) + H(Y) - H(X, Y). The direct method has smaller bias and variance, but the indirect method is faster; see Evans (2008).
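As a minimal sketch of the indirect route under the identity above, the following assumes a companion KNN entropy estimator entropy() is available in the same package (an assumption; the estimator mutinfo uses internally may differ):

## Hedged sketch of the indirect computation I(X, Y) = H(X) + H(Y) - H(X, Y).
## The entropy() estimator and its k argument are assumptions.
library(FNN)   # package name is an assumption

set.seed(42)
X <- matrix(rnorm(500), ncol = 1)
Y <- matrix(rnorm(500), ncol = 1)
k <- 5

H_X  <- entropy(X, k = k)            # marginal entropy of X
H_Y  <- entropy(Y, k = k)            # marginal entropy of Y
H_XY <- entropy(cbind(X, Y), k = k)  # joint entropy of (X, Y)

## Applying the identity; for independent X and Y the result is close to zero.
H_X + H_Y - H_XY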
Value

For the direct method, a single mutual information estimate; for the indirect method, a vector of length k containing the mutual information estimates obtained using 1 to k nearest neighbors, respectively.
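A hedged usage sketch of both modes follows; the FNN package name is an assumption, and the reference value -0.5 * log(1 - rho^2) is the exact mutual information (in nats) of a bivariate Gaussian with correlation rho.

library(FNN)   # package name is an assumption

set.seed(1)
n   <- 1000
rho <- 0.6
x   <- rnorm(n)
X   <- matrix(x, ncol = 1)
Y   <- matrix(rho * x + sqrt(1 - rho^2) * rnorm(n), ncol = 1)

-0.5 * log(1 - rho^2)                  # theoretical value, about 0.223 nats

mutinfo(X, Y, k = 10, direct = TRUE)   # direct: a single estimate
mutinfo(X, Y, k = 10, direct = FALSE)  # indirect: a vector of 10 estimates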
Author(s)

Shengqiao Li. To report any bugs or suggestions please email: lishengqiao@yahoo.com
References

A. Kraskov, H. Stögbauer and P. Grassberger (2004). “Estimating mutual information”. Physical Review E, 69:066138, 1–16.

D. Evans (2008). “A computationally efficient estimator for mutual information”. Proc. R. Soc. A, 464, 1203–1215.