This function empirically estimates the mutual information between two discrete variables from a table of counts, using the observed frequencies.
a table of counts.
a character string determining whether the mutual information should be normalized.
The mutual information is estimated from the observed frequencies with a plug-in estimator based on entropy. The plug-in estimator is I(X, Y) = H(X) + H(Y) - H(X, Y), where H(.) is the entropy estimated from the observed frequencies.
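The plug-in estimator above can be sketched as follows. This is an illustrative Python translation, not the package's own implementation; the function names `entropy` and `mi_plugin` are assumptions for this sketch. Entropies are computed in nats from the empirical joint and marginal frequencies of the count table.

```python
import math

def entropy(freqs):
    # Plug-in (empirical) entropy in nats; zero-probability cells are skipped.
    return -sum(p * math.log(p) for p in freqs if p > 0)

def mi_plugin(counts):
    """Plug-in mutual information estimate from a 2-D table of counts.

    counts: list of lists (rows = levels of X, columns = levels of Y).
    Implements I(X, Y) = H(X) + H(Y) - H(X, Y) with observed frequencies.
    """
    n = sum(sum(row) for row in counts)
    joint = [c / n for row in counts for c in row]      # joint frequencies
    px = [sum(row) / n for row in counts]               # marginal of X
    py = [sum(col) / n for col in zip(*counts)]         # marginal of Y
    return entropy(px) + entropy(py) - entropy(joint)

# A table with independent margins gives zero MI;
# a diagonal 2x2 table gives log(2) nats.
print(round(mi_plugin([[10, 10], [10, 10]]), 6))  # 0.0
print(round(mi_plugin([[10, 0], [0, 10]]), 6))    # 0.693147
```

Because the plug-in estimator simply substitutes observed frequencies for the true probabilities, it is biased upward for small samples, which is one motivation for the optional normalization.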
Mutual information estimate.
Cover, Thomas M., and Joy A. Thomas. (2012). "Elements of Information Theory". John Wiley & Sons.