View source: R/information.plugin.R
Description

MI.plugin measures the mutual information between two random variables from their joint probability distribution table.
Usage

MI.plugin(probs, unit = c("log", "log2", "log10"))
Arguments

probs    the joint probability distribution table of the two random variables.

unit     the base of the logarithm. The default is the natural logarithm ("log"). To evaluate entropy in bits, set unit to "log2".
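Since the default unit is the natural logarithm, results are reported in nats; a value in nats can be rescaled to bits by dividing by log(2). A small illustrative sketch (the variable names are ours, and the numeric value is the one printed in the Examples section):

```r
# convert a mutual information value from nats (unit = "log") to bits
mi_nats <- 0.05044741       # value printed in the Examples section
mi_bits <- mi_nats / log(2) # same quantity expressed in bits
mi_bits
```

Equivalently, calling MI.plugin with unit = "log2" yields the value in bits directly.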
Value

MI.plugin returns the mutual information of the two random variables.
Examples

# load the package providing discretize2D() and MI.plugin()
library("Informeasure")

# two numeric vectors corresponding to two continuous random variables
x <- c(0.0, 0.2, 0.2, 0.7, 0.9, 0.9, 0.9, 0.9, 1.0)
y <- c(1.0, 2.0, 12, 8.0, 1.0, 9.0, 0.0, 3.0, 9.0)
# corresponding joint count table estimated by "uniform width" algorithm
count_xy <- discretize2D(x, y, "uniform_width")
# the joint probability distribution table of the count data
library("entropy")
probs_xy <- freqs.empirical(count_xy)
# corresponding mutual information
MI.plugin(probs_xy)
[1] 0.05044741
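The plugin estimate can also be computed directly from a joint probability table. The following sketch is an independent re-implementation for illustration, not the package's source; the function name mi_plugin is ours:

```r
# plugin mutual information from a joint probability table:
# MI = sum over cells of p(x, y) * log( p(x, y) / (p(x) * p(y)) )
mi_plugin <- function(probs, unit = "log") {
  log_fn <- switch(unit, log = log, log2 = log2, log10 = log10)
  px <- rowSums(probs)  # marginal distribution of X
  py <- colSums(probs)  # marginal distribution of Y
  mi <- 0
  for (i in seq_len(nrow(probs))) {
    for (j in seq_len(ncol(probs))) {
      p <- probs[i, j]
      # zero-probability cells contribute nothing to the sum
      if (p > 0) mi <- mi + p * log_fn(p / (px[i] * py[j]))
    }
  }
  mi
}

# sanity checks on known cases:
# independent uniform variables -> MI = 0
mi_plugin(matrix(0.25, 2, 2))
# perfectly dependent variables -> MI = 1 bit with unit = "log2"
mi_plugin(diag(2) / 2, unit = "log2")
```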