Description

View source: R/information.plugin.R

II.plugin measures the amount of information contained in a set of variables from their joint probability distribution table. The number of variables here is limited to three.
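For intuition, the plug-in estimate can be written directly in terms of entropies of the marginal, pairwise and full joint tables: under McGill's convention, II(X;Y;Z) = H(X,Y) + H(X,Z) + H(Y,Z) - H(X) - H(Y) - H(Z) - H(X,Y,Z). The helper below is a minimal sketch of that decomposition, not the package's implementation; II.sketch and its unit handling are invented for illustration, and since the sign convention for interaction information varies across references, its result may differ from II.plugin by a sign.

# Minimal sketch of a plug-in interaction information estimator (not the
# package code). `probs` is assumed to be a 3-dimensional joint probability
# array that sums to 1.
II.sketch <- function(probs, unit = c("log", "log2")) {
  unit <- match.arg(unit)
  logf <- if (unit == "log2") log2 else log
  H <- function(p) {            # plug-in Shannon entropy
    p <- p[p > 0]               # 0 * log(0) is treated as 0
    -sum(p * logf(p))
  }
  # marginal and pairwise tables come from summing out the other dimensions
  Hx   <- H(apply(probs, 1, sum))
  Hy   <- H(apply(probs, 2, sum))
  Hz   <- H(apply(probs, 3, sum))
  Hxy  <- H(apply(probs, c(1, 2), sum))
  Hxz  <- H(apply(probs, c(1, 3), sum))
  Hyz  <- H(apply(probs, c(2, 3), sum))
  Hxyz <- H(probs)
  Hxy + Hxz + Hyz - Hx - Hy - Hz - Hxyz
}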
Usage

II.plugin(probs, unit = c("log", "log2", "log10"))
Arguments

probs   the joint probability distribution table of three random variables.

unit    the base of the logarithm. The default is the natural logarithm ("log"). For evaluating entropy in bits, set unit to "log2".
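As an illustration of the unit argument, the same table evaluated in nats (the default "log") and in bits ("log2") differs only by a constant factor of log(2). The 2 x 2 x 2 array p below is made up for this illustration; it is not taken from the package.

# made-up 2 x 2 x 2 joint probability table with an XOR-like dependence
p <- array(0, dim = c(2, 2, 2))
p[1, 1, 1] <- p[1, 2, 2] <- p[2, 1, 2] <- p[2, 2, 1] <- 1/4

ii_nats <- II.plugin(p)                 # natural logarithm, the default unit
ii_bits <- II.plugin(p, unit = "log2")  # base-2 logarithm
all.equal(ii_nats / log(2), ii_bits)    # results differ only by the factor log(2)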
Value

II.plugin returns the interaction information.
References

McGill, W. J. (1954). Multivariate information transmission. Psychometrika, 19(2), 97-116.
Examples

# three numeric vectors corresponding to three continuous random variables
x <- c(0.0, 0.2, 0.2, 0.7, 0.9, 0.9, 0.9, 0.9, 1.0)
y <- c(1.0, 2.0, 12, 8.0, 1.0, 9.0, 0.0, 3.0, 9.0)
z <- c(3.0, 7.0, 2.0, 11, 10, 10, 14, 2.0, 11)
# corresponding joint count table estimated by "uniform width" algorithm
count_xyz <- discretize3D(x, y, z, "uniform_width")
# the joint probability distribution table of the count data
library("entropy")
probs_xyz <- freqs.empirical(count_xyz)
# corresponding interaction information
II.plugin(probs_xyz)
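# the same interaction information expressed in bits by setting the
# documented unit argument (the value above is in nats by default)
II.plugin(probs_xyz, unit = "log2")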