Description
Calculate mutual information for two or three categorical variables.
Arguments

x, y, z: Vectors of class character or factor.
smooth: Additional cell counts for Bayesian estimation of probabilities.
log_base: The base of the logarithm to be used.
Details

The mutual information for two variables is calculated by the formula
MI(x, y) = ∑ P(x, y) log(P(x, y) / (P(x)P(y)))
where the sum is over all possible values of x and y.
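The two-variable formula can be sketched directly from a smoothed contingency table. This is an illustration only, under the assumption that probabilities are estimated as smoothed relative frequencies; `mi2_sketch` is a hypothetical helper, not the package's MI2 implementation.

```r
# Sketch of the two-variable formula; 'mi2_sketch' is hypothetical, not the
# package's MI2. A positive 'smooth' is added to every cell of the
# contingency table, which also avoids log(0) for empty cells.
mi2_sketch <- function(x, y, smooth = 0, log_base = 2) {
  joint <- table(x, y) + smooth        # smoothed cell counts
  p_xy  <- joint / sum(joint)          # joint probabilities P(x, y)
  p_x   <- rowSums(p_xy)               # marginal P(x)
  p_y   <- colSums(p_xy)               # marginal P(y)
  ratio <- p_xy / outer(p_x, p_y)      # P(x, y) / (P(x) P(y))
  sum(p_xy * log(ratio, base = log_base))
}

x <- c("a", "a", "b", "b")
y <- c("1", "1", "2", "2")
mi2_sketch(x, y, smooth = 1)
```

For independent variables the sum is 0, since every ratio P(x, y) / (P(x)P(y)) equals 1.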
The mutual information for three variables is calculated by the formula
MI(x, y, z) = ∑ P(x, y, z) log(P(x, y, z) / (P(x)P(y)P(z)))
where the sum is over all possible values of x, y and z.
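The three-variable case extends the same idea: compare the joint distribution with the product of the three marginals. Again a sketch under the same smoothed-frequency assumption; `mi3_sketch` is hypothetical, not the package's MI3.

```r
# Sketch of the three-variable formula; 'mi3_sketch' is hypothetical, not
# the package's MI3. smooth > 0 keeps all cells positive.
mi3_sketch <- function(x, y, z, smooth = 0, log_base = 2) {
  joint <- table(x, y, z) + smooth
  p     <- joint / sum(joint)            # joint probabilities P(x, y, z)
  p_x   <- apply(p, 1, sum)              # marginal P(x)
  p_y   <- apply(p, 2, sum)              # marginal P(y)
  p_z   <- apply(p, 3, sum)              # marginal P(z)
  denom <- outer(outer(p_x, p_y), p_z)   # P(x)P(y)P(z), same dims as p
  sum(p * log(p / denom, base = log_base))
}
```

For three mutually independent variables the value is 0; any dependence among them makes it positive.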
Value

A single numeric value giving the mutual information.
Author(s)

Katrine Kirkeby, enir_tak@hotmail.com
Maria Knudsen, mariaknudsen@hotmail.dk
Ninna Vihrs, ninnavihrs@hotmail.dk
See Also

MIk for mutual information for k variables.
Examples

var1 <- c(sample(c(1, 2), 100, replace = TRUE))
var2 <- var1 + c(sample(c(1, 2), 100, replace = TRUE))
var3 <- c(sample(c(1, 2), 100, replace = TRUE))
var1 <- as.character(var1)
var2 <- as.character(var2)
var3 <- as.character(var3)
MI2(var1, var2, smooth = 1)
MI2(var1, var2, smooth = 0.1, log_base = exp(1))
MI3(var1, var2, var3, smooth = 1)
MI3(var1, var2, var3, smooth = 0.1, log_base = exp(1))