mutualInformation                R Documentation

Description

Calculates the mutual information for a two-way table of observed counts or a joint probability distribution. The mutual information is a measure of association between two random variables.
Usage

mutualInformation(table)
Arguments

table
    A two-way table or probability distribution, possibly the output of CPF.
Details

This is essentially the Kullback–Leibler divergence between the joint probability distribution and the distribution obtained by assuming the marginal distributions are independent. It is given by the following formula:

I[X;Y] = \sum_{x}\sum_{y} \Pr(X=x, Y=y) \log \frac{\Pr(X=x, Y=y)}{\Pr(X=x)\Pr(Y=y)}
For square matrices, the maximum mutual information, which corresponds to a diagonal matrix, is log(d), where d is the dimension of the matrix.
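The formula above can be sketched directly in a few lines of R. This is a simplified illustration of the computation, not the package's implementation (the helper name `mi_sketch` is hypothetical): normalize the table to a joint distribution, form the independence product of the marginals, and sum p * log(p / (px * py)) over cells with positive probability.

```r
## Sketch of the mutual information formula (illustrative, not the
## package implementation).
mi_sketch <- function(tab) {
  p <- tab / sum(tab)                     # joint distribution
  indep <- outer(rowSums(p), colSums(p))  # product of the marginals
  pos <- p > 0                            # treat 0 * log(0) as 0
  sum(p[pos] * log(p[pos] / indep[pos]))
}

mi_sketch(diag(3))          # diagonal 3x3 table: attains log(3)
mi_sketch(matrix(1, 2, 2))  # independent margins: 0
```

The diagonal case illustrates the maximum noted above: a d-by-d diagonal table concentrates all mass on d cells of probability 1/d, giving sum over d cells of (1/d) log(d), i.e. log(d).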
Author(s)

Russell Almond
References

https://en.wikipedia.org/wiki/Mutual_information

Shannon, C. E. (1948). "A Mathematical Theory of Communication." Bell System Technical Journal, 27, 379-423, 623-656.
See Also

table, CPF, ewoe.CPF, expTable
Examples

## UCBAdmissions is a three-way table, so we need to
## make it a two-way table.
mutualInformation(apply(UCBAdmissions,c(1,2),sum))
apply(UCBAdmissions,3,mutualInformation)
apply(UCBAdmissions,2,mutualInformation)
apply(UCBAdmissions,1,mutualInformation)
print(c(mutualInformation(matrix(1,2,2)),0))
print(c(mutualInformation(diag(2)),
mutualInformation(1-diag(2)), log(2)))
print(c(mutualInformation(diag(3)),log(3)))