View source: R/get_mutual_information.R
get_mutual_information (R Documentation)
This function calculates the mutual information between two variables from their joint distribution. Mutual information measures how much information observing one variable provides about the other.
get_mutual_information(table)
table: A numeric matrix or table; a contingency (frequency) table of two variables.
The mutual information is calculated using the formula \( I(X, Y) = H(X) + H(Y) - H(X, Y) \), where:
\( H(X) \) is the entropy of variable X,
\( H(Y) \) is the entropy of variable Y, and
\( H(X, Y) \) is the joint entropy of X and Y.
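The formula above can be sketched directly in R. This is a minimal illustration, not the package's actual implementation (the source in R/get_mutual_information.R may differ, e.g. in log base or zero-cell handling); `entropy` and `mi_sketch` are hypothetical helper names introduced here.

```r
# Entropy of a probability vector, in nats (natural log).
entropy <- function(p) {
  p <- p[p > 0]        # drop zero cells: 0 * log(0) is treated as 0
  -sum(p * log(p))
}

# Mutual information from a contingency table via
# I(X, Y) = H(X) + H(Y) - H(X, Y).
mi_sketch <- function(tab) {
  p_xy <- tab / sum(tab)   # joint distribution from the frequency table
  p_x  <- rowSums(p_xy)    # marginal distribution of X
  p_y  <- colSums(p_xy)    # marginal distribution of Y
  entropy(p_x) + entropy(p_y) - entropy(p_xy)
}

pair_table <- table(c(1, 2, 2, 3), c(1, 1, 2, 2))
mi_sketch(pair_table)      # log(2)/2, approximately 0.3466 nats
```

For this table the four observed cells each have probability 1/4, so H(X, Y) = log 4, and the marginals give I(X, Y) = log(2)/2.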
A numeric value representing the mutual information between the two variables.
# Example usage with a simple contingency table:
pair_table <- table(c(1, 2, 2, 3), c(1, 1, 2, 2))
get_mutual_information(pair_table)