get_mutual_information: Calculate Mutual Information

View source: R/get_mutual_information.R

get_mutual_information {covalchemy}    R Documentation

Calculate Mutual Information

Description

This function calculates the mutual information between two variables based on their joint distribution. Mutual information measures the amount of information obtained about one variable through the other.

Usage

get_mutual_information(table)

Arguments

table

A numeric matrix or table. A contingency table or frequency table of two variables.

Details

The mutual information is calculated using the formula

  I(X, Y) = H(X) + H(Y) - H(X, Y)

where:

  • H(X) is the entropy of variable X,

  • H(Y) is the entropy of variable Y, and

  • H(X, Y) is the joint entropy of X and Y.
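The identity above can be checked by hand in base R. The sketch below (not part of the package; `entropy` and `manual_mutual_information` are illustrative helper names, and natural-log entropies are assumed, whereas the package may use a different log base) normalizes a contingency table into a joint distribution and combines the marginal and joint entropies:

```r
# Entropy of a probability vector, in nats (natural log assumed).
entropy <- function(p) {
  p <- p[p > 0]                # drop zero cells so 0 * log(0) contributes 0
  -sum(p * log(p))
}

# Hypothetical helper: mutual information via I(X, Y) = H(X) + H(Y) - H(X, Y).
manual_mutual_information <- function(tab) {
  joint <- tab / sum(tab)            # joint distribution p(x, y)
  h_x  <- entropy(rowSums(joint))    # marginal entropy H(X)
  h_y  <- entropy(colSums(joint))    # marginal entropy H(Y)
  h_xy <- entropy(as.vector(joint))  # joint entropy H(X, Y)
  h_x + h_y - h_xy
}

pair_table <- table(c(1, 2, 2, 3), c(1, 1, 2, 2))
manual_mutual_information(pair_table)  # 0.5 * log(2), about 0.347 nats
```

For this table the four observed pairs each have probability 0.25, so H(X, Y) = log(4), H(Y) = log(2), and H(X) ≈ 1.040, giving a mutual information of 0.5 * log(2).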

Value

A numeric value representing the mutual information between the two variables.

Examples

# Example usage with a simple contingency table:
pair_table <- table(c(1, 2, 2, 3), c(1, 1, 2, 2))
get_mutual_information(pair_table)


covalchemy documentation built on April 12, 2025, 2:15 a.m.