CMI.plugin: A plug-in calculator for evaluating conditional mutual information

View source: R/information.plugin.R

CMI.plugin    R Documentation

A plug-in calculator for evaluating conditional mutual information

Description

CMI.plugin measures the expected mutual information between two random variables conditioned on a third, computed from the joint probability distribution table of the three variables.
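
As an illustration only (a minimal sketch, not the package's internal implementation), the plug-in estimate can be written directly in base R from a three-dimensional joint probability array; the function name cmi_plugin_sketch is hypothetical:

# Sketch of the plug-in formula
# I(X;Y|Z) = sum over x,y,z of p(x,y,z) * log( p(x,y,z) * p(z) / (p(x,z) * p(y,z)) ),
# computed from a 3-dimensional joint probability array `probs`.
cmi_plugin_sketch <- function(probs) {
  p_z  <- apply(probs, 3, sum)         # marginal p(z)
  p_xz <- apply(probs, c(1, 3), sum)   # marginal p(x, z)
  p_yz <- apply(probs, c(2, 3), sum)   # marginal p(y, z)
  total <- 0
  for (i in seq_len(dim(probs)[1])) {
    for (j in seq_len(dim(probs)[2])) {
      for (k in seq_len(dim(probs)[3])) {
        p <- probs[i, j, k]
        if (p > 0) {                   # convention: 0 * log(0) = 0
          total <- total + p * log(p * p_z[k] / (p_xz[i, k] * p_yz[j, k]))
        }
      }
    }
  }
  total
}

The guard p > 0 applies the convention 0 log 0 = 0, so empty cells of the table contribute nothing to the sum.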

Usage

CMI.plugin(probs, unit = c("log", "log2", "log10"))

Arguments

probs

the joint probability distribution table of three random variables.

unit

the base of the logarithm. The default, "log", uses the natural logarithm, so the result is in nats. To report the result in bits, set unit to "log2" (see the final call in the Examples).

Value

CMI.plugin returns the plug-in estimate of the conditional mutual information, in the unit selected by unit.

References

Wyner, A. D. (1978). A definition of conditional mutual information for arbitrary ensembles. Information and Control, 38(1), 51-59.

Examples

# three numeric vectors corresponding to three continuous random variables
x <- c(0.0, 0.2, 0.2, 0.7, 0.9, 0.9, 0.9, 0.9, 1.0)
y <- c(1.0, 2.0,  12, 8.0, 1.0, 9.0, 0.0, 3.0, 9.0)
z <- c(3.0, 7.0, 2.0,  11,  10,  10,  14, 2.0,  11)

# the corresponding joint count table estimated by the "uniform width" algorithm
count_xyz <- discretize3D(x, y, z, "uniform_width")

# the joint probability distribution table of the count data
library("entropy")
probs_xyz <- freqs.empirical(count_xyz)

# corresponding conditional mutual information
CMI.plugin(probs_xyz)
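
# the same quantity reported in bits; unit = "log2" is one of the
# documented options in Usage above
CMI.plugin(probs_xyz, unit = "log2")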
