PMI.plugin: A plug-in calculator for evaluating the part mutual information


View source: R/information.plugin.R

Description

PMI.plugin measures the non-linear direct dependencies between two random variables conditioned on a third one, from the joint probability distribution table.
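For reference, Zhao et al. (2016) (see References) define PMI in terms of modified conditional probabilities p*; in LaTeX notation, paraphrasing the paper rather than the package documentation:

PMI(X;Y|Z) = \sum_{x,y,z} p(x,y,z) \log \frac{p(x,y|z)}{p^{*}(x|z)\, p^{*}(y|z)},
\quad \text{where } p^{*}(x|z) = \sum_{y} p(y)\, p(x|y,z), \quad p^{*}(y|z) = \sum_{x} p(x)\, p(y|x,z).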

Usage

PMI.plugin(probs, unit = c("log", "log2", "log10"))

Arguments

probs

the joint probability distribution table of three random variables.

unit

the base of the logarithm. The default is "log", the natural logarithm. To measure the result in bits, set the unit to "log2".
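The unit only rescales the estimate by a constant factor: a value in nats divided by log(2) gives the value in bits. A minimal sketch of this relationship (the 2 x 2 x 2 probability array here is hypothetical illustrative data, assuming PMI.plugin accepts any three-dimensional probability array such as the one freqs.empirical produces):

library("Informeasure")

# hypothetical joint probability table of three binary variables (sums to 1)
probs <- array(c(0.10, 0.05, 0.10, 0.15,
                 0.15, 0.10, 0.05, 0.30),
               dim = c(2, 2, 2))

pmi_nats <- PMI.plugin(probs)                 # natural logarithm (default)
pmi_bits <- PMI.plugin(probs, unit = "log2")  # base-2 logarithm
all.equal(pmi_bits, pmi_nats / log(2))        # TRUE: units differ by log(2)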

Value

PMI.plugin returns the part mutual information.

References

Zhao, J., Zhou, Y., Zhang, X., & Chen, L. (2016). Part mutual information for quantifying direct associations in networks. Proceedings of the National Academy of Sciences of the United States of America, 113(18), 5130-5135.

Examples

library("Informeasure")
library("entropy")

# three numeric vectors corresponding to three continuous random variables
x <- c(0.0, 0.2, 0.2, 0.7, 0.9, 0.9, 0.9, 0.9, 1.0)
y <- c(1.0, 2.0, 12.0, 8.0, 1.0, 9.0, 0.0, 3.0, 9.0)
z <- c(3.0, 7.0, 2.0, 11.0, 10.0, 10.0, 14.0, 2.0, 11.0)

# corresponding joint count table estimated by the "uniform width" algorithm
count_xyz <- discretize3D(x, y, z, "uniform_width")

# the joint probability distribution table estimated from the count data
probs_xyz <- freqs.empirical(count_xyz)

# corresponding part mutual information
PMI.plugin(probs_xyz)
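For context, the reference above motivates PMI as an improvement over conditional mutual information (CMI) for quantifying direct associations. Assuming the companion plug-in estimator CMI.plugin exported by the same package, the two measures can be compared on the same table:

# conditional mutual information on the same joint distribution, for comparison
CMI.plugin(probs_xyz)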
