information_gain: Estimating information gain between two categorical...


View source: R/information_gain.R

Description

Information gain (also called mutual information) is a measure of the mutual dependence between two variables (see https://en.wikipedia.org/wiki/Mutual_information).
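The quantity can be computed from Shannon entropies as I(X; Y) = H(X) + H(Y) - H(X, Y). A minimal sketch of that identity in base R (an illustration only, not the package's implementation; the helper names are hypothetical):

```r
## Shannon entropy (in bits) of a discrete probability vector.
shannon_entropy <- function(p) {
  p <- p[p > 0]          # 0 * log(0) is taken as 0
  -sum(p * log2(p))
}

## Mutual information of two factors via I(X;Y) = H(X) + H(Y) - H(X,Y).
mutual_information <- function(x, y) {
  px  <- table(x) / length(x)       # marginal distribution of x
  py  <- table(y) / length(y)       # marginal distribution of y
  pxy <- table(x, y) / length(x)    # joint distribution of (x, y)
  shannon_entropy(px) + shannon_entropy(py) - shannon_entropy(pxy)
}

mutual_information(factor(c(0, 1)), factor(c(1, 0)))          # 1 bit: fully dependent
mutual_information(factor(c(0, 0, 1, 1)), factor(c(0, 1, 0, 1)))  # 0 bits: independent
```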

Usage

information_gain(x, y)

IG(x, y)

Arguments

x

A factor representing a categorical variable.

y

A factor representing a categorical variable.

Value

Information gain estimate based on Shannon entropy for variables x and y.

Examples

information_gain(factor(c(0,1)), factor(c(1,0)))
information_gain(factor(c(0,0,1,1)), factor(c(0,1,1,1)))
information_gain(factor(c(0,0,1,1)), factor(c(0,1,0,1)))
## Not run: 
information_gain(c(0,1), c(1,0))  # error: x and y must be factors, not numeric vectors

## End(Not run)

msu documentation built on Sept. 30, 2017, 5:05 p.m.