information_gain: Calculate the Information Gain for a set split

Description Usage Arguments Value See Also

View source: R/script.R

Description

information_gain calculates the Information Gain resulting from splitting a set. This is the difference between the entropy of the entire set and the weighted sum of the entropies of the subsets produced by the split.

Usage

information_gain(x, y)

Arguments

x

the target vector, i.e. the entire set of labels (target variable values)

y

the splitting vector, i.e. the vector of values of the splitting attribute

Value

The calculated Information Gain (IG)
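
A minimal sketch of the computation described above, assuming the standard Shannon-entropy definition of Information Gain; the function names `entropy_sketch` and `information_gain_sketch` are illustrative, not the package's actual implementation:

```r
# Shannon entropy of a label vector (base-2 logarithm)
entropy_sketch <- function(x) {
  p <- table(x) / length(x)   # relative frequency of each label
  -sum(p * log2(p))
}

# IG = entropy of the full set minus the weighted entropies
# of the subsets induced by each value of the splitting vector y
information_gain_sketch <- function(x, y) {
  subset_entropy <- sapply(split(x, y), entropy_sketch)
  weights <- table(y) / length(y)   # subset sizes as proportions
  entropy_sketch(x) - sum(weights * subset_entropy)
}

# Example: six labels split by a two-valued attribute
labels <- c("yes", "yes", "no", "no", "yes", "no")
attr   <- c("a",   "a",   "a",  "b",  "b",   "b")
information_gain_sketch(labels, attr)   # approx. 0.082
```

Both subsets here keep a 2:1 label mix, so the split barely reduces entropy, which is why the resulting gain is small.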

See Also

entropy, best_information_gain


teofanan/classprobtree documentation built on May 17, 2019, 5:53 p.m.