learner.xgboost: Basic xgboost learner


View source: R/learner.xgboost.R

Description

Parallel tuning function for an xgboost learner that operates either on the features selected by glmnet (lasso), the features selected by Boruta, or the whole dataset.

Usage

learner.xgboost(data = data_train_numeric_clean_imputed, lasso = FALSE,
  boruta = FALSE)

Arguments

data

Input data; defaults to the numeric, imputed, and cleaned training dataset.

lasso

Boolean flag; if TRUE, data is reduced to the features selected by feature.lasso(), which are stored in the variable features_lasso.

boruta

Boolean flag; if TRUE, data is reduced to the features selected by boruta.lasso(), which are stored in the variable features_boruta.
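
For illustration, the two flags presumably subset the columns of data before tuning and guard against being combined. A minimal sketch of that logic inside the function body, assuming features_lasso and features_boruta are character vectors of column names:

if (lasso && boruta) return(0)               # invalid combination, see Value
if (lasso)  data <- data[, features_lasso]   # keep lasso-selected features
if (boruta) data <- data[, features_boruta]  # keep Boruta-selected features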

Details

The default execution uses the whole training dataset. Setting either the lasso or the boruta parameter to TRUE reduces the number of features according to the results of the corresponding feature selection. The number of boosting rounds is fixed at 500, because including that parameter in the set of tuning parameters leads to worse results. The xgboost learner is wrapped in a filter wrapper that uses the chi-squared statistic as its feature selection method. The result is evaluated with three-fold cross-validation over at most 1000 experiments, using irace as the tuning control structure.
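
This setup corresponds closely to the mlr tuning API. The sketch below shows one way such a configuration could be assembled; it is an illustration under stated assumptions, not the package's actual implementation. In particular, the task target ("SalePrice"), the tuned parameter ranges, and the two-CPU parallelMap backend are assumptions:

library(mlr)
library(parallelMap)

# Regression task on the default training data;
# the target column name "SalePrice" is an assumption.
task <- makeRegrTask(data = data_train_numeric_clean_imputed,
                     target = "SalePrice")

# Fix the number of boosting rounds at 500 instead of tuning it.
lrn <- makeLearner("regr.xgboost", nrounds = 500L)

# Wrap the learner in a chi-squared filter so the fraction of
# retained features (fw.perc) is tuned alongside the booster parameters.
lrn <- makeFilterWrapper(lrn, fw.method = "chi.squared")

# Illustrative tuning space; the actual ranges are assumptions.
ps <- makeParamSet(
  makeNumericParam("eta", lower = 0.01, upper = 0.3),
  makeIntegerParam("max_depth", lower = 2L, upper = 10L),
  makeNumericParam("fw.perc", lower = 0.1, upper = 1)
)

# irace as the tuning control, capped at 1000 experiments; each
# configuration is evaluated with three-fold cross-validation (cv3).
ctrl <- makeTuneControlIrace(maxExperiments = 1000L)

parallelStartSocket(cpus = 2)  # parallel evaluation of the experiments
res <- tuneParams(lrn, task = task, resampling = cv3,
                  par.set = ps, control = ctrl)
parallelStop()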

Value

0 as an error output if both the lasso and the boruta flag are set to TRUE.

Examples

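A minimal illustration of the three modes (the default training data object must be available in the package environment):

learner.xgboost()                # tune on the whole training dataset
learner.xgboost(lasso = TRUE)    # tune on the lasso-selected features
learner.xgboost(boruta = TRUE)   # tune on the Boruta-selected features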
