ja-thomas/autoxgboostMC: Automatic Preprocessing, Fitting and Tuning for xgboost using multiple criteria.

Automatic tuning and fitting of 'xgboost'. Early stopping determines the optimal number of boosting iterations, and Bayesian optimization (via 'mlrMBO') tunes all remaining hyperparameters. In classification tasks, class weights and decision thresholds are tuned as well. Categorical features are handled efficiently by either impact encoding or dummy encoding, chosen based on the number of factor levels.

Getting started
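A typical mlr-based workflow might look like the sketch below. Note that the R6 class name `AutoxgboostMC`, the `measures` argument, and the `fit()`/`predict()` methods are assumptions inferred from the package description, not confirmed API:

```r
library(mlr)            # task definitions and performance measures
library(autoxgboostMC)

# Binary classification task from a built-in dataset
# (droplevels removes the unused third Species level)
task = makeClassifTask(data = droplevels(iris[1:100, ]), target = "Species")

# Hypothetical multi-criteria setup: jointly optimize AUC and prediction time
axgb = AutoxgboostMC$new(task, measures = list(auc, timepredict))
axgb$fit(time_budget = 60L)   # tune for roughly one minute
pred = axgb$predict(task)
```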

Package details

License: BSD_2_clause + file LICENSE
Package repository: View on GitHub
Installation: Install the latest version of this package by entering the following in R:
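The original page omits the actual command. The standard way to install a GitHub-hosted R package is via the `remotes` package; the sketch below assumes the package lives on the repository's default branch:

```r
# Install the remotes helper if it is not already available
install.packages("remotes")

# Install autoxgboostMC directly from GitHub
remotes::install_github("ja-thomas/autoxgboostMC")
```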
ja-thomas/autoxgboostMC documentation built on May 17, 2019, 4:22 a.m.