ja-thomas/autoxgboost: Automatic tuning and fitting of xgboost

Automatic tuning and fitting of 'xgboost'. Uses early stopping to determine the optimal number of boosting iterations and Bayesian optimization (from 'mlrMBO') for all other hyperparameters. In classification, class weights and prediction thresholds are also tuned. Categorical features are handled efficiently by either impact encoding or dummy encoding, depending on the number of factor levels.

Getting started
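
A minimal usage sketch based on the mlr/mlrMBO workflow the package builds on. The exact autoxgboost() arguments shown here (control, tune.threshold) and the predict() interface are assumptions for illustration; check the package help pages for the definitive signatures.

library(mlr)         # task construction
library(mlrMBO)      # Bayesian optimization control
library(autoxgboost)

# Define a classification task on a built-in dataset.
iris.task <- makeClassifTask(data = iris, target = "Species")

# Configure the mlrMBO optimizer; a small iteration budget keeps the run short.
ctrl <- makeMBOControl()
ctrl <- setMBOControlTermination(ctrl, iters = 5L)

# Tune and fit xgboost; threshold tuning is disabled here for speed (assumed argument).
res <- autoxgboost(iris.task, control = ctrl, tune.threshold = FALSE)
print(res)

# Predict on new data (here the training data, for illustration).
preds <- predict(res, newdata = iris)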

Package details

Maintainer:
License: BSD_2_clause + file LICENSE
Version: 0.0.0.9000
Package repository: View on GitHub
Installation

Install the latest version of this package by entering the following in R:
install.packages("remotes")
remotes::install_github("ja-thomas/autoxgboost")
ja-thomas/autoxgboost documentation built on April 9, 2020, 11:10 p.m.