
# autoxgboostMC: Multiple-criteria tuning and fitting of xgboost models


## General overview

autoxgboost aims to find an optimal xgboost model automatically, using the machine learning framework mlr and the Bayesian optimization framework mlrMBO.

**Work in progress!**

AutoxgboostMC embraces R6 for a cleaner design. See the example code below for the new API.
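
To run the example, the development version can be installed from GitHub. This is a sketch using the standard `remotes` package; the repository name is taken from this page's footer:

```r
# Install the development version from GitHub.
# Assumes the 'remotes' package is available;
# run install.packages("remotes") first if it is not.
remotes::install_github("ja-thomas/autoxgboostMC")
```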

```r
# Instantiate the object with a task and a list of measures to optimize.
axgb = AutoxgboostMC$new(pid.task, measures = list(auc, timepredict))
# Set hyperparameters.
axgb$set_nthread(2L)
# Fit within the given time budget.
axgb$fit(time.budget = 5L)
# Predict on a task.
p = axgb$predict(pid.task)
```

## autoxgboost: how to cite

The Automatic Gradient Boosting framework was presented at the ICML/IJCAI-ECAI 2018 AutoML Workshop (poster). Please cite our ICML AutoML workshop paper, available on arXiv. You can get citation info via `citation("autoxgboost")` or copy the following BibTeX entry:

```bibtex
@inproceedings{autoxgboost,
  title={Automatic Gradient Boosting},
  author={Thomas, Janek and Coors, Stefan and Bischl, Bernd},
  booktitle={International Workshop on Automatic Machine Learning at ICML},
  year={2018}
}
```


---

ja-thomas/autoxgboostMC documentation built on May 17, 2019, 4:22 a.m.