Installing the development version
devtools::install_github("ja-thomas/autoxgboostMC")
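A minimal install sketch: it assumes devtools is the chosen installer and bootstraps it from CRAN if missing (the `requireNamespace` guard is an addition, not part of the original instructions).

```r
# Install devtools from CRAN if it is not already available,
# then fetch the development version of autoxgboostMC from GitHub.
if (!requireNamespace("devtools", quietly = TRUE)) {
  install.packages("devtools")
}
devtools::install_github("ja-thomas/autoxgboostMC")
```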
autoxgboost aims to find an optimal xgboost model automatically, using the machine learning framework mlr and the Bayesian optimization framework mlrMBO.
Work in progress!
AutoxgboostMC embraces R6 for a cleaner design.
See the example code below for the new API.
# Instantiate the object with a list of measures to optimize.
axgb = AutoxgboostMC$new(pid.task, measures = list(auc, timepredict))
# Set hyperparameters
axgb$set_nthread(2L)
# Fit on a Task
axgb$fit(time.budget = 5L)
p = axgb$predict(pid.task)
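Since AutoxgboostMC builds on mlr, the prediction returned above can presumably be inspected and scored with mlr's standard tooling. This is a sketch under the assumption that `p` is an ordinary mlr Prediction object; it is not part of the documented AutoxgboostMC API.

```r
library(mlr)
# Assuming p is a standard mlr Prediction object: the predicted
# values are stored in p$data, and performance() scores them
# against any mlr measure, e.g. AUC.
head(p$data)
performance(p, measures = auc)
```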
The Automatic Gradient Boosting framework was presented at the ICML/IJCAI-ECAI 2018 AutoML Workshop (poster).
Please cite our ICML AutoML workshop paper on arXiv.
You can get citation info via citation("autoxgboost")
or copy the following BibTeX entry:
@inproceedings{autoxgboost,
title={Automatic Gradient Boosting},
author={Thomas, Janek and Coors, Stefan and Bischl, Bernd},
booktitle={International Workshop on Automatic Machine Learning at ICML},
year={2018}
}