Description

Train an eXtreme Gradient Boosting model [Alpha].

Arguments
data
The dataset to train the xgboost model on. Should be a list containing the training, validation and test sets (if applicable).

response
The type of xgboost model to train; currently supports either 'r' (regression) or 'c' (classification).

dataConfig
A list of data preprocessing configuration settings. Refer to the details for more information.

paramConfig
A list giving the bounds of the xgboost parameters to optimise over.

nrounds
The number of rounds to run xgboost for. This should be set higher if eta is fairly low.

earlyStopRatio
The early stopping condition: xgboost stops training if it fails to find a better validation score within this fraction of the total number of rounds.

objective
The objective for the xgboost model. Leave NULL for automatic determination. Default: NULL.

eval_metric
The evaluation metric for the xgboost model. Leave NULL for automatic determination. Default: NULL.

onlyDataInd
A logical indicating whether to stop after data preprocessing instead of building the model. Can be used to derive the test dataset using the same transformation as the training dataset. Default: FALSE.

opt.iter
The number of Bayesian optimisation iterations to perform. Default: 10.

opt.initialpts
The number of points used to initialise the Bayesian optimisation. Default: 10.

opt.savemodel
A logical indicating whether to save the models trialled during the Bayesian optimisation process. Default: TRUE.

train.proportion
The train/validation splitting ratio. Note that this split is not stratified (future enhancement).
Value

A list containing two elements: Models, which contains all the xgboost models built as part of the hyperparameter optimisation, and Results, which contains the Bayesian optimisation output.
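
The page does not show the function's name or its usage block, so the following is a minimal sketch of how a call might look, assuming a hypothetical exported name trainXGB; the exact structure of dataConfig and paramConfig is also an assumption (see the details for the real settings).

```r
# Hypothetical usage sketch: the function name (trainXGB) and the
# contents of dataConfig/paramConfig are assumptions, not confirmed
# by this documentation page.
fit <- trainXGB(
  data             = myData,   # list holding training/validation/test sets
  response         = "c",      # 'c' = classification, 'r' = regression
  dataConfig       = list(),   # data preprocessing settings (see details)
  paramConfig      = list(),   # bounds of the parameters to optimise over
  nrounds          = 500,      # set higher when eta is low
  earlyStopRatio   = 0.1,      # stop if no improvement within this fraction of rounds
  opt.iter         = 10,       # Bayesian optimisation iterations
  opt.initialpts   = 10,       # initial points for the optimiser
  opt.savemodel    = TRUE,     # keep every trialled model
  train.proportion = 0.8       # train/validation split (not stratified)
)

# The return value is a two-element list, per the Value section above:
fit$Models    # all xgboost models built during the hyperparameter search
fit$Results   # the Bayesian optimisation output
```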