Run an automatic Bayesian optimisation of xgboost hyperparameters
dtrain
    The training data for the optimisation.
x
    A character vector that identifies the feature columns.
y
    A string that identifies the label column.
w
    A string that identifies the weight column. Defaults to NULL.
base_margin
    A string that identifies the base_margin (offset) column. Defaults to NULL.
xgbParams
    A list of extra parameters to be passed to xgb.cv.
nrounds
    The maximum number of boosting iterations.
early_stopping_rounds
    If set to an integer k, training with a validation set stops if performance does not improve for k rounds.
nfold
    The number of cross-validation folds: the original dataset is randomly partitioned into nfold equal-sized subsamples.
folds
    A list of pre-defined CV folds; each element must be a vector of test fold indices.
verbose
    Whether or not to print progress.
seed
    Random seed, for reproducibility.
maximize
    Whether the loss function should be maximized.
bounds
    A named list of lower and upper bounds for each hyperparameter. Use the "L" suffix to mark an integer hyperparameter.
init_points
    Number of randomly chosen points at which to evaluate the target function before the Bayesian optimisation fits the Gaussian process.
n_iter
    Total number of Bayesian optimisation iterations.
init_grid_dt
    User-specified points at which to sample the target function; must be a data.frame or data.table with the same column names as bounds.
...
    Additional arguments to be passed to BayesOptim.
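To make the argument list concrete, a call might look like the sketch below. The help page does not show the function's exported name, so `xgb_bayes_opt` here is a placeholder for whatever this package actually exports, and the data preparation is purely illustrative; the argument names and the "L"-suffix integer convention for `bounds` follow the descriptions above.

```r
library(data.table)

# Illustrative only: xgb_bayes_opt is an assumed name for the optimiser
# documented above, since the page does not state the real function name.
dt <- as.data.table(mtcars)

result <- xgb_bayes_opt(
  dtrain = dt,
  x = setdiff(names(dt), "am"),            # feature columns
  y = "am",                                # label column
  xgbParams = list(objective = "binary:logistic",
                   eval_metric = "auc"),   # extra args for xgb.cv
  nrounds = 100L,
  early_stopping_rounds = 10L,
  nfold = 5L,
  seed = 42L,
  maximize = TRUE,                         # AUC: higher is better
  bounds = list(
    max_depth = c(2L, 8L),                 # "L" suffix => integer parameter
    eta       = c(0.01, 0.3),
    subsample = c(0.5, 1.0)
  ),
  init_points = 5L,                        # random points before GP fitting
  n_iter = 10L                             # Bayesian optimisation iterations
)
```

If you already know promising configurations, pass them via `init_grid_dt` as a data.table whose column names match those of `bounds`, and they will be evaluated alongside the `init_points` random samples.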