xgb_autotune: Automatic Bayesian Optimisation of xgboost

View source: R/xgb_autotune.R

Description

Run an automatic Bayesian optimisation of xgboost hyperparameters.

Usage

xgb_autotune(dtrain, x, y, w = NULL, base_margin = NULL, xgbParams, nrounds,
  early_stopping_rounds, nfold, folds = NULL, verbose = TRUE, seed = 1921,
  maximize = FALSE, bounds, init_points, n_iter, init_grid_dt = NULL, ...)
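
A minimal sketch of a call, shown for orientation only: the train data, its column names, and every parameter value below are hypothetical, not recommendations.

## Hypothetical call: train, its columns, and all values are illustrative
tuned <- xgb_autotune(
  dtrain = train,                                   # training data
  x = list("age", "income"),                        # feature columns
  y = "target",                                     # label column
  xgbParams = list(objective = "binary:logistic"),  # extra params for xgb.cv
  nrounds = 500,                                    # max boosting iterations
  early_stopping_rounds = 25,
  nfold = 5,
  bounds = list(
    max_depth = c(2L, 10L),                         # "L" suffix marks an integer
    eta = c(0.01, 0.3)
  ),
  init_points = 5,                                  # random points before the GP fit
  n_iter = 20                                       # optimisation iterations
)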

Arguments

dtrain

The training data for the optimisation.

x

A list that identifies the feature columns.

y

A string that identifies the label

w

A string that identifies the weight column. Defaults to NULL.

base_margin

A string that identifies the base_margin (offset) column. Defaults to NULL.

xgbParams

A list of extra parameters to be passed to xgb.cv.

nrounds

The maximum number of boosting iterations.

early_stopping_rounds

If set to an integer k, training with a validation set will stop if the performance doesn't improve for k rounds.

nfold

The original dataset is randomly partitioned into nfold equal-size subsamples.

folds

A list of pre-defined CV folds; each element must be a vector of test-fold indices. Defaults to NULL (see Examples below).

verbose

Whether or not to print progress.

seed

The random seed, for reproducibility. Defaults to 1921.

maximize

Whether the evaluation metric should be maximised. Defaults to FALSE.

bounds

A named list of lower and upper bounds for each hyperparameter to tune. Use the "L" suffix to indicate an integer hyperparameter (see Examples below).

init_points

Number of randomly chosen points at which to sample the target function before the Bayesian optimisation fits the Gaussian process.

n_iter

Total number of Bayesian optimisation iterations to run.

init_grid_dt

User-specified points at which to sample the target function; a data.frame or data.table with the same column names as bounds (see Examples below).

...

Additional arguments to be passed to BayesOptim.
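
Examples

A sketch of building the folds and init_grid_dt arguments, reusing the hypothetical train data and bounds from the Usage example; note that init_grid_dt must share its column names with bounds.

## Hypothetical: five pre-defined CV folds, each element a vector of test-row indices
folds <- split(sample(seq_len(nrow(train))), rep(1:5, length.out = nrow(train)))

## Hypothetical starting points; column names match names(bounds)
init_grid_dt <- data.frame(
  max_depth = c(4L, 8L),
  eta       = c(0.05, 0.10)
)

tuned <- xgb_autotune(
  dtrain = train, x = list("age", "income"), y = "target",
  xgbParams = list(objective = "binary:logistic"),
  nrounds = 500, early_stopping_rounds = 25,
  nfold = 5, folds = folds,       # xgb.cv ignores nfold when folds is supplied
  bounds = list(max_depth = c(2L, 10L), eta = c(0.01, 0.3)),
  init_grid_dt = init_grid_dt,    # user-chosen points evaluated before the GP fit
  init_points = 5, n_iter = 20
)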

