xgb_opt: Bayesian Optimization for XGBoost


Description

This function tunes hyperparameters for xgboost using Bayesian optimization.

Usage

xgb_opt(train_data, train_label, test_data, test_label, objectfun, evalmetric,
  eta_range = c(0.1, 1L), max_depth_range = c(4L, 6L),
  nrounds_range = c(70, 160L), subsample_range = c(0.1, 1L),
  bytree_range = c(0.4, 1L), init_points = 4, n_iter = 10, acq = "ei",
  kappa = 2.576, eps = 0, optkernel = list(type = "exponential", power = 2),
  classes = NULL)

Arguments

train_data

A data frame used to train xgboost

train_label

The column of class labels in the training data

test_data

A data frame used to test xgboost

test_label

The column of class labels in the test data

objectfun

Specifies the learning task and the corresponding learning objective (a call sketch for binary classification follows this list):

  • reg:linear linear regression (Default).

  • reg:logistic logistic regression.

  • binary:logistic logistic regression for binary classification. Output probability.

  • binary:logitraw logistic regression for binary classification, output score before logistic transformation.

  • multi:softmax set xgboost to do multiclass classification using the softmax objective. Class is represented by a number and should be from 0 to num_class - 1.

  • multi:softprob same as softmax, but prediction outputs a vector of ndata * nclass elements, which can be further reshaped to ndata, nclass matrix. The result contains predicted probabilities of each data point belonging to each class.

  • rank:pairwise set xgboost to do ranking task by minimizing the pairwise loss.
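
For example, a binary classification task would pass "binary:logistic". A minimal call sketch, assuming hypothetical data frames bin_train and bin_test whose label column holds 0/1 values (placeholders, not datasets shipped with MlBayesOpt):

## Hypothetical sketch: bin_train, bin_test, and label are placeholders
res_bin <- xgb_opt(train_data = bin_train,
                   train_label = label,
                   test_data = bin_test,
                   test_label = label,
                   objectfun = "binary:logistic",
                   evalmetric = "auc")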

evalmetric

Evaluation metric for validation data. Users can pass a self-defined function to it (see the sketch after this list). By default, the metric is assigned according to the objective: rmse for regression, error for classification, and mean average precision for ranking.

  • error binary classification error rate

  • rmse Root mean square error

  • logloss negative log-likelihood function

  • auc Area under curve

  • merror Exact matching error, used to evaluate multi-class classification
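
Since a self-defined function is accepted, an evaluation function can be written in xgboost's feval style: a function of the predictions and the xgb.DMatrix that returns the metric's name and value. A sketch under that assumption (whether xgb_opt forwards it unchanged to xgboost's feval interface is an assumption here):

## Hypothetical custom metric following xgboost's feval convention
my_error <- function(preds, dtrain) {
  labels <- xgboost::getinfo(dtrain, "label")
  # misclassification rate at a 0.5 probability threshold
  err <- mean(as.numeric(preds > 0.5) != labels)
  list(metric = "my_error", value = err)
}

It would then be passed as evalmetric = my_error.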

eta_range

The range of eta

max_depth_range

The range of max_depth

nrounds_range

The range of nrounds

subsample_range

The range of subsample rate

bytree_range

The range of colsample_bytree rate
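
Each of these range arguments is a length-two vector of lower and upper bounds; an L suffix marks an integer bound, which keeps integer-valued parameters such as max_depth on whole numbers. A sketch widening the default search space, reusing the data from the Examples section (the bounds themselves are illustrative):

res_wide <- xgb_opt(train_data = fashion_train,
                    train_label = y,
                    test_data = fashion_test,
                    test_label = y,
                    objectfun = "multi:softmax",
                    evalmetric = "merror",
                    classes = 10,
                    eta_range = c(0.01, 0.5),
                    max_depth_range = c(3L, 10L),
                    nrounds_range = c(50, 300L),
                    subsample_range = c(0.5, 1L),
                    bytree_range = c(0.3, 1L))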

init_points

Number of randomly chosen points at which to sample the target function before the Bayesian optimization fits the Gaussian process.

n_iter

Total number of times the Bayesian optimization is repeated.

acq

Acquisition function type to be used. Can be "ucb", "ei" or "poi".

  • ucb GP Upper Confidence Bound

  • ei Expected Improvement

  • poi Probability of Improvement

kappa

Tunable parameter kappa of the GP Upper Confidence Bound, balancing exploitation against exploration; increasing kappa makes the search favor exploration.

eps

Tunable parameter epsilon of Expected Improvement and Probability of Improvement, balancing exploitation against exploration; increasing epsilon makes the sampled hyperparameters more spread out across the whole range.
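
For example, with acq = "ucb" a larger kappa explores more, while with acq = "ei" or "poi" a positive eps plays the same role. A sketch reusing the data from the Examples section:

## More exploratory UCB search (default kappa is 2.576);
## with acq = "ei" or "poi", raise eps instead
res_ucb <- xgb_opt(train_data = fashion_train,
                   train_label = y,
                   test_data = fashion_test,
                   test_label = y,
                   objectfun = "multi:softmax",
                   evalmetric = "merror",
                   classes = 10,
                   acq = "ucb",
                   kappa = 5)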

optkernel

Kernel (aka correlation function) for the underlying Gaussian process. This parameter should be a list that specifies the type of correlation function along with the smoothness parameter. Popular choices are the squared exponential (default) and Matérn 5/2.
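
A sketch of requesting a Matérn 5/2 kernel instead of the default, assuming the list is passed through to the underlying Gaussian process fit (in GPfit, on which rBayesianOptimization builds, the Matérn kernel takes a smoothness parameter nu):

## Matérn 5/2 correlation function instead of the default squared exponential
matern52 <- list(type = "matern", nu = 5/2)
## then pass optkernel = matern52 in the xgb_opt() call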

classes

The number of classes. Use only with multiclass objectives.

Value

The test accuracy and a list of Bayesian optimization results are returned.
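
A sketch of inspecting the returned list, assuming it follows the structure of rBayesianOptimization::BayesianOptimization (this structure is an assumption, not stated above):

## res0 is the object created in the Examples section below
res0$Best_Par    # named vector: best hyperparameter set found
res0$Best_Value  # best value of the evaluation metric
res0$History     # history of the optimization run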

Examples

## Not run: 
library(MlBayesOpt)

set.seed(71)
res0 <- xgb_opt(train_data = fashion_train,
                train_label = y,
                test_data = fashion_test,
                test_label = y,
                objectfun = "multi:softmax",
                evalmetric = "merror",
                classes = 10,
                init_points = 3,
                n_iter = 1)

## End(Not run)
