cvboost: Gradient boosting for regression and classification with cross validation

Description Usage Arguments Value Examples

View source: R/cvboost.R

Description

Gradient boosting for regression and classification with cross validation to search for hyper-parameters (implemented with xgboost)

Usage

cvboost(
  x,
  y,
  weights = NULL,
  k_folds = NULL,
  objective = c("reg:squarederror", "binary:logistic"),
  ntrees_max = 1000,
  num_search_rounds = 10,
  print_every_n = 100,
  early_stopping_rounds = 10,
  nthread = NULL,
  verbose = FALSE
)

Arguments

x

the input features

y

the observed response (real-valued)

weights

weights for the input if doing weighted regression/classification; if NULL, no weights are used

k_folds

number of folds used in cross validation

objective

choose from either "reg:squarederror" for regression or "binary:logistic" for logistic regression

ntrees_max

the maximum number of trees to grow for xgboost

num_search_rounds

the number of randomly sampled hyperparameter combinations to evaluate with cross validation on xgboost trees

print_every_n

the interval, in iterations (one tree is grown per iteration), at which the code prints out progress information

early_stopping_rounds

the number of consecutive rounds without improvement in the test error after which cross validation stops, when searching for the optimal number of trees

nthread

the number of threads to use. The default is NULL, which uses all available threads. Note that this does not apply when Bayesian optimization is used to search for hyperparameters.

verbose

boolean; whether to print training statistics

Value

a cvboost object

Examples

## Not run: 
n = 100; p = 10

x = matrix(rnorm(n*p), n, p)
y = pmax(x[,1], 0) + x[,2] + pmin(x[,3], 0) + rnorm(n)

fit = cvboost(x, y, objective="reg:squarederror")
est = predict(fit, x)

## End(Not run)
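The same interface can be used for binary outcomes via the "binary:logistic" objective. A minimal sketch, assuming that predict then returns fitted probabilities (the usual xgboost convention for this objective); the k_folds and num_search_rounds values below are illustrative, not recommendations:

```r
## Not run: 
# Binary classification sketch (illustrative settings, not defaults)
n = 100; p = 10

x = matrix(rnorm(n * p), n, p)
y = as.numeric(x[, 1] + rnorm(n) > 0)  # binary response in {0, 1}

fit = cvboost(x, y,
              objective = "binary:logistic",
              k_folds = 5,
              num_search_rounds = 20)
prob = predict(fit, x)  # assumed to be predicted probabilities

## End(Not run)
```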

xnie/rlearner documentation built on April 11, 2021, 12:49 a.m.