tune_xgbp: Tune XGBoost model

View source: R/tuning.R

tune_xgbp    R Documentation

Tune XGBoost model

Description

tune_xgbp tunes the parameters of XGBoost to improve first-stage predictions and avoid overfitting in the XGBP.

Usage

tune_xgbp(
  survey,
  ...,
  dep_var = NULL,
  nrounds = 100,
  nthread = 1,
  n_iter = 25,
  seed = NULL
)

Arguments

survey

A tibble created by the xgbp function (passed internally).

...

Individual- and group-level covariates used in the poststratification. All variables must be included in both the survey and the census, and must be passed unquoted to the function call.

dep_var

Dependent variable. Must be a character or factor vector.

nrounds

Number of trees (rounds) used to train the model. Defaults to 100.

nthread

Number of threads used in the computation. Defaults to 1, but users are encouraged to increase this number to speed up computations (the limit is the number of threads available on your machine).

n_iter

Number of parameter combinations to sample during the grid search (used when tune = TRUE). Defaults to 25.

seed

A seed for replication. Defaults to NULL.

Value

A list with two elements:

  • A list with the best parameters selected during the grid search

  • A vector with the optimal number of trees selected during the grid search
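Examples

A minimal sketch of how tune_xgbp might be called, assuming a survey tibble prepared by xgbp with illustrative covariates age, sex, and region and a factor outcome vote (all names are hypothetical; in normal use this function is invoked internally by xgbp):

```r
library(xgbp)

# Hypothetical call: tune over 25 sampled parameter combinations,
# using 4 threads and a fixed seed for replication
tuned <- tune_xgbp(
  survey,                 # tibble created by the xgbp function
  age, sex, region,       # covariates, passed unquoted
  dep_var = "vote",       # character or factor dependent variable
  nrounds = 100,          # maximum number of trees per round
  nthread = 4,            # increase to speed up computation
  n_iter  = 25,           # parameter combinations sampled in the grid search
  seed    = 44
)

# First element: best parameters; second: optimal number of trees
tuned[[1]]
tuned[[2]]
```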


meirelesff/xgbp documentation built on Sept. 24, 2022, 1:48 p.m.