View source: R/utilsxgboost.R
xgboost_impl  R Documentation 
Wrapper for parsnip::xgb_train
Usage

xgboost_impl(
  x,
  y,
  max_depth = 6,
  nrounds = 15,
  eta = 0.3,
  colsample_bynode = NULL,
  colsample_bytree = NULL,
  min_child_weight = 1,
  gamma = 0,
  subsample = 1,
  validation = 0,
  early_stop = NULL,
  objective = NULL,
  counts = TRUE,
  event_level = c("first", "second"),
  ...
)

Arguments
x 
A data frame or matrix of predictors.
y 
A vector (factor or numeric) or matrix (numeric) of outcome data. 
max_depth 
An integer for the maximum depth of the tree. 
nrounds 
An integer for the number of boosting iterations. 
eta 
A numeric value between zero and one to control the learning rate. 
colsample_bynode 
Subsampling proportion of columns for each node within each tree. See the counts argument below. The default uses all columns.
colsample_bytree 
Subsampling proportion of columns for each tree. See the counts argument below. The default uses all columns.
min_child_weight 
A numeric value for the minimum sum of instance weights needed in a child to continue to split. 
gamma 
A number for the minimum loss reduction required to make a further partition on a leaf node of the tree.
subsample 
Subsampling proportion of rows. By default, all of the training data are used. 
validation 
A positive number. If on [0, 1), the value is the proportion of data in x and y that is used for performance assessment and potential early stopping.
early_stop 
An integer or NULL. If not NULL, it is the number of training iterations without improvement before stopping. If validation is used, performance is assessed on the validation set; otherwise, the training set is used.
objective 
A single string (or NULL) that defines the loss function that xgboost uses to create trees.
counts 
A logical. If FALSE, colsample_bynode and colsample_bytree are interpreted as proportions of the columns (rather than counts of columns).
event_level 
For binary classification, this is a single string of either "first" or "second" to specify which level of the outcome should be considered the "event".
... 
Other options to pass to xgboost::xgb.train().
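As a usage sketch (hypothetical data and parameter values; assumes the xgboost package is installed and xgboost_impl() is accessible in the current session, e.g. via its package namespace if the function is internal):

```r
# Regression sketch: predict mpg from the remaining mtcars columns.
x <- as.matrix(mtcars[, -1])
y <- mtcars$mpg

# Basic fit with shallower trees and a lower learning rate.
fit <- xgboost_impl(x, y, max_depth = 3, nrounds = 50, eta = 0.1)

# Hold out 10% of rows for validation and stop after 5 iterations
# without improvement on that holdout set.
fit_es <- xgboost_impl(
  x, y,
  nrounds = 500,
  validation = 0.1,
  early_stop = 5
)
```

Because the wrapper forwards unrecognized arguments through ..., additional xgb.train() options can be supplied the same way.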