Description

xgb_train() is a wrapper for fitting boosted tree models with xgboost in which all of the model arguments are in the main function.
Arguments

x
A data frame or matrix of predictors.

y
A vector (factor or numeric) or matrix (numeric) of outcome data.

max_depth
An integer for the maximum depth of the tree.

nrounds
An integer for the number of boosting iterations.

eta
A numeric value between zero and one to control the learning rate.

colsample_bynode
Subsampling proportion of columns for each node within each tree. See the counts argument below.

colsample_bytree
Subsampling proportion of columns for each tree. See the counts argument below.

min_child_weight
A numeric value for the minimum sum of instance weights needed in a child to continue to split.

gamma
A number for the minimum loss reduction required to make a further partition on a leaf node of the tree.

subsample
Subsampling proportion of rows. By default, all of the training data are used.

validation
The proportion of the data that are used for performance assessment and potential early stopping.

early_stop
An integer or NULL. If an integer, the number of training iterations without improvement before stopping.

objective
A single string (or NULL) that defines the loss function that xgboost uses when creating the model.

counts
A logical. If FALSE, colsample_bynode and colsample_bytree are interpreted as proportions of columns rather than counts.

event_level
For binary classification, a single string, either "first" or "second", specifying which level of the outcome is considered the event.

...
Other options to pass to xgb.train().
Value

A fitted xgboost object.
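As a brief illustration, the call below sketches a typical use of xgb_train() on a small classification problem. The data set and column choices are illustrative only, and the call assumes xgb_train() is available from the parsnip namespace with the arguments described above.

```r
library(parsnip)

# Hypothetical toy problem: numeric predictors from mtcars,
# with the transmission type 'am' recoded as a binary factor outcome.
x <- as.matrix(mtcars[, c("mpg", "disp", "hp", "wt")])
y <- factor(mtcars$am, labels = c("auto", "manual"))

# Fit a small boosted tree model; arguments mirror the
# descriptions above (tree depth, iterations, learning rate, row subsampling).
fit <- parsnip::xgb_train(
  x, y,
  max_depth = 3,    # maximum depth of each tree
  nrounds   = 50,   # number of boosting iterations
  eta       = 0.1,  # learning rate
  subsample = 0.8   # subsampling proportion of rows
)
```

The returned value is the fitted xgboost object described in the Value section, so downstream xgboost tooling (e.g. prediction) can be applied to it directly.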