Description

Training xgboost models.
Arguments

df_train
  Training data. Should be a data.table object.

target_name
  Target variable. Should be a string naming a variable present in the training data.

var
  Variables used to train the model. Should be a character vector. Default is NULL, indicating that all variables available from f_indicators will be used.

nrounds
  Maximum number of boosting iterations. Default is 32.

max.depth
  Maximum depth of the trees, i.e. the number of splits made.

eta
  Step size of each boosting step. A large eta may lead to unstable results. Default is 1.

min_child_weight
  Minimum number of instances required in a child node. The algorithm stops splitting if a split would lead to a leaf node with fewer than min_child_weight instances.

early_stopping_rounds
  Stopping criterion. If performance has not improved after k (= early_stopping_rounds) iterations, training stops.
subsample
  Proportion of the training instances sampled for each boosting round. Should be a number between 0 and 1.
colsample_bytree
  Proportion of columns sampled when constructing each tree. Should be a number between 0 and 1.

gamma
  Minimum loss reduction required to make a further partition on a leaf node of the tree. The larger gamma is, the more conservative the algorithm will be.

...
  Additional parameters passed to xgboost.
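Since the Usage listing is not reproduced here, the following sketch only illustrates roughly how these arguments correspond to the xgboost package's own interface by calling xgboost::xgb.train directly. The example data, the binary objective, and the specific parameter values are assumptions for illustration, not part of this package's documented API.

library(data.table)
library(xgboost)

# Made-up training data standing in for df_train; column y is the target.
df_train    <- data.table(x1 = rnorm(200), x2 = rnorm(200), y = rbinom(200, 1, 0.5))
target_name <- "y"
var         <- setdiff(names(df_train), target_name)  # stand-in for the var argument

# xgboost works on numeric matrices / xgb.DMatrix objects rather than data.tables.
dtrain <- xgb.DMatrix(data  = as.matrix(df_train[, ..var]),
                      label = df_train[[target_name]])

params <- list(
  max_depth        = 4,    # counterpart of max.depth above
  eta              = 1,
  min_child_weight = 1,
  subsample        = 0.8,
  colsample_bytree = 0.8,
  gamma            = 0,
  objective        = "binary:logistic"  # assumed objective for this illustration
)

model <- xgb.train(
  params                = params,
  data                  = dtrain,
  nrounds               = 32,
  watchlist             = list(train = dtrain),  # an evaluation set is needed for early stopping
  early_stopping_rounds = 8,
  verbose               = 0
)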
Value

An xgboost model object.
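The returned booster can be used like any other xgboost model. Continuing the hypothetical sketch above, predictions are obtained with xgboost's predict method on data prepared in the same way as the training matrix:

# Score rows with the fitted booster; new data must contain the same predictor columns.
pred <- predict(model, as.matrix(df_train[, ..var]))
head(pred)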