train_lightgbm | R Documentation |
train_lightgbm is a wrapper for lightgbm tree-based models where all of the model arguments are in the main function.
train_lightgbm(
  x,
  y,
  max_depth = -1,
  num_iterations = 100,
  learning_rate = 0.1,
  feature_fraction_bynode = 1,
  min_data_in_leaf = 20,
  min_gain_to_split = 0,
  bagging_fraction = 1,
  early_stopping_round = NULL,
  validation = 0,
  counts = TRUE,
  quiet = FALSE,
  ...
)
x: A data frame or matrix of predictors.

y: A vector (factor or numeric) or matrix (numeric) of outcome data.

max_depth: An integer for the maximum depth of the tree.

num_iterations: An integer for the number of boosting iterations.

learning_rate: A numeric value between zero and one to control the learning rate.

feature_fraction_bynode: Fraction of predictors that will be randomly sampled at each split.
min_data_in_leaf: A numeric value for the minimum number of instances needed in a child to continue to split.
min_gain_to_split: A number for the minimum loss reduction required to make a further partition on a leaf node of the tree.
bagging_fraction: Subsampling proportion of rows. Setting this argument to a non-default value will also set bagging_freq = 1.
early_stopping_round: The number of iterations without an improvement in the objective function that can occur before training is halted.
validation: The proportion of the training data that is used for performance assessment and potential early stopping.
counts: A logical; should feature_fraction_bynode be interpreted as an integer count of features per split rather than a proportion?
quiet: A logical; should logging by lightgbm be muted?
...: Other options to pass to lightgbm.
This is an internal function, not meant to be directly called by the user.
A fitted lightgbm.Model object.
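Because train_lightgbm() is internal, the intended route is the parsnip interface with the lightgbm engine registered by bonsai. A minimal sketch, assuming the bonsai, parsnip, and lightgbm packages are installed and a data frame df with a factor outcome column class (both hypothetical, not from this page):

    library(parsnip)
    library(bonsai)  # registers the "lightgbm" engine for boost_tree()

    # parsnip arguments map onto the train_lightgbm() arguments above,
    # e.g. trees -> num_iterations, learn_rate -> learning_rate,
    # min_n -> min_data_in_leaf.
    spec <- boost_tree(trees = 100, learn_rate = 0.1, min_n = 20) |>
      set_engine("lightgbm") |>
      set_mode("classification")

    fitted <- fit(spec, class ~ ., data = df)
    predict(fitted, new_data = df)

Engine-specific settings such as bagging_fraction or quiet can be passed through set_engine(), which forwards them via the ... argument.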