train_lightgbm: Boosted trees via lightgbm

View source: R/lightgbm.R

train_lightgbm R Documentation

Boosted trees via lightgbm

Description

train_lightgbm is a wrapper for lightgbm tree-based models where all of the model arguments are in the main function.
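In typical use this function is not called directly: it serves as the fit function behind the "lightgbm" engine for parsnip::boost_tree(). A minimal sketch of that route, assuming the parsnip and treesnip packages are installed and using mtcars purely as a stand-in dataset (the tuning-argument translation noted in the comment, e.g. trees to num_iterations, is the assumed engine mapping):

library(parsnip)
library(treesnip)

# boost_tree() tuning arguments are assumed to be translated to the
# lightgbm names shown in the Usage block below
# (e.g. trees -> num_iterations, learn_rate -> learning_rate)
spec <- boost_tree(trees = 50, learn_rate = 0.1) %>%
  set_engine("lightgbm") %>%
  set_mode("regression")

fitted <- fit(spec, mpg ~ ., data = mtcars)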

Usage

train_lightgbm(
  x,
  y,
  max_depth = 17,
  num_iterations = 10,
  learning_rate = 0.1,
  feature_fraction = 1,
  min_data_in_leaf = 20,
  min_gain_to_split = 0,
  bagging_fraction = 1,
  quiet = FALSE,
  ...
)
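The arguments can also be supplied directly, mirroring the signature above. A minimal sketch, assuming lightgbm is installed and again using mtcars as a stand-in dataset; the values chosen here are illustrative only:

# predictors as a numeric matrix, outcome as a numeric vector
x <- as.matrix(mtcars[, c("cyl", "disp", "hp", "wt")])
y <- mtcars$mpg

fit <- train_lightgbm(
  x, y,
  num_iterations = 50,   # number of boosting rounds
  learning_rate = 0.1,
  min_data_in_leaf = 5,  # small value because mtcars has only 32 rows
  quiet = TRUE           # silence lgb.train() logging
)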

Arguments

x

A data frame or matrix of predictors

y

A vector (factor or numeric) or matrix (numeric) of outcome data.

max_depth

An integer for the maximum depth of the tree.

num_iterations

An integer for the number of boosting iterations.

learning_rate

A numeric value between zero and one to control the learning rate.

feature_fraction

Subsampling proportion of columns.

min_data_in_leaf

An integer for the minimum number of observations needed in a leaf to continue splitting.

min_gain_to_split

A number for the minimum loss reduction required to make a further partition on a leaf node of the tree.

bagging_fraction

Subsampling proportion of rows.

quiet

A logical; should logging by lightgbm::lgb.train() be muted?

...

Other options to pass to lightgbm::lgb.train().

Value

A fitted lgb.Booster object.
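
Because the return value is the underlying lightgbm booster rather than a parsnip model object, predictions can be obtained with lightgbm's own predict method. Continuing the illustrative fit from the Usage section:

# predict() dispatches to lightgbm's method for the returned booster
preds <- predict(fit, as.matrix(mtcars[, c("cyl", "disp", "hp", "wt")]))
head(preds)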

