h2o_gbm_train: Wrapper for training a h2o.gbm model as part of a parsnip 'boost_tree' h2o engine

View source: R/boost_tree.R

h2o_gbm_train    R Documentation

Wrapper for training a h2o.gbm model as part of a parsnip 'boost_tree' h2o engine

Description

Wrapper for training a h2o.gbm model as part of a parsnip 'boost_tree' h2o engine

Usage

h2o_gbm_train(
  formula,
  data,
  ntrees = 50,
  max_depth = 5,
  min_rows = 10,
  learn_rate = 0.1,
  sample_rate = 1,
  col_sample_rate = 1,
  min_split_improvement = 1e-05,
  stopping_rounds = 0,
  validation = 0,
  algorithm = "h2o.gbm",
  ...
)

Arguments

formula

a model formula specifying the response and predictor terms.

data

a data.frame of training data.

ntrees

integer, the number of trees to build (default = 50).

max_depth

integer, the maximum tree depth (default = 5).

min_rows

integer, the minimum number of observations for a leaf (default = 10).

learn_rate

numeric, the learning rate (default = 0.1, range is from 0.0 to 1.0).

sample_rate

numeric, the proportion of samples to use to build each tree (default = 1.0).

col_sample_rate

numeric, the proportion of features available during each node split (default = 1.0).

min_split_improvement

numeric, the minimum relative improvement in squared error reduction required for a split to occur (default = 1e-05).

stopping_rounds

An integer specifying the number of training iterations without improvement before stopping. If 'stopping_rounds = 0' (the default), early stopping is disabled. If 'validation' is used, performance is based on the validation set; otherwise the training set is used.

validation

A positive number. If in '[0, 1)', 'validation' is the random proportion of the training data that is held out for performance assessment and potential early stopping. If 1 or greater, it is the _number_ of training set samples used for these purposes.

algorithm

Whether to use the default 'h2o.gbm' algorithm or 'h2o.xgboost' via h2o.

...

other arguments passed to the h2o engine.

Value

The evaluated h2o model call, i.e. a fitted h2o model.
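
Examples

A minimal sketch of calling the wrapper directly (not run here; it assumes a live h2o cluster started with 'h2o.init()' and that 'h2o_gbm_train' is accessible from the installed package):

```r
library(h2o)
library(h2oparsnip)

# start (or connect to) a local h2o cluster
h2o.init()

# regression example: predict mpg from the remaining mtcars columns
mod <- h2o_gbm_train(
  formula = mpg ~ .,
  data = mtcars,
  ntrees = 100,
  max_depth = 3,
  learn_rate = 0.05,
  stopping_rounds = 0  # early stopping disabled (the default)
)
```

In typical use this function is not called directly; parsnip invokes it when fitting a 'boost_tree' model specification with the "h2o" engine, e.g. 'boost_tree(trees = 100) %>% set_engine("h2o") %>% set_mode("regression")'.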


stevenpawley/h2oparsnip documentation built on June 20, 2022, 12:48 p.m.