boost: Boost an 'rtemis' learner for regression

View source: R/boost.R

Boost an rtemis learner for regression

Description

Train an ensemble by boosting any rtemis learner.

Usage

boost(
  x,
  y = NULL,
  x.valid = NULL,
  y.valid = NULL,
  x.test = NULL,
  y.test = NULL,
  mod = "cart",
  resid = NULL,
  boost.obj = NULL,
  mod.params = list(),
  case.p = 1,
  weights = NULL,
  learning.rate = 0.1,
  earlystop.params = setup.earlystop(window = 30, window_decrease_pct_min = 0.01),
  earlystop.using = "train",
  tolerance = 0,
  tolerance.valid = 1e-05,
  max.iter = 10,
  init = NULL,
  x.name = NULL,
  y.name = NULL,
  question = NULL,
  base.verbose = FALSE,
  verbose = TRUE,
  trace = 0,
  print.progress.every = 5,
  print.error.plot = "final",
  prefix = NULL,
  plot.theme = rtTheme,
  plot.fitted = NULL,
  plot.predicted = NULL,
  print.plot = FALSE,
  print.base.plot = FALSE,
  plot.type = "l",
  outdir = NULL,
  ...
)

Arguments

x

Numeric vector or matrix / data frame of features, i.e. independent variables

y

Numeric vector of outcome, i.e. dependent variable

x.valid

Data.frame; optional: Validation data

y.valid

Float, vector; optional: Validation outcome

x.test

Numeric vector or matrix / data frame of testing set features. Columns must correspond to columns in x

y.test

Numeric vector of testing set outcome

mod

Character: Algorithm used to train base learners; for options, see select_learn. Default = "cart"

resid

Float, vector, length = length(y): Residuals to work on. Do not change unless you know what you're doing. Default = NULL, for regular boosting

boost.obj

(Internal use)

mod.params

Named list of arguments passed to mod; see the sketch under Details

case.p

Float (0, 1]: Train each iteration using this proportion of cases. Default = 1, i.e. use all cases

weights

Numeric vector: Case weights. For classification, weights takes precedence over ifw: if weights are provided, ifw is not used, so leave weights NULL if setting ifw = TRUE.

learning.rate

Float (0, 1]: Learning rate for the additive steps

earlystop.params

List of early stopping parameters. Set using setup.earlystop; see the sketch under Details

earlystop.using

Character: "train" or "valid". For the latter, requires x.valid

tolerance

Float: If training error <= this value, training stops

tolerance.valid

Float: If validation error <= this value, training stops

max.iter

Integer: Maximum number of iterations (additive steps) to perform. Default = 10

init

Float: Initial value for prediction. Default = mean(y)

x.name

Character: Name for feature set

y.name

Character: Name for outcome

question

Character: The question you are attempting to answer with this model, in plain language.

base.verbose

Logical: verbose argument passed to learner

verbose

Logical: If TRUE, print summary to screen.

trace

Integer: If > 0, print diagnostic info to console

print.progress.every

Integer: Print progress every this many iterations

print.error.plot

Character or Integer: "final" plots training and validation (if available) error curves at the end of training. If an integer, plot the error curves every that many iterations during training. Use "none" for no plot.

prefix

Internal

plot.theme

Character: "zero", "dark", "box", "darkbox"

plot.fitted

Logical: if TRUE, plot True (y) vs Fitted

plot.predicted

Logical: if TRUE, plot True (y.test) vs Predicted. Requires x.test and y.test

print.plot

Logical: if TRUE, produce plot using mplot3. Takes precedence over plot.fitted and plot.predicted.

print.base.plot

Logical: Passed to print.plot argument of base learner, i.e. if TRUE, print error plot for each base learner

plot.type

Character: "l" or "p". Plot using lines or points.

outdir

Character: Path to output directory. If defined, the Predicted vs. True plot (if available) is saved there, as well as the full model output if save.mod is TRUE

...

Additional parameters passed to the learner defined by mod

Details

If learning.rate is set to 0, a nullmod will be created.
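
As a hedged sketch (not part of the package documentation), the call below combines custom base-learner parameters with validation-based early stopping. x.train, y.train, x.valid, and y.valid are hypothetical data objects, and the maxdepth entry in mod.params is an assumed argument of the CART base learner; all other argument names come from the Usage section above.

mod.boost <- boost(
  x = x.train, y = y.train,
  x.valid = x.valid, y.valid = y.valid,
  mod = "cart",
  mod.params = list(maxdepth = 3), # assumed base-learner argument; check the learner's own docs
  learning.rate = 0.1,
  max.iter = 100,
  earlystop.using = "valid",
  earlystop.params = setup.earlystop(window = 30, window_decrease_pct_min = 0.01)
)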

Author(s)

E.D. Gennatas
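
Examples

A minimal sketch, not taken from the package documentation: it simulates a small regression dataset and boosts CART base learners, using only arguments shown under Usage; the data and settings are for illustration only.

library(rtemis)

## Simulated regression data (illustration only)
set.seed(2024)
x <- data.frame(x1 = rnorm(300))
y <- x$x1^3 + rnorm(300)

## Boost CART base learners with the default learning rate
mod.boost <- boost(x, y, mod = "cart", max.iter = 50)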

