```{r}
#| child: aaa.Rmd
#| include: false
```

`r descr_models("boost_tree", "C5.0")`

## Tuning Parameters

```{r}
#| label: C5.0-param-info
#| echo: false
defaults <- 
  tibble::tibble(parsnip = c("trees", "min_n", "sample_size"),
                 default = c("15L", "2L", "1.0"))

param <-
  boost_tree() |> 
  set_engine("C5.0") |> 
  set_mode("classification") |> 
  make_parameter_list(defaults)
```

This model has `r nrow(param)` tuning parameters:

```{r}
#| label: C5.0-param-list
#| echo: false
#| results: asis
param$item
```

The implementation of C5.0 limits the number of trees to be between 1 and 100.
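
As a sketch of how that constraint can be respected when tuning (this example is not part of the generated output above; it assumes the dials package and narrows the default range for `trees`):

```{r}
#| eval: false
library(dials)

# Mark parameters for tuning
c5_spec <- 
  boost_tree(trees = tune(), min_n = tune()) |> 
  set_engine("C5.0") |> 
  set_mode("classification")

# Keep the number of boosting iterations within C5.0's supported range (1-100)
c5_params <- 
  extract_parameter_set_dials(c5_spec) |> 
  update(trees = trees(range = c(1L, 100L)))
```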

## Translation from parsnip to the original package (classification)

```{r}
#| label: C5.0-cls
boost_tree(trees = integer(), min_n = integer(), sample_size = numeric()) |> 
  set_engine("C5.0") |> 
  set_mode("classification") |> 
  translate()
```

[C5.0_train()] is a wrapper around [C50::C5.0()] that makes it easier to run this model.
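
As a minimal fitting sketch (assuming the C50 package is installed; the built-in `iris` data is used only for illustration):

```{r}
#| eval: false
# Fitting dispatches to C5.0_train(), which calls C50::C5.0()
c5_fit <- 
  boost_tree(trees = 50) |> 
  set_engine("C5.0") |> 
  set_mode("classification") |> 
  fit(Species ~ ., data = iris)

predict(c5_fit, new_data = iris[1:5, ], type = "class")
```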

## Preprocessing requirements

```{r}
#| child: template-tree-split-factors.Rmd
```

## Case weights

```{r}
#| child: template-uses-case-weights.Rmd
```

## Saving fitted model objects

```{r}
#| child: template-butcher.Rmd
```

## Other details

### Early stopping

By default, early stopping is used. To use the complete set of boosting iterations, pass `earlyStopping = FALSE` to [set_engine()]. Also, it is unlikely that early stopping will occur if `sample_size = 1`.
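
For example (a sketch; the engine argument is simply forwarded through `set_engine()` as described above):

```{r}
#| eval: false
# Use all boosting iterations rather than stopping early
boost_tree(trees = 100) |> 
  set_engine("C5.0", earlyStopping = FALSE) |> 
  set_mode("classification")
```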

## Examples

The "Fitting and Predicting with parsnip" article contains examples for boost_tree() with the "C5.0" engine.

