mlr_learners_classif.cforest | R Documentation

A random forest based on conditional inference trees (ctree). Calls partykit::cforest() from partykit.
This Learner can be instantiated via lrn():
lrn("classif.cforest")
Task type: “classif”
Predict Types: “response”, “prob”
Feature Types: “integer”, “numeric”, “factor”, “ordered”
Required Packages: mlr3, mlr3extralearners, partykit, sandwich, coin
Id | Type | Default | Levels | Range
ntree | integer | 500 | - | [1, \infty)
replace | logical | FALSE | TRUE, FALSE | -
fraction | numeric | 0.632 | - | [0, 1]
mtry | integer | - | - | [0, \infty)
mtryratio | numeric | - | - | [0, 1]
applyfun | untyped | - | - | -
cores | integer | NULL | - | (-\infty, \infty)
trace | logical | FALSE | TRUE, FALSE | -
offset | untyped | - | - | -
cluster | untyped | - | - | -
scores | untyped | - | - | -
teststat | character | quadratic | quadratic, maximum | -
splitstat | character | quadratic | quadratic, maximum | -
splittest | logical | FALSE | TRUE, FALSE | -
testtype | character | Univariate | Bonferroni, MonteCarlo, Univariate, Teststatistic | -
nmax | untyped | - | - | -
pargs | untyped | - | - | -
alpha | numeric | 0.05 | - | [0, 1]
mincriterion | numeric | 0 | - | [0, 1]
logmincriterion | numeric | 0 | - | (-\infty, \infty)
minsplit | integer | 20 | - | [1, \infty)
minbucket | integer | 7 | - | [1, \infty)
minprob | numeric | 0.01 | - | [0, 1]
stump | logical | FALSE | TRUE, FALSE | -
lookahead | logical | FALSE | TRUE, FALSE | -
MIA | logical | FALSE | TRUE, FALSE | -
nresample | integer | 9999 | - | [1, \infty)
tol | numeric | 1.490116e-08 | - | [0, \infty)
maxsurrogate | integer | 0 | - | [0, \infty)
numsurrogate | logical | FALSE | TRUE, FALSE | -
maxdepth | integer | Inf | - | [0, \infty)
multiway | logical | FALSE | TRUE, FALSE | -
splittry | integer | 2 | - | [0, \infty)
intersplit | logical | FALSE | TRUE, FALSE | -
majority | logical | FALSE | TRUE, FALSE | -
caseweights | logical | TRUE | TRUE, FALSE | -
saveinfo | logical | FALSE | TRUE, FALSE | -
update | logical | FALSE | TRUE, FALSE | -
splitflavour | character | ctree | ctree, exhaustive | -
maxvar | integer | - | - | [1, \infty)
OOB | logical | FALSE | TRUE, FALSE | -
simplify | logical | TRUE | TRUE, FALSE | -
scale | logical | TRUE | TRUE, FALSE | -
nperm | integer | 1 | - | [0, \infty)
risk | character | loglik | loglik, misclassification | -
conditional | logical | FALSE | TRUE, FALSE | -
threshold | numeric | 0.2 | - | (-\infty, \infty)
mtry: This hyperparameter can alternatively be set via the added hyperparameter mtryratio as mtry = max(ceiling(mtryratio * n_features), 1). Note that mtry and mtryratio are mutually exclusive.
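For illustration, a minimal sketch of the two mutually exclusive ways to set the number of candidate features, assuming mlr3 and mlr3extralearners are installed and loaded:

```r
library(mlr3)
library(mlr3extralearners)

# Set mtry directly ...
learner_abs = lrn("classif.cforest", mtry = 2)

# ... or as a ratio of the number of features. For a task with 4 features,
# mtryratio = 0.5 resolves to mtry = max(ceiling(0.5 * 4), 1) = 2.
learner_rel = lrn("classif.cforest", mtryratio = 0.5)
```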
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifCForest
new()
Creates a new instance of this R6 class.
LearnerClassifCForest$new()
The importance scores are calculated using partykit::varimp().

oob_error()
The out-of-bag error, calculated using the OOB predictions from partykit.
Usage: LearnerClassifCForest$oob_error()
Returns: numeric(1).
clone()
The objects of this class are cloneable with this method.
LearnerClassifCForest$clone(deep = FALSE)
deep
Whether to make a deep clone.
Author: sumny
Hothorn T, Zeileis A (2015). “partykit: A Modular Toolkit for Recursive Partytioning in R.” Journal of Machine Learning Research, 16(118), 3905-3909. http://jmlr.org/papers/v16/hothorn15a.html.
Hothorn T, Hornik K, Zeileis A (2006). “Unbiased Recursive Partitioning: A Conditional Inference Framework.” Journal of Computational and Graphical Statistics, 15(3), 651-674. https://doi.org/10.1198/106186006x133933.
Dictionary of Learners: mlr3::mlr_learners.
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
Chapter in the mlr3book: https://mlr3book.mlr-org.com/basics.html#learners
mlr3learners for a selection of recommended learners.
mlr3cluster for unsupervised clustering learners.
mlr3pipelines to combine learners with pre- and postprocessing steps.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
library(mlr3)
library(mlr3extralearners)

# Define the task and the learner
task = tsk("iris")
learner = lrn("classif.cforest", ntree = 50)

# Split into train/test, train, and predict on the held-out rows
splits = partition(task)
learner$train(task, splits$train)
pred = learner$predict(task, splits$test)
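As a further sketch, the fitted learner can be evaluated on the held-out split and its out-of-bag error queried; this assumes mlr3 and mlr3extralearners are installed and loaded:

```r
library(mlr3)
library(mlr3extralearners)

# Request probability predictions, which this learner supports
task = tsk("iris")
learner = lrn("classif.cforest", ntree = 50, predict_type = "prob")
splits = partition(task)
learner$train(task, splits$train)
pred = learner$predict(task, splits$test)

# Classification accuracy on the test split
pred$score(msr("classif.acc"))

# Out-of-bag error computed from partykit's OOB predictions
learner$oob_error()
```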