mlr_learners_surv.blackboost: Gradient Boosting with Regression Trees Survival Learner

mlr_learners_surv.blackboost R Documentation

Description

Gradient boosting with regression trees for survival analysis. Calls mboost::blackboost() from mboost.

Details

The distr prediction is computed with mboost::survFit().

Prediction types

This learner returns two to three prediction types:

  1. lp: a vector containing the linear predictors (relative risk scores), where each score corresponds to a specific test observation. Calculated using mboost::predict.blackboost(). If the family parameter is not "coxph", -lp is returned, since non-coxph families represent AFT-style distributions where lower lp values indicate higher risk.

  2. crank: same as lp.

  3. distr: a survival matrix in two dimensions, where observations are represented in rows and time points in columns. Calculated using mboost::survFit(). This prediction type is present only when the family distribution parameter is equal to "coxph" (default). By default the Breslow estimator is used for computing the baseline hazard.
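The prediction types above can be inspected directly on a fitted learner. A minimal sketch, assuming mlr3proba and mlr3extralearners are installed:

```r
library(mlr3)
library(mlr3proba)
library(mlr3extralearners)

task = tsk("grace")
learner = lrn("surv.blackboost")
learner$train(task)
p = learner$predict(task)

p$lp     # linear predictors (relative risk scores)
p$crank  # same values as lp
p$distr  # survival distribution; present because family = "coxph" (default)
```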

Dictionary

This Learner can be instantiated via lrn():

lrn("surv.blackboost")
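Hyperparameters can also be passed directly to lrn() at construction. A sketch with illustrative values (not tuning recommendations):

```r
library(mlr3)
library(mlr3extralearners)

# With a non-"coxph" family only lp and crank are returned, and lp is negated
learner = lrn("surv.blackboost", family = "weibull", mstop = 50)
learner$param_set$values
```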

Parameters

Id Type Default Levels Range
family character coxph coxph, weibull, loglog, lognormal, gehan, cindex, custom -
custom.family untyped - -
nuirange untyped c(0, 100) -
offset untyped - -
center logical TRUE TRUE, FALSE -
mstop integer 100 [0, Inf)
nu numeric 0.1 [0, 1]
risk character - inbag, oobag, none -
stopintern logical FALSE TRUE, FALSE -
trace logical FALSE TRUE, FALSE -
oobweights untyped - -
teststat character quadratic quadratic, maximum -
splitstat character quadratic quadratic, maximum -
splittest logical FALSE TRUE, FALSE -
testtype character Bonferroni Bonferroni, MonteCarlo, Univariate, Teststatistic -
maxpts integer 25000 [1, Inf)
abseps numeric 0.001 (-Inf, Inf)
releps numeric 0 (-Inf, Inf)
nmax untyped - -
alpha numeric 0.05 [0, 1]
mincriterion numeric 0.95 [0, 1]
logmincriterion numeric -0.05129329 (-Inf, 0]
minsplit integer 20 [0, Inf)
minbucket integer 7 [0, Inf)
minprob numeric 0.01 [0, 1]
stump logical FALSE TRUE, FALSE -
lookahead logical FALSE TRUE, FALSE -
MIA logical FALSE TRUE, FALSE -
nresample integer 9999 [1, Inf)
tol numeric 1.490116e-08 [0, Inf)
maxsurrogate integer 0 [0, Inf)
mtry integer - [0, Inf)
maxdepth integer - [0, Inf)
multiway logical FALSE TRUE, FALSE -
splittry integer 2 [1, Inf)
intersplit logical FALSE TRUE, FALSE -
majority logical FALSE TRUE, FALSE -
caseweights logical TRUE TRUE, FALSE -
sigma numeric 0.1 [0, 1]
ipcw untyped 1 -
na.action untyped stats::na.omit -
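
Parameter values can also be updated on an existing learner through its ParamSet. A sketch, assuming a paradox version that provides $set_values():

```r
learner = mlr3::lrn("surv.blackboost")
# illustrative values within the ranges listed above
learner$param_set$set_values(mstop = 200, nu = 0.05, maxdepth = 3)
learner$param_set$values[c("mstop", "nu", "maxdepth")]
```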

Super classes

mlr3::Learner -> mlr3proba::LearnerSurv -> LearnerSurvBlackBoost

Methods

Public methods

Inherited methods

Method new()

Creates a new instance of this R6 class.

Usage
LearnerSurvBlackBoost$new()

Method clone()

The objects of this class are cloneable with this method.

Usage
LearnerSurvBlackBoost$clone(deep = FALSE)
Arguments
deep

Whether to make a deep clone.

Author(s)

RaphaelS1

References

Bühlmann, Peter, Yu, Bin (2003). “Boosting with the L2 loss: regression and classification.” Journal of the American Statistical Association, 98(462), 324–339.

Examples


# Define the Learner
learner = mlr3::lrn("surv.blackboost")
print(learner)

# Define a Task
task = mlr3::tsk("grace")

# Create train and test set
ids = mlr3::partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
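A measure can also be named explicitly instead of relying on the default; for example, scoring with the concordance index (assuming mlr3proba is loaded):

```r
predictions$score(mlr3::msr("surv.cindex"))
```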


mlr-org/mlr3extralearners documentation built on Nov. 11, 2024, 11:11 a.m.