mlr_learners_surv.glmnet: GLM with Elastic Net Regularization Survival Learner

mlr_learners_surv.glmnet    R Documentation

GLM with Elastic Net Regularization Survival Learner

Description

Generalized linear models with elastic net regularization. Calls glmnet::glmnet() from package glmnet.

Initial parameter values

  • family is set to "cox" and cannot be changed.

Prediction types

This learner returns three prediction types:

  1. lp: a vector containing the linear predictors (relative risk scores), where each score corresponds to a specific test observation. Calculated using glmnet::predict.coxnet().

  2. crank: same as lp.

  3. distr: a survival matrix in two dimensions, where observations are represented in rows and time points in columns. Calculated using glmnet::survfit.coxnet(). Parameters stype and ctype relate to how lp predictions are transformed into survival predictions and are described in survival::survfit.coxph(). By default the Breslow estimator is used for computing the baseline hazard.
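The three prediction types above can be inspected directly on a prediction object. A minimal sketch, assuming the mlr3proba "rats" task is available and using an arbitrary fixed lambda for illustration:

```r
library(mlr3)
library(mlr3proba)
library(mlr3extralearners)

task = tsk("rats")
learner = lrn("surv.glmnet", lambda = 0.01)  # fixed lambda, chosen arbitrarily
learner$train(task)

p = learner$predict(task)
head(p$lp)     # linear predictors (relative risk scores)
head(p$crank)  # identical to lp for this learner
p$distr        # survival distribution: observations x time points
```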

Caution: This learner differs from learners that call glmnet::cv.glmnet() in that it does not use the internal optimization of the parameter lambda. Instead, lambda must be tuned by the user (e.g., via mlr3tuning). When lambda is tuned, a glmnet model is trained for each tuning iteration. While fitting the whole path of lambdas, as glmnet::glmnet() does by default, would be more efficient, tuning/selecting the parameter at prediction time (using parameter s) is currently not supported in mlr3 (at least not in an efficient manner). Tuning the s parameter is therefore currently discouraged.
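A minimal sketch of tuning lambda with mlr3tuning, as recommended above. The search range, grid resolution, and choice of the "rats" task and concordance-index measure are illustrative assumptions, not prescriptions:

```r
library(mlr3)
library(mlr3proba)
library(mlr3extralearners)
library(mlr3tuning)

# Mark lambda for tuning on a log scale (range chosen for illustration)
learner = lrn("surv.glmnet",
  lambda = to_tune(1e-4, 1, logscale = TRUE)
)

instance = tune(
  tuner      = tnr("grid_search", resolution = 10),
  task       = tsk("rats"),
  learner    = learner,
  resampling = rsmp("cv", folds = 3),
  measure    = msr("surv.cindex")
)

instance$result  # best lambda and its cross-validated concordance index
```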

When the data are i.i.d. and efficiency is key, we recommend using the auto-tuning counterpart mlr_learners_surv.cv_glmnet(). However, this is not always applicable, typically when the data are imbalanced or not i.i.d. (longitudinal, time series) and tuning requires custom resampling strategies (blocked designs, stratification).

Dictionary

This Learner can be instantiated via lrn():

lrn("surv.glmnet")

Meta Information

  • Task type: “surv”

  • Predict Types: “crank”, “distr”, “lp”

  • Feature Types: “logical”, “integer”, “numeric”

  • Required Packages: mlr3, mlr3proba, mlr3extralearners, glmnet

Parameters

Id                Type       Default    Levels                    Range
alignment         character  lambda     lambda, fraction          -
alpha             numeric    1          -                         [0, 1]
big               numeric    9.9e+35    -                         (-Inf, Inf)
devmax            numeric    0.999      -                         [0, 1]
dfmax             integer    -          -                         [0, Inf)
eps               numeric    1e-06      -                         [0, 1]
epsnr             numeric    1e-08      -                         [0, 1]
exact             logical    FALSE      TRUE, FALSE               -
exclude           untyped    -          -                         -
exmx              numeric    250        -                         (-Inf, Inf)
fdev              numeric    1e-05      -                         [0, 1]
gamma             untyped    -          -                         -
grouped           logical    TRUE       TRUE, FALSE               -
intercept         logical    TRUE       TRUE, FALSE               -
keep              logical    FALSE      TRUE, FALSE               -
lambda            untyped    -          -                         -
lambda.min.ratio  numeric    -          -                         [0, 1]
lower.limits      untyped    -Inf       -                         -
maxit             integer    100000     -                         [1, Inf)
mnlam             integer    5          -                         [1, Inf)
mxit              integer    100        -                         [1, Inf)
mxitnr            integer    25         -                         [1, Inf)
newoffset         untyped    -          -                         -
nlambda           integer    100        -                         [1, Inf)
offset            untyped    NULL       -                         -
parallel          logical    FALSE      TRUE, FALSE               -
penalty.factor    untyped    -          -                         -
pmax              integer    -          -                         [0, Inf)
pmin              numeric    1e-09      -                         [0, 1]
prec              numeric    1e-10      -                         (-Inf, Inf)
predict.gamma     numeric    gamma.1se  -                         (-Inf, Inf)
relax             logical    FALSE      TRUE, FALSE               -
s                 numeric    0.01       -                         [0, Inf)
standardize       logical    TRUE       TRUE, FALSE               -
thresh            numeric    1e-07      -                         [0, Inf)
trace.it          integer    0          -                         [0, 1]
type.logistic     character  Newton     Newton, modified.Newton   -
type.multinomial  character  ungrouped  ungrouped, grouped        -
upper.limits      untyped    Inf        -                         -
stype             integer    2          -                         [1, 2]
ctype             integer    -          -                         [1, 2]

Super classes

mlr3::Learner -> mlr3proba::LearnerSurv -> LearnerSurvGlmnet

Methods

Public methods

Inherited methods

Method new()

Creates a new instance of this R6 class.

Usage
LearnerSurvGlmnet$new()

Method selected_features()

Returns the set of selected features as reported by glmnet::predict.glmnet() with type set to "nonzero".

Usage
LearnerSurvGlmnet$selected_features(lambda = NULL)
Arguments
lambda

(numeric(1))
Custom lambda value; defaults to the active lambda from the learner's parameter set.

Returns

(character()) of feature names.
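A minimal usage sketch, assuming the mlr3proba "rats" task is available and using illustrative lambda values:

```r
library(mlr3)
library(mlr3proba)
library(mlr3extralearners)

task = tsk("rats")
learner = lrn("surv.glmnet", lambda = 0.05)  # lambda chosen for illustration
learner$train(task)

# Features with nonzero coefficients at the learner's active lambda
learner$selected_features()

# Or query a custom lambda value on the fitted path
learner$selected_features(lambda = 0.1)
```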


Method clone()

The objects of this class are cloneable with this method.

Usage
LearnerSurvGlmnet$clone(deep = FALSE)
Arguments
deep

Whether to make a deep clone.

Author(s)

be-marc

References

Friedman J, Hastie T, Tibshirani R (2010). “Regularization Paths for Generalized Linear Models via Coordinate Descent.” Journal of Statistical Software, 33(1), 1–22. doi:10.18637/jss.v033.i01.

Examples


# Define the Learner
learner = mlr3::lrn("surv.glmnet")
print(learner)

# Define a Task
task = mlr3::tsk("grace")

# Create train and test set
ids = mlr3::partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

print(learner$model)


# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()


mlr-org/mlr3extralearners documentation built on Dec. 21, 2024, 2:21 p.m.