gbm_pro: Train a Gradient Boosting Machine (GBM) for Survival Data

View source: R/prognosis.R

gbm_pro    R Documentation

Train a Gradient Boosting Machine (GBM) for Survival Data

Description

Trains a Gradient Boosting Machine (GBM) model with a Cox proportional hazards loss function via the gbm package.

Usage

gbm_pro(X, y_surv, tune = FALSE, cv.folds = 3)

Arguments

X

A data frame of features.

y_surv

A survival::Surv object representing the survival outcome.

tune

Logical, whether to perform simplified hyperparameter tuning. If TRUE, n.trees, interaction.depth, and shrinkage are set to predefined values suitable for tuning; otherwise, default values are used.

cv.folds

Integer. The number of cross-validation folds to use. Setting this to 0 or 1 will disable cross-validation. Defaults to 3.

Value

A list of class "train" containing the trained gbm model object (finalModel), the names of the features used in training, and the model type. The returned object also includes fitted_scores (the linear predictor), y_surv, and best_iter (the selected number of boosting iterations).

Examples


# Generate some dummy survival data
set.seed(42)
n_samples <- 200
n_features <- 5
X_data <- as.data.frame(matrix(rnorm(n_samples * n_features), ncol = n_features))
Y_surv_obj <- survival::Surv(
  time = runif(n_samples, 100, 1000),
  event = sample(0:1, n_samples, replace = TRUE)
)

# Train the model for the example *without* cross-validation to pass R CMD check
# In real use, you might use the default cv.folds = 3
gbm_model <- gbm_pro(X_data, Y_surv_obj, cv.folds = 0)
print(gbm_model$finalModel)
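
# The fitted risk scores can be checked against the observed outcome with
# Harrell's concordance index. This sketch assumes fitted_scores is the
# linear predictor described under Value, with higher values meaning
# higher risk (hence reverse = TRUE).
cindex <- survival::concordance(Y_surv_obj ~ gbm_model$fitted_scores,
                                reverse = TRUE)
print(cindex$concordance)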


E2E documentation built on Aug. 27, 2025, 1:09 a.m.