gbm_pro    R Documentation
Trains a Gradient Boosting Machine (GBM) model with a Cox proportional hazards loss function using gbm.
gbm_pro(X, y_surv, tune = FALSE, cv.folds = 3)
X
A data frame of features.
y_surv
A survival::Surv object giving the survival outcome (time and event status).
tune
Logical, whether to perform simplified hyperparameter tuning. Defaults to FALSE.
cv.folds
Integer. The number of cross-validation folds to use. Setting this to 0 or 1 will disable cross-validation. Defaults to 3.
A list of class "train" containing the trained gbm model object, the names of the features used in training, and the model type. The returned object also includes fitted_scores (the linear predictor), y_surv, and best_iter.
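The Cox proportional hazards loss corresponds to gbm's "coxph" distribution. The sketch below shows a direct gbm::gbm fit of that kind purely as an illustration of the loss named above; the distribution, tree settings, and tuning choices are assumptions, not gbm_pro's documented internals.

# Sketch only: a Cox-loss boosted model fit directly with gbm::gbm.
# All settings below are illustrative assumptions, not the internals of gbm_pro.
set.seed(1)
df <- data.frame(
  x1 = rnorm(150),
  x2 = rnorm(150),
  time = rexp(150, rate = 0.01),
  status = sample(0:1, 150, replace = TRUE)
)
fit <- gbm::gbm(
  survival::Surv(time, status) ~ x1 + x2,
  data = df,
  distribution = "coxph",   # Cox partial likelihood loss
  n.trees = 200,
  interaction.depth = 2,
  shrinkage = 0.05,
  cv.folds = 3
)
best <- gbm::gbm.perf(fit, method = "cv", plot.it = FALSE)  # iteration chosen by CV
lp <- predict(fit, newdata = df, n.trees = best)            # log-hazard linear predictor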
# Generate some dummy survival data
set.seed(42)
n_samples <- 200
n_features <- 5
X_data <- as.data.frame(matrix(rnorm(n_samples * n_features), ncol = n_features))
Y_surv_obj <- survival::Surv(
time = runif(n_samples, 100, 1000),
event = sample(0:1, n_samples, replace = TRUE)
)
# Train the model for the example *without* cross-validation to pass R CMD check
# In real use, you might use the default cv.folds = 3
gbm_model <- gbm_pro(X_data, Y_surv_obj, cv.folds = 0)
print(gbm_model$finalModel)
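Continuing from the example above, the components listed in the Value section can be inspected directly. The concordance check at the end is an illustrative sketch using the survival package (already loaded by the example's namespaced calls), not part of gbm_pro itself.

# Boosting iteration selected during training
gbm_model$best_iter

# Linear predictor for the training data
head(gbm_model$fitted_scores)

# Rough training-set check: Harrell's C-index between the stored outcome
# and the fitted scores. Higher scores indicate higher risk, hence reverse = TRUE.
survival::concordance(gbm_model$y_surv ~ gbm_model$fitted_scores,
                      reverse = TRUE)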