View source: R/irb.train_aft.R
| irb.train_aft | R Documentation |
Fit an accelerated failure time (AFT) model with the iteratively reweighted convex optimization (IRCO) algorithm, which minimizes robust loss functions in the concave-convex (CC) family. The convex optimization is carried out by the functional gradient descent boosting algorithm in the R package xgboost. The iteratively reweighted boosting (IRBoost) algorithm downweights observations that incur a large loss; the resulting weights also help identify outliers. For time-to-event data, an AFT model provides an alternative to the commonly used proportional hazards models. Note that irb.train_aft was developed to accommodate the data input format used by function xgb.train with objective="survival:aft" in package xgboost; for other objective functions, the input format currently follows function xgboost instead.
irb.train_aft(
params = list(),
data,
z_init = NULL,
cfun = "ccave",
s = 1,
delta = 0.1,
iter = 10,
nrounds = 100,
del = 1e-10,
trace = FALSE,
...
)
| params | the list of parameters used in xgb.train |
| data | training dataset |
| z_init | vector of length nobs with initial convex component values; must be non-negative, defaulting to the weights if provided, otherwise a vector of 1s |
| cfun | concave component of the CC-family; the default is "ccave" (the Examples below use "hcave") |
| s | tuning parameter of cfun |
| delta | a small positive number, supplied by the user only if the chosen cfun requires it |
| iter | number of iterations in the IRCO algorithm |
| nrounds | boosting iterations in xgb.train |
| del | convergence criterion in the IRCO algorithm; unrelated to delta |
| trace | if TRUE, fitting progress is reported |
| ... | other arguments passed to xgb.train |
An object of class xgb.Booster with additional elements:
weight_update_log a matrix with nobs rows and iter columns containing the observation weights at each iteration of the IRCO algorithm
weight_update a vector of observation weights in the last IRCO iteration, which produces the final model fit
loss_log the sum of the loss values of the composite function cfun(survival_aft_distribution) at each IRCO iteration
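As a sketch of how these elements might be used (assuming a fitted object named fit returned by irb.train_aft; the cutoff of 3 observations is arbitrary, for illustration only):

```r
# 'fit' is assumed to be an object returned by irb.train_aft.
# Observations receiving the smallest weights in the final IRCO
# iteration were downweighted most by the robust loss and are
# candidate outliers.
outliers <- order(fit$weight_update)[1:3]
# Weight trajectory of those observations across IRCO iterations:
fit$weight_update_log[outliers, ]
# The loss should be non-increasing across IRCO iterations:
plot(fit$loss_log, type = "b",
     xlab = "IRCO iteration", ylab = "loss")
```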
Zhu Wang
Maintainer: Zhu Wang zwang145@uthsc.edu
Wang, Zhu (2024), "Unified Robust Boosting", Journal of Data Science, 1-19, DOI: 10.6339/24-JDS1138
irboost
library("xgboost")
X <- matrix(1:5, ncol=1)
# Associate ranged labels with the data matrix.
# This example shows each kind of censored labels.
# uncensored right left interval
y_lower = c(10, 15, -Inf, 30, 100)
y_upper = c(Inf, Inf, 20, 50, Inf)
dtrain <- xgb.DMatrix(
data = X,
label_lower_bound = y_lower,
label_upper_bound = y_upper,
nthread = 1
)
params <- list(
objective = "survival:aft",
nthread = 1,
aft_loss_distribution = "normal",
aft_loss_distribution_scale = 1,
max_depth = 3,
min_child_weight = 0
)
watchlist <- list(train = dtrain)
bst <- xgb.train(params, data=dtrain, nrounds=15, watchlist=watchlist)
predict(bst, dtrain)
bst_cc <- irb.train_aft(params, data=dtrain, nrounds=15, watchlist=watchlist, cfun="hcave",
s=1.5, trace=TRUE, verbose=0)
bst_cc$weight_update
predict(bst_cc, dtrain)
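Continuing the example, the returned weights can be inspected directly (the 0.5 threshold below is an arbitrary illustration, not a package default):

```r
# Rows of weight_update_log trace how each observation's weight
# evolved over the IRCO iterations of bst_cc: nobs rows, iter columns.
dim(bst_cc$weight_update_log)
# Observations whose final weight falls well below 1 were downweighted
# by the robust hcave loss and are worth examining as potential outliers.
which(bst_cc$weight_update < 0.5)
```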