train_lasso    R Documentation
Description

Do parameter tuning for the Lasso and the adaptive Lasso.

Usage
train_lasso(
x,
y,
ada = TRUE,
gamma = 1,
intercept = TRUE,
scalex = FALSE,
lambda_seq = NULL,
train_method = "timeslice",
nlambda = 100,
lambda_min_ratio = 1e-04,
k = 10,
initial_window = ceiling(nrow(x) * 0.7),
horizon = 1,
fixed_window = TRUE,
skip = 0
)
Arguments

x: Predictor matrix (an n-by-p matrix).

y: Response variable.

ada: A boolean: if TRUE (default), tune the adaptive Lasso; if FALSE, tune the Lasso.

gamma: Exponent applied to the inverse of the first-step estimate when forming the adaptive-Lasso weights. Default is 1.

intercept: A boolean: whether to include an intercept term.

scalex: A boolean: whether to standardize the design matrix.

lambda_seq: Candidate sequence of tuning parameters. If NULL, the function generates the sequence (see the sketch after this list).

train_method: One of "timeslice", "cv", "cv_random", "aic", "bic", "aicc", "hqc".
  "timeslice": time-series sample splitting as described at https://topepo.github.io/caret/data-splitting.html#time. The splits are controlled by initial_window, horizon, fixed_window and skip (see the sketch after this list). Rolling blocks: set initial_window = horizon = floor(nrow(x) / k), fixed_window = FALSE, and skip = floor(nrow(x) / k) - 1. Period-by-period rolling: set skip = 0.
  "cv": cross-validation based on block splits.
  "cv_random": cross-validation based on random splits.
  "aic", "bic", "aicc", "hqc": selection based on the corresponding information criterion.

nlambda: Number of candidate lambda values generated when lambda_seq is NULL.

lambda_min_ratio: Ratio of the smallest to the largest candidate lambda, i.e. lambda_min = lambda_min_ratio * lambda_max.

k: Number of folds for k-fold cross-validation when train_method = "cv".

initial_window: Number of consecutive observations in each initial training window ("timeslice" only).

horizon: Number of consecutive observations in each test window ("timeslice" only).

fixed_window: A boolean: if TRUE, each training window keeps a fixed length; if FALSE, the windows grow with each slice ("timeslice" only).

skip: Number of resamples to skip between successive training windows, used to thin the slices ("timeslice" only).
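The grid implied by nlambda and lambda_min_ratio is not spelled out above. A common convention (the one glmnet uses) is nlambda values spaced evenly on the log scale between lambda_max and lambda_min_ratio * lambda_max; the sketch below assumes that convention, with a made-up lambda_max, and is not necessarily how train_lasso builds the grid internally.

## Sketch of a glmnet-style lambda grid; lambda_max is made up for illustration.
nlambda          <- 100
lambda_min_ratio <- 1e-4
lambda_max       <- 1                               # hypothetical largest lambda
lambda_min       <- lambda_min_ratio * lambda_max   # lambda_min = lambda_min_ratio * lambda_max
lambda_seq       <- exp(seq(log(lambda_max), log(lambda_min), length.out = nlambda))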
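How initial_window, horizon, fixed_window and skip carve up the sample is easiest to see with caret::createTimeSlices, the helper described in the caret chapter linked above; this sketch only illustrates the splits and is not necessarily how train_lasso constructs them internally.

library(caret)

x <- matrix(rnorm(100 * 5), nrow = 100)   # toy design matrix with 100 observations
k <- 10
block <- floor(nrow(x) / k)

## Rolling blocks: initial_window = horizon = floor(nrow(x) / k),
## fixed_window = FALSE, skip = floor(nrow(x) / k) - 1
roll_block <- createTimeSlices(seq_len(nrow(x)),
                               initialWindow = block,
                               horizon       = block,
                               fixedWindow   = FALSE,
                               skip          = block - 1)
length(roll_block$train)                  # number of training windows

## Period-by-period rolling: one-step-ahead horizon, skip = 0
rolling <- createTimeSlices(seq_len(nrow(x)),
                            initialWindow = ceiling(nrow(x) * 0.7),
                            horizon       = 1,
                            fixedWindow   = TRUE,
                            skip          = 0)
length(rolling$train)                     # one split per remaining period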
Value

bestTune: the tuning parameter value(s) selected by the chosen tuning procedure.
Examples

train_lasso(x, y)
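A slightly fuller version of the example above, with x and y simulated purely for illustration:

set.seed(123)
n <- 200; p <- 20
x <- matrix(rnorm(n * p), nrow = n)        # simulated predictors
y <- x[, 1] - 0.5 * x[, 2] + rnorm(n)      # simulated response

train_lasso(x, y)                                        # adaptive Lasso (default), "timeslice" tuning
train_lasso(x, y, ada = FALSE, train_method = "bic")     # plain Lasso tuned by BIC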