Description
Fit LAD-lasso models and select the penalty parameter by estimating the respective prediction error via (repeated) K-fold cross-validation, (repeated) random splitting (also known as random subsampling or Monte Carlo cross-validation), or the bootstrap.
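The estimation scheme is chosen through the splits argument (see below). As a sketch, assuming the control objects provided by the perry package, each of the three schemes named above corresponds to one control object:

```r
## control objects from the 'perry' package (an assumption based on
## the 'splits' argument; see perryTuning for details)
library("perry")
foldControl(K = 5, R = 10)    # (repeated) 5-fold cross-validation
splitControl(m = 40, R = 10)  # repeated random splitting, 40 test observations
bootControl(R = 10)           # bootstrap with 10 replications
```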
Arguments
x
a numeric matrix containing the predictor variables.
y
a numeric vector containing the response variable.
lambda
a numeric vector of non-negative values of the penalty parameter (for ladlasso), or a single non-negative value of the penalty parameter (for ladlasso.fit).
standardize
a logical indicating whether the predictor variables should be standardized to have unit MAD (the default is TRUE).
intercept
a logical indicating whether a constant term should be included in the model (the default is TRUE).
splits
an object giving data splits to be used for prediction error estimation (see perryTuning).
cost
a cost function measuring prediction loss (see perryTuning for some requirements).
selectBest, seFactor
arguments specifying a criterion for selecting the best model (see perryTuning).
ncores, cl
arguments for parallel computing (see perryTuning).
seed
optional initial seed for the random number generator (see .Random.seed).
...
additional arguments to be passed down (for ladlasso, e.g., to the prediction loss function cost).
Value

For ladlasso, an object of class "perryTuning" (see perryTuning). It contains information on the prediction error criterion, and includes the final model with the optimal tuning parameter as component finalModel.

For ladlasso.fit, an object of class "ladlasso" with the following components:
lambda
numeric; the value of the penalty parameter.
coefficients
a numeric vector containing the coefficient estimates.
fitted.values
a numeric vector containing the fitted values.
residuals
a numeric vector containing the residuals.
standardize
a logical indicating whether the predictor variables were standardized to have unit MAD.
intercept
a logical indicating whether the model includes a constant term.
muX
a numeric vector containing the medians of the predictors.
sigmaX
a numeric vector containing the MADs of the predictors.
muY
numeric; the median of the response.
call
the matched function call.
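The components above can be inspected directly on the returned object. A minimal sketch, assuming predictor matrix x and response y as prepared in the Examples section below:

```r
## hypothetical fit from ladlasso.fit() with a fixed penalty parameter
fitLAD <- ladlasso.fit(x, y, lambda = 10)
fitLAD$lambda         # the penalty parameter that was used
fitLAD$coefficients   # coefficient estimates
head(fitLAD$residuals)
```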
Author(s)

Andreas Alfons
References

Wang, H., Li, G. and Jiang, G. (2007) Robust regression shrinkage and consistent variable selection through the LAD-lasso. Journal of Business & Economic Statistics, 25(3), 347–355.
Examples

## load data
data("Bundesliga")
Bundesliga <- Bundesliga[, -(1:2)]
f <- log(MarketValue) ~ Age + I(Age^2) + .
mf <- model.frame(f, data=Bundesliga)
x <- model.matrix(terms(mf), mf)[, -1]
y <- model.response(mf)
## set up repeated random splits
splits <- splitControl(m = 40, R = 10)
## select optimal penalty parameter
lambda <- seq(40, 0, length.out = 20)
fit <- ladlasso(x, y, lambda = lambda, splits = splits, seed = 2014)
fit
## plot prediction error results
plot(fit, method = "line")
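To estimate the prediction error via (repeated) K-fold cross-validation instead of random splits, the same call can be made with a fold-based control object; a sketch, assuming foldControl from the perry package:

```r
## select the penalty parameter via repeated 5-fold cross-validation
folds <- foldControl(K = 5, R = 10)
fitCV <- ladlasso(x, y, lambda = lambda, splits = folds, seed = 2014)
fitCV
```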