Description

Fit lasso models and select the penalty parameter by estimating the respective prediction error via (repeated) K-fold cross-validation, (repeated) random splitting (also known as random subsampling or Monte Carlo cross-validation), or the bootstrap.
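As a conceptual illustration of the random-splitting (Monte Carlo cross-validation) idea, the following base-R sketch estimates prediction error by repeatedly leaving out m observations. Here `lm()` stands in for the lasso fit, and all names are illustrative, not part of this package's API.

```r
## Conceptual sketch of repeated random splitting (Monte Carlo CV).
## lm() stands in for the lasso fit; names here are illustrative only.
set.seed(1)
n <- 50
x <- rnorm(n)
y <- 2 * x + rnorm(n)
R <- 10  # number of random splits
m <- 10  # observations left out per split
errors <- replicate(R, {
  test <- sample(n, m)                            # random hold-out set
  fit <- lm(y ~ x, subset = -test)                # fit on the remaining data
  pred <- predict(fit, data.frame(x = x[test]))   # predict the hold-out set
  sqrt(mean((y[test] - pred)^2))                  # RMSPE on the hold-out set
})
mean(errors)  # estimated prediction error
```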
Usage

lasso(
  x,
  y,
  lambda = seq(1, 0, length.out = 50),
  mode = c("fraction", "lambda"),
  standardize = TRUE,
  intercept = TRUE,
  splits = foldControl(),
  cost = rmspe,
  selectBest = c("hastie", "min"),
  seFactor = 1,
  ncores = 1,
  cl = NULL,
  seed = NULL,
  ...
)

lasso.fit(
  x,
  y,
  lambda = seq(1, 0, length.out = 50),
  mode = c("fraction", "lambda"),
  standardize = TRUE,
  intercept = TRUE,
  ...
)
Arguments

x: a numeric matrix containing the predictor variables.

y: a numeric vector containing the response variable.

lambda: a numeric vector of non-negative values for the penalty parameter (the default is seq(1, 0, length.out = 50)); how these values are interpreted depends on mode.

mode: a character string specifying the type of penalty parameter. If "fraction" (the default), the values in lambda are interpreted as fractions of the smallest penalty parameter for which all coefficients are zero; if "lambda", they are used directly as penalty parameters.

standardize: a logical indicating whether the predictor variables should be standardized to have unit variance (the default is TRUE).

intercept: a logical indicating whether a constant term should be included in the model (the default is TRUE).

splits: an object giving the data splits to be used for prediction error estimation (the default is foldControl(), i.e., K-fold cross-validation).

cost: a cost function measuring prediction loss (the default is rmspe, the root mean squared prediction error).

selectBest, seFactor: arguments specifying the criterion for selecting the best model. "min" selects the model with the smallest prediction error; "hastie" (the default) additionally allows a tolerance of seFactor standard errors and selects a more parsimonious model within that tolerance.

ncores, cl: arguments for parallel computing (the number of processor cores to use, or an optional pre-set cluster).

seed: optional initial seed for the random number generator.

...: additional arguments; for lasso, these are passed down to lasso.fit.
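The idea behind the "hastie" selection criterion can be sketched in a few lines of base R. The prediction errors and standard errors below are made-up numbers for illustration, and this is only a sketch of the rule, not the package's implementation.

```r
## Sketch of a one-standard-error-type rule (selectBest = "hastie"):
## among models within seFactor standard errors of the minimum
## prediction error, prefer the most parsimonious one.
pe <- c(1.9, 1.5, 1.4, 1.45, 1.6)  # hypothetical prediction errors per lambda
se <- rep(0.1, 5)                  # hypothetical standard errors
seFactor <- 1
cutoff <- min(pe) + seFactor * se[which.min(pe)]
## with the candidate models ordered from largest to smallest penalty,
## the first model below the cutoff is the most parsimonious admissible one
chosen <- which(pe <= cutoff)[1]
chosen  # here: model 2 rather than the minimum at model 3
```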
Value

For lasso, an object of class "perryTuning" (see perryTuning). It contains information on the prediction error criterion and includes the final model with the optimal tuning parameter as component finalModel.

For lasso.fit, an object of class lasso with the following components:

lambda: numeric; the value of the penalty parameter.

coefficients: a numeric vector containing the coefficient estimates.

fitted.values: a numeric vector containing the fitted values.

residuals: a numeric vector containing the residuals.

standardize: a logical indicating whether the predictor variables were standardized to have unit variance.

intercept: a logical indicating whether the model includes a constant term.

muX: a numeric vector containing the means of the predictors.

sigmaX: a numeric vector containing the standard deviations of the predictors.

mu: numeric; the mean of the response.

call: the matched function call.
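A small sketch of fitting for a single penalty value and inspecting the components listed above. The simulated data are illustrative; the sketch assumes the perryExamples package (which provides lasso.fit) is available.

```r
## Fit the lasso for one penalty value and inspect the result
## (illustrative data; assumes the perryExamples package is available).
library("perryExamples")
set.seed(1)
x <- matrix(rnorm(100 * 5), 100, 5)
y <- x[, 1] - 0.5 * x[, 2] + rnorm(100)
fit <- lasso.fit(x, y, lambda = 0.1)  # with default mode = "fraction"
fit$lambda           # penalty parameter used
fit$coefficients     # coefficient estimates
head(fit$residuals)  # first few residuals
```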
Author(s)

Andreas Alfons
References

Tibshirani, R. (1996) Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B, 58(1), 267–288.
Examples

## load package and data
library("perryExamples")
data("Bundesliga")
Bundesliga <- Bundesliga[, -(1:2)]
f <- log(MarketValue) ~ Age + I(Age^2) + .
mf <- model.frame(f, data = Bundesliga)
x <- model.matrix(terms(mf), mf)[, -1]
y <- model.response(mf)

## set up repeated random splits
splits <- splitControl(m = 40, R = 10)

## select optimal penalty parameter
fit <- lasso(x, y, splits = splits, seed = 2014)
fit

## plot prediction error results
plot(fit, method = "line")
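The tuned object can also be inspected through the components documented in the Value section. The following self-contained sketch uses simulated data instead of the Bundesliga example; list-style access via $ to the finalModel component is an assumption based on the structure described above.

```r
## Tune on simulated data, then inspect the final model (a sketch;
## assumes the perryExamples package is available; $-access to
## finalModel follows the component structure documented above).
library("perryExamples")
set.seed(1)
x <- matrix(rnorm(100 * 5), 100, 5)
y <- x[, 1] - 0.5 * x[, 2] + rnorm(100)
fit <- lasso(x, y, splits = splitControl(m = 20, R = 5), seed = 2014)
fit$finalModel$lambda        # optimal penalty parameter
fit$finalModel$coefficients  # coefficients of the final model
```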