Description Usage Arguments Details Value Author(s) References See Also Examples
Description

Starting from zero, the LARS-EN algorithm provides the entire sequence of coefficients and fits.
Usage

enet(x, y, lambda = 0, max.steps, normalize = TRUE, intercept = TRUE,
     trace = FALSE, eps = .Machine$double.eps)
Arguments

x           matrix of predictors
y           response
lambda      Quadratic penalty parameter; lambda = 0 performs the lasso fit.
max.steps   Limit the number of steps taken; the default is ...
trace       If TRUE, prints out its progress.
normalize   Standardize the predictors?
intercept   Center the predictors?
eps         An effective zero.
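To make the argument list concrete, here is a minimal sketch of a call that spells out several of these arguments; the simulated data and the particular values chosen are illustrative assumptions, not part of the original documentation.

## Illustrative sketch only: simulated data, argument values chosen arbitrarily
library(elasticnet)

set.seed(1)
x <- matrix(rnorm(50 * 10), 50, 10)   # 50 observations, 10 predictors
y <- rnorm(50)                        # response

## lasso special case (lambda = 0), at most 20 LARS-EN steps,
## with progress printed; predictors standardized and centered
fit <- enet(x, y, lambda = 0, max.steps = 20, trace = TRUE,
            normalize = TRUE, intercept = TRUE)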
Details

The elastic net methodology is described in detail in Zou and Hastie (2005). The LARS-EN algorithm computes the complete elastic net solution path simultaneously for ALL values of the shrinkage parameter, at the same computational cost as a least squares fit. The structure of enet() is based on lars(), coded by Efron and Hastie, and some internal functions from the lars package are called; the user should install lars before using the elasticnet functions.
Value

An "enet" object is returned, for which print, plot and predict methods exist.
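As a rough illustration of those methods, the sketch below queries a fitted object along its path. It assumes that predict() for "enet" objects, like predict() for "lars" objects, accepts newx, s, type, and mode arguments (e.g. mode = "fraction"); that interface is not spelled out above, so treat it as an assumption rather than a definitive reference.

## Sketch only: assumes predict.enet mirrors predict.lars (newx, s, type, mode)
library(elasticnet)

set.seed(2)
x <- matrix(rnorm(100 * 5), 100, 5)
y <- x[, 1] - 2 * x[, 2] + rnorm(100)   # sparse signal plus noise

fit <- enet(x, y, lambda = 0.5)   # a single call returns the whole path

print(fit)   # print method for "enet" objects
plot(fit)    # coefficient profiles along the path

## coefficients at half the maximal L1 norm (mode = "fraction" assumed)
predict(fit, s = 0.5, type = "coefficients", mode = "fraction")

## fitted values for new observations at the same point on the path
predict(fit, newx = x[1:5, ], s = 0.5, type = "fit", mode = "fraction")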
Author(s)

Hui Zou and Trevor Hastie
References

Zou, H. and Hastie, T. (2005) "Regularization and Variable Selection via the Elastic Net", Journal of the Royal Statistical Society, Series B, 67, 301-320.
See Also

print, plot, and predict methods for enet
Examples

data(diabetes)
attach(diabetes)

## fit the lasso model (treated as a special case of the elastic net)
object1 <- enet(x, y, lambda = 0)
plot(object1)

## fit the elastic net model with lambda = 1
object2 <- enet(x, y, lambda = 1)
plot(object2)

## early stopping after 50 LARS-EN steps
## (x2 is the augmented diabetes predictor matrix supplied by the lars package)
object4 <- enet(x2, y, lambda = 0.5, max.steps = 50)
plot(object4)

detach(diabetes)