HDlars | R Documentation
Description

Performs the LARS algorithm for solving the lasso problem: a linear regression problem with an l1-penalty on the estimated coefficients.
Usage

HDlars(
  X,
  y,
  maxSteps = 3 * min(dim(X)),
  intercept = TRUE,
  eps = .Machine$double.eps^0.5
)
Arguments

X          The matrix (of size n x p) of covariates.

y          A numeric vector of length n containing the response.

maxSteps   Maximal number of steps for the LARS algorithm.

intercept  If TRUE, an intercept is added to the model.

eps        Numerical tolerance of the algorithm.
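The default for maxSteps ties the step budget to the smaller dimension of X. A minimal base-R check of that default (the dimensions below are only illustrative, not required by HDlars):

```r
# Toy design matrix with n = 50 observations and p = 10000 covariates.
X <- matrix(0, nrow = 50, ncol = 10000)

# Default step budget: three times the smaller of n and p.
maxSteps <- 3 * min(dim(X))
maxSteps  # 3 * min(50, 10000) = 150
```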
Details

The l1 penalty performs variable selection via shrinkage of the estimated coefficients. It depends on a penalty parameter called lambda, which controls the amount of regularization. The objective function of the lasso is:

||y - X\beta||_2^2 + \lambda ||\beta||_1
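The objective above can be evaluated directly in base R. The helper name lassoObjective below is hypothetical (not part of the package) and only illustrates the trade-off that lambda controls:

```r
# Hypothetical helper evaluating the lasso objective
#   ||y - X beta||_2^2 + lambda * ||beta||_1
lassoObjective <- function(X, y, beta, lambda) {
  residual <- y - X %*% beta
  sum(residual^2) + lambda * sum(abs(beta))
}

set.seed(1)
n <- 20; p <- 5
X <- matrix(rnorm(n * p), n, p)
beta <- c(1, -2, 0, 0, 0)            # sparse coefficient vector
y <- X %*% beta + rnorm(n, sd = 0.1)

# A larger lambda penalizes the l1-norm of beta more heavily:
lassoObjective(X, y, beta, lambda = 0)   # pure residual sum of squares
lassoObjective(X, y, beta, lambda = 10)  # adds 10 * (|1| + |-2|) = 30
```

Shrinking a coefficient toward zero lowers the penalty term at the cost of a larger residual term; coefficients that are not worth that cost are set exactly to zero, which is how the penalty performs variable selection.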
Value

An object of class LarsPath.
Author(s)

Quentin Grimonprez
References

Efron, Hastie, Johnstone and Tibshirani (2004) "Least Angle Regression" (with discussion), Annals of Statistics.
See Also

LarsPath, HDcvlars, listToMatrix
Examples

dataset <- simul(50, 10000, 0.4, 10, 50, matrix(c(0.1, 0.8, 0.02, 0.02), nrow = 2))
result <- HDlars(dataset$data, dataset$response)

# Obtain the estimated coefficients in matrix format
coefficient <- listToMatrix(result)