HDlars: Lars algorithm


Description

Performs the LARS algorithm for solving the lasso problem: a linear regression problem with an l1-penalty on the estimated coefficients.

Usage

HDlars(X, y, maxSteps = 3 * min(dim(X)), intercept = TRUE,
  eps = .Machine$double.eps^0.5)

Arguments

X

the matrix (of size n*p) of the covariates.

y

a vector of length n with the response.

maxSteps

Maximal number of steps of the LARS algorithm.

intercept

If TRUE, add an intercept to the model.

eps

Tolerance of the algorithm.

Details

The l1 penalty performs variable selection via shrinkage of the estimated coefficients. It depends on a penalty parameter, called lambda, controlling the amount of regularization. The objective function of the lasso is:

||y - Xβ||_2^2 + λ||β||_1
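To make the criterion concrete, here is a minimal base-R sketch of the lasso objective above (an illustration only, not the package's internal code; `lasso_objective` is a hypothetical helper name):

```r
# Evaluate the lasso objective ||y - X b||_2^2 + lambda * ||b||_1
# for a candidate coefficient vector b (illustrative sketch).
lasso_objective <- function(b, X, y, lambda) {
  residual <- y - X %*% b
  sum(residual^2) + lambda * sum(abs(b))
}

set.seed(1)
n <- 20; p <- 5
X <- matrix(rnorm(n * p), n, p)
beta <- c(1, -2, 0, 0, 0.5)
y <- X %*% beta + rnorm(n)

# A sparser candidate pays a smaller penalty but may fit the data worse:
lasso_objective(beta, X, y, lambda = 1)
lasso_objective(rep(0, p), X, y, lambda = 1)
```

Increasing lambda tilts this trade-off toward sparser coefficient vectors, which is how the penalty performs variable selection.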

Value

An object of type LarsPath.

Author(s)

Quentin Grimonprez

References

Efron, B., Hastie, T., Johnstone, I. and Tibshirani, R. (2004). "Least Angle Regression" (with discussion). Annals of Statistics, 32(2), 407-499.

See Also

LarsPath HDcvlars listToMatrix

Examples

# simulate a high-dimensional dataset (50 observations, 10000 covariates)
dataset <- simul(50, 10000, 0.4, 10, 50, matrix(c(0.1, 0.8, 0.02, 0.02), nrow = 2))
result <- HDlars(dataset$data, dataset$response)
# obtain the estimated coefficients in matrix format
coefficients <- listToMatrix(result)

HDPenReg documentation built on May 2, 2019, 6:09 p.m.