HDlars: Lars algorithm

View source: R/lars.R

Description

HDlars runs the LARS algorithm to solve the lasso problem: a linear regression problem with an l1-penalty on the estimated coefficients.

Usage

HDlars(
  X,
  y,
  maxSteps = 3 * min(dim(X)),
  intercept = TRUE,
  eps = .Machine$double.eps^0.5
)

Arguments

X

the matrix (of size n x p) of covariates.

y

a vector of length n with the response.

maxSteps

Maximal number of steps for the LARS algorithm.

intercept

If TRUE, add an intercept to the model.

eps

Numerical tolerance of the algorithm.

Details

The l1 penalty performs variable selection by shrinking the estimated coefficients. It depends on a penalty parameter, called lambda, that controls the amount of regularization. The lasso objective function is:

||y - X\beta||_2^2 + \lambda ||\beta||_1
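
The objective above can be evaluated directly in base R. This is a minimal illustrative sketch (with hypothetical toy data, not the package's internal code); it shows how the l1 term adds lambda times the absolute sum of the coefficients to the residual sum of squares:

```r
# Evaluate the lasso objective ||y - X beta||_2^2 + lambda * ||beta||_1
# for a candidate coefficient vector beta.
lasso_objective <- function(X, y, beta, lambda) {
  residual <- y - X %*% beta
  sum(residual^2) + lambda * sum(abs(beta))
}

# Hypothetical toy data (n = 5 observations, p = 4 covariates).
set.seed(42)
X <- matrix(rnorm(20), nrow = 5, ncol = 4)
y <- rnorm(5)
beta <- c(1, 0, -0.5, 0)

# With lambda = 0 the objective is the plain residual sum of squares;
# a larger lambda adds lambda * (|1| + |-0.5|) = 1.5 * lambda on top,
# which is why increasing lambda pushes coefficients toward zero.
lasso_objective(X, y, beta, lambda = 0)
lasso_objective(X, y, beta, lambda = 2)
```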

Value

An object of type LarsPath.

Author(s)

Quentin Grimonprez

References

Efron, B., Hastie, T., Johnstone, I. and Tibshirani, R. (2004), "Least Angle Regression" (with discussion), Annals of Statistics, 32(2), 407-499.

See Also

LarsPath, HDcvlars, listToMatrix

Examples

dataset <- simul(50, 10000, 0.4, 10, 50, matrix(c(0.1, 0.8, 0.02, 0.02), nrow = 2))
result <- HDlars(dataset$data, dataset$response)
# Obtain estimated coefficient in matrix format
coefficient <- listToMatrix(result)

HDPenReg documentation built on March 31, 2023, 9:31 p.m.