Description

This function estimates the vector of regression coefficients under a sparsity constraint, by square-root Lasso. That is, it returns the beta that minimizes

|Y - X*beta|_2 + lambda * |beta|_1.
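For concreteness, the criterion above can be written out directly in R. The following sketch is an illustration only, not the package's internal code, and the helper name sqrt_lasso_objective is made up for this example:

```r
## Evaluate the square-root Lasso objective ||Y - X*beta||_2 + lambda*||beta||_1
## (illustration only; sqrt_lasso_objective is a hypothetical helper,
## not part of the package)
sqrt_lasso_objective <- function(beta, X, Y, lambda) {
  sqrt(sum((Y - X %*% beta)^2)) + lambda * sum(abs(beta))
}

## small example: with beta = 0 the objective reduces to ||Y||_2
X <- matrix(c(1, 0, 2, 2, 1, 0, -1, 1, 1, 2, 0, 1), 4, 3, byrow = TRUE)
Y <- c(1, 0, 2, 1)
sqrt_lasso_objective(c(0, 0, 0), X, Y, lambda = 1)  # sqrt(6)
```

This is the quantity that sqR_Lasso minimizes over beta.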
Usage

sqR_Lasso(X, Y, lambda, solver = 'CD', sto = '0')
Arguments

X: The matrix of explanatory variables (must be a double-precision matrix).

Y: The response variable.

lambda: The penalization parameter.

solver: A string indicating the solver to use. The default is 'CD' (coordinate descent).

sto: Indicates whether a randomized algorithm (stochastic coordinate descent) has to be used when the coordinate descent method is chosen. By default this parameter is set to '0', meaning that the coordinates are updated in the order in which the corresponding variables appear in X. With '2', all coordinates are updated at each pass, but in a uniformly random order. With '1' (experimental), a single coordinate, chosen uniformly at random, is updated at each iteration.
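The three coordinate orderings selected by sto can be sketched in a few lines of R. This is an illustration of the update orders only, not the package's internal implementation:

```r
p <- 3  # number of coordinates (columns of X)

## sto = '0': deterministic order, as the variables appear in X
order_0 <- seq_len(p)

## sto = '2': every coordinate updated once per pass, in a
## uniformly random order
order_2 <- sample(p)

## sto = '1' (experimental): a single coordinate drawn uniformly
## at random for this iteration
order_1 <- sample(p, 1)
```

Each vector lists the coordinate indices that one pass (or one iteration, for '1') of coordinate descent would visit.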
Details

This method can use the Mosek solver, the Gurobi solver or (by default) the SCS solver.
Value

The coefficient vector.
Author(s)

Arnak Dalalyan and Samuel Balmand.
Examples

## set the design matrix
X <- matrix(c(1,0,2,2,1,0,-1,1,1,2,0,1),4,3,byrow=TRUE)
## set the vector of observations
Y <- c(1,0,2,1)
## set the penalty level
lambda <- 1
## compute the square-root Lasso estimate using SCS
## get beta, the vector of the coefficients of regression
sqR_Lasso(X, Y, lambda, solver="SCS")