Definition of options such as bounds on the Hessian, convergence criteria, and output management for the Group Lasso algorithm.
save.x
    A logical indicating whether the design matrix should be saved.

update.hess
    Should the Hessian be updated in each iteration ("always")? update.hess = "lambda" will update the Hessian once for each component of the penalty parameter "lambda", based on the parameter estimates corresponding to the previous value of the penalty parameter.

update.every
    Only used if update.hess = "lambda". E.g. set to 3 if you want to update the Hessian only at every third grid point.

inner.loops
    How many loops should be done (at maximum) when solving only the active set (without considering the remaining predictors). Useful if the number of predictors is large. Set to 0 if no inner loops should be performed.

line.search
    Should line searches be performed?

max.iter
    Maximal number of loops through all groups.

tol
    Convergence tolerance; the smaller, the more precise. See the details below.

lower
    Lower bound for the diagonal approximation of the corresponding block submatrix of the Hessian of the negative log-likelihood function.

upper
    Upper bound for the diagonal approximation of the corresponding block submatrix of the Hessian of the negative log-likelihood function.

beta
    Scaling factor β < 1 of the Armijo line search.

sigma
    0 < σ < 1 used in the Armijo line search.
trace
    Integer controlling the amount of output printed during fitting.
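The interplay of lower, upper, beta and sigma can be illustrated with a small standalone sketch (Python rather than the package's R code; the function and variable names here are hypothetical, not part of the package): the diagonal Hessian approximation is clipped to [lower, upper] to form a well-behaved scaled descent direction, and an Armijo backtracking line search shrinks the step by the factor beta until the sufficient-decrease condition with constant sigma holds.

```python
import numpy as np

def armijo_step(f, grad, x, direction, beta=0.5, sigma=0.1, max_halvings=30):
    """Backtracking (Armijo) line search: shrink the step by `beta` until
    f(x + t*d) <= f(x) + sigma * t * <grad f(x), d> (sufficient decrease)."""
    fx = f(x)
    slope = grad(x) @ direction   # directional derivative; < 0 for a descent direction
    t = 1.0
    for _ in range(max_halvings):
        if f(x + t * direction) <= fx + sigma * t * slope:
            break
        t *= beta
    return t

# Toy smooth objective: f(x) = 0.5 * x' H x with an ill-conditioned diagonal Hessian.
H = np.diag([1e-8, 1.0, 1e6])
f = lambda x: 0.5 * x @ (H @ x)
grad = lambda x: H @ x

# Clip the diagonal Hessian approximation to [lower, upper], as the
# `lower`/`upper` options do, so the preconditioned direction stays bounded.
lower, upper = 1e-2, 1e2
h = np.clip(np.diag(H), lower, upper)

x = np.array([1.0, 1.0, 1.0])
d = -grad(x) / h                  # diagonally preconditioned descent direction
t = armijo_step(f, grad, x, d, beta=0.5, sigma=0.1)
x_new = x + t * d
assert f(x_new) < f(x)            # the accepted step decreases the objective
```

Without the clipping, the tiny and huge curvature entries would make the raw Newton-like step either stall or overshoot; bounding the diagonal keeps the line search effective with few halvings.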
For the convergence criteria, see Chapter 8.2.3.2 of Gill et al. (1981).
An object of class "lassoControl".
Philip E. Gill, Walter Murray and Margaret H. Wright (1981) Practical Optimization, Academic Press.
Dimitri P. Bertsekas (2003) Nonlinear Programming, Athena Scientific.