The package implements concave 1-norm and 2-norm group penalties in linear and logistic regression. The concave 1-norm group penalty includes the 1-norm group SCAD and the 1-norm group MCP. It performs bi-level selection: with properly chosen tuning parameters it selects variables at both the group and the individual level, which makes it robust to mis-specified group information. The concave 2-norm group penalty includes the 2-norm group SCAD and the 2-norm group MCP, and selects variables at the group level only. The package can also fit the group Lasso, which arises as the special case of the concave 2-norm group penalty when the regularization parameter kappa equals zero.

For linear models, solutions for both penalties are computed with a highly efficient (block) coordinate descent algorithm (CDA). For logistic models, a highly stable and efficient (block) CDA combined with a majorization-minimization (MM) approach is used. When computing the solution surface, the solution path is traced along kappa rather than along lambda, which yields a better-behaved path. The package also provides a cross-validation-based tuning parameter selection method for both linear and logistic models.
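For reference, the two concave penalties named above are commonly defined as follows. This is a sketch in the standard notation of the MCP and SCAD literature; the package's exact kappa parameterization is not documented on this page, but if kappa is read as the reciprocal of the concavity parameter $\gamma$, then kappa $\to 0$ (i.e. $\gamma \to \infty$) recovers the Lasso penalty $\lambda|t|$, consistent with the group Lasso special case mentioned above.

```latex
% MCP (minimax concave penalty), concavity parameter \gamma > 1:
\rho_{\mathrm{MCP}}(t;\lambda,\gamma) =
\begin{cases}
  \lambda|t| - \dfrac{t^2}{2\gamma}, & |t| \le \gamma\lambda,\\[4pt]
  \dfrac{\gamma\lambda^2}{2},        & |t| > \gamma\lambda.
\end{cases}

% SCAD (smoothly clipped absolute deviation), \gamma > 2:
\rho_{\mathrm{SCAD}}(t;\lambda,\gamma) =
\begin{cases}
  \lambda|t|, & |t| \le \lambda,\\[4pt]
  \dfrac{2\gamma\lambda|t| - t^2 - \lambda^2}{2(\gamma - 1)}, & \lambda < |t| \le \gamma\lambda,\\[4pt]
  \dfrac{\lambda^2(\gamma + 1)}{2}, & |t| > \gamma\lambda.
\end{cases}
```

Both penalties equal the Lasso penalty near zero and flatten out for large $|t|$, which is what reduces the bias on large coefficients; the group versions apply these functions to the 1-norm or 2-norm of each coefficient group.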

Author | Dingfeng Jiang <dingfengjiang at gmail.com>
Date of publication | 2014-02-17 22:40:28
Maintainer | Dingfeng Jiang <dingfengjiang@gmail.com>
License | GPL (>= 2)
Version | 2.1-0
URL | http://www.r-project.org

grppenalty
grppenalty/src
grppenalty/src/Makevars
grppenalty/src/grppenalty.f
grppenalty/src/Makevars.win
grppenalty/NAMESPACE
grppenalty/R
grppenalty/R/grppenalty.R
grppenalty/MD5
grppenalty/DESCRIPTION
grppenalty/man
grppenalty/man/grppenalty.Rd
grppenalty/man/cv.plot.Rd
grppenalty/man/cv.grppenalty.Rd
grppenalty/man/path.plot.Rd