A scalable implementation of the highly adaptive lasso algorithm, including routines for constructing sparse matrices of basis functions of the observed data, as well as a custom implementation of lasso regression tailored for efficiency when the matrix of predictors is composed exclusively of indicator functions. For ease of use and increased flexibility, the lasso fitting routines invoke code from the 'glmnet' package by default. The highly adaptive lasso was first formulated and described by M. J. van der Laan (2017) <doi:10.1515/ijb-2015-0097>, with practical demonstrations of its performance given by Benkeser and van der Laan (2016) <doi:10.1109/DSAA.2016.93>.
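As a minimal usage sketch: the description above implies a two-step workflow (construct indicator basis functions from the data, then fit a lasso over them via 'glmnet'). Assuming the package is 'hal9001' and exports a `fit_hal()` function that wraps both steps (the package name and function are inferred, not stated in this page), a fit might look like:

```r
library(hal9001)  # assumed package name for the HAL implementation described above

# Simulated data: a nonlinear relationship HAL can approximate
set.seed(629)
n <- 100
x <- matrix(rnorm(n * 3), nrow = n)
y <- sin(x[, 1]) + x[, 2]^2 + rnorm(n, sd = 0.1)

# Fit the highly adaptive lasso; basis construction and the
# glmnet-backed lasso regression happen inside this call
hal_fit <- fit_hal(X = x, Y = y)

# Predictions on the training data
preds <- predict(hal_fit, new_data = x)
```

This is a sketch under the stated assumptions, not a definitive API reference; consult the package manual for the actual function signatures and options.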
|Author|Jeremy Coyle [aut, cre] (<https://orcid.org/0000-0002-9874-6649>), Nima Hejazi [aut] (<https://orcid.org/0000-0002-7127-2789>), David Benkeser [ctb] (<https://orcid.org/0000-0002-1019-8343>), Oleg Sofrygin [ctb], Weixin Cai [ctb] (<https://orcid.org/0000-0003-2680-3066>), Mark van der Laan [aut, cph, ths] (<https://orcid.org/0000-0003-1432-5511>)|
|Maintainer|Jeremy Coyle <firstname.lastname@example.org>|
|Package repository|CRAN|
Install the latest version of this package by entering the following in R:
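Assuming the package name is 'hal9001' (inferred from the author list and description; not stated explicitly on this page), the standard CRAN installation command would be:

```r
# Install the released version from CRAN
install.packages("hal9001")  # package name is an assumption
```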