PomaLasso: Lasso, Ridge, and Elasticnet Regularized Generalized Linear Models for Binary Outcomes

View source: R/PomaLasso.R

PomaLasso R Documentation

Lasso, Ridge, and Elasticnet Regularized Generalized Linear Models for Binary Outcomes

Description

PomaLasso fits LASSO, Ridge, and Elasticnet regularized generalized linear models for feature selection and prediction with binary outcomes.

Usage

PomaLasso(
  data,
  alpha = 1,
  ntest = NULL,
  nfolds = 10,
  lambda = NULL,
  labels = FALSE
)

Arguments

data

A SummarizedExperiment object.

alpha

Numeric. The elasticnet mixing parameter. alpha = 1 gives the LASSO penalty, alpha = 0 gives the Ridge penalty, and values between 0 and 1 mix the two. Default is 1.
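For reference, this mixing parameter follows glmnet's parameterization (Friedman et al., 2010, cited below), in which the penalty term added to the negative log-likelihood is:

```latex
% glmnet's elastic net penalty, weighted by the mixing parameter alpha
\lambda \left[ \frac{1-\alpha}{2} \lVert \beta \rVert_2^2 \;+\; \alpha \lVert \beta \rVert_1 \right]
```

Setting alpha = 1 leaves only the L1 (LASSO) term and alpha = 0 only the L2 (Ridge) term.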

ntest

Numeric. Percentage of observations to hold out as the test set. Default is NULL (no test set).

nfolds

Numeric. Number of folds for cross-validation. Default is 10. Although nfolds can be as large as the sample size (leave-one-out CV), this is not recommended for large datasets. The smallest allowable value is nfolds = 3.

lambda

Numeric. An optional user-supplied lambda sequence. Typical usage is to let the program compute its own lambda sequence based on nlambda and lambda.min.ratio. See ?glmnet::glmnet.
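If you do supply your own sequence, a decreasing grid on the log scale is the usual choice. A minimal sketch, using only base R (the grid endpoints here are arbitrary, for illustration only):

```r
# Decreasing lambda grid on the log scale: 100 values from 100 down to 0.001
lambda_grid <- 10^seq(2, -3, length.out = 100)

# Pass it to PomaLasso instead of letting glmnet choose its own sequence
# (assumes `data` is a prepared SummarizedExperiment, as in the Examples below)
# result <- PomaLasso(data, lambda = lambda_grid)
```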

labels

Logical. Indicates whether feature names should be shown in the coefficient plot. Default is FALSE.

Value

A list with the model results.
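The element names of the returned list depend on the POMA version installed, so inspect the result directly rather than assuming specific components; a minimal sketch, assuming POMA is loaded and the st000336 example data is prepared as in the Examples:

```r
library(POMA)

data("st000336")

result <- st000336 %>%
  PomaImpute() %>%
  PomaNorm() %>%
  PomaLasso()

names(result)               # which components the list contains
str(result, max.level = 1)  # one-level overview of each element
```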

Author(s)

Pol Castellano-Escuder

References

Jerome Friedman, Trevor Hastie, Robert Tibshirani (2010). Regularization Paths for Generalized Linear Models via Coordinate Descent. Journal of Statistical Software, 33(1), 1-22. URL http://www.jstatsoft.org/v33/i01/.

Examples

data("st000336")

# lasso
st000336 %>%
  PomaImpute() %>%
  PomaNorm() %>%
  PomaLasso()

# elasticnet
st000336 %>%
  PomaImpute() %>%
  PomaNorm() %>%
  PomaLasso(alpha = 0.5)

# ridge
st000336 %>%
  PomaImpute() %>%
  PomaNorm() %>%
  PomaLasso(alpha = 0)
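To evaluate prediction on held-out observations, set ntest; a sketch assuming a 20% test split (per the argument description above, ntest is a percentage, not a fraction):

```r
# lasso with a 20% held-out test set
st000336 %>%
  PomaImpute() %>%
  PomaNorm() %>%
  PomaLasso(ntest = 20)
```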

pcastellanoescuder/POMA documentation built on March 15, 2024, 10:08 p.m.