sgpls: Sparse Generalized Projection to Latent Structures


View source: R/PLS.R

Description

Sparse projection to latent structures (sparse partial least squares) is an extension of PLS that uses L1 regularization, via a LARS-like algorithm, to select predictors. Predictors that do not load highly on any of the latent variables are dropped from the model, and their regression estimates are shrunk to zero. Here sparse PLS is extended to generalized linear models; only univariate outcomes are supported.

Usage

sgpls(
  formula,
  data,
  ncomp = 2,
  lambda = 0.01,
  family = "gaussian",
  link = "identity",
  scale = TRUE
)

Arguments

formula

model formula

data

a data frame

ncomp

number of components to retain

lambda

the L1 regularization parameter controlling sparsity

family

"gaussian", "poisson", "negative.binomial", "binomial", "Gamma", or "inverse.gaussian"

link

the link function; see Details for the available options
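
scale

whether to scale the data before fitting; defaults to TRUE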

References

Chun, H., & Keles, S. (2010). Sparse partial least squares regression for simultaneous dimension reduction and variable selection. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 72(1), 3–25. doi:10.1111/j.1467-9868.2009.00723.x
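
Examples

A minimal sketch of how sgpls() might be called, assuming the package is loaded as cvreg. The simulated data, the "logit" link option, and the use of str() to inspect the fit are illustrative assumptions rather than documented behaviour; only the sgpls() calls themselves follow the Usage and Arguments above.

library(cvreg)

## simulated data: a numeric outcome and ten standard-normal predictors
set.seed(1)
d <- data.frame(y = rnorm(100), matrix(rnorm(100 * 10), 100, 10))

## Gaussian outcome with the identity link (the defaults)
fit <- sgpls(y ~ ., data = d, ncomp = 2, lambda = 0.01)

## binomial outcome; the "logit" link is assumed to be among the
## available options (see the link argument above)
d$z <- rbinom(100, 1, 0.5)
fit2 <- sgpls(z ~ . - y, data = d, family = "binomial", link = "logit")

## inspect the returned fit object
str(fit)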

