solveRidgeRegression: Solve ridge regression or logistic regression problems


View source: R/solveRidgeRegression.R

Description

This function solves a regression or logistic regression problem regularized by an L2 or weighted L2 penalty. Unlike lm.ridge or glmnet, it works for any number of predictors.

Usage

solveRidgeRegression(
  x,
  y,
  beta = rep(0, NCOL(x)),
  epsilon = 1e-06,
  family = c("gaussian", "binomial"),
  offset = rep(0, NROW(x))
)
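
A minimal usage sketch, not taken from the package examples: simulated Gaussian data with the default near-zero penalty. It assumes zinbwave is installed and that solveRidgeRegression is available as documented above; the data are invented for illustration.

library(zinbwave)

set.seed(1)
n <- 100; p <- 5
x <- matrix(rnorm(n * p), n, p)                 # covariates, one sample per row
beta_true <- c(1, -1, 0.5, 0, 0)                # made-up coefficients
y <- drop(x %*% beta_true) + rnorm(n, sd = 0.1) # continuous response

# solve the (lightly regularized) ridge regression problem
beta_hat <- solveRidgeRegression(x, y, family = "gaussian")
beta_hat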

Arguments

x

a matrix of covariates, one sample per row, one covariate per column.

y

a vector of responses (continuous for regression, binary 0/1 for logistic regression)

beta

an initial solution from which optimization starts (zero vector by default)

epsilon

a scalar or vector of regularization parameters (default 1e-6)

family

a string choosing the type of regression, either "gaussian" or "binomial" (default "gaussian")

offset

a vector of offsets (zero vector by default)

Details

When family="gaussian", we solve the ridge regression problem that finds the β that minimizes:

||y - xβ||^2 + ε||β||^2/2.

When family="binomial", we solve the ridge logistic regression problem

min_β ∑_i [ -y_i (xβ)_i + log(1 + exp((xβ)_i)) ] + ε||β||^2/2.

When epsilon is a vector of the same length as beta, the penalty is a weighted L2 norm ∑_i ε_i β_i^2 / 2.
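
For illustration, a hedged sketch of the weighted penalty together with the binomial family; the data and the per-coefficient epsilon values below are invented, not part of the package documentation.

set.seed(2)
n <- 200; p <- 3
x <- matrix(rnorm(n * p), n, p)
prob <- drop(plogis(x %*% c(2, -1, 0)))  # true success probabilities
y <- rbinom(n, size = 1, prob = prob)    # 0/1 responses

# epsilon as a vector: penalize the third coefficient much more heavily,
# giving the weighted L2 penalty ∑_i ε_i β_i^2 / 2 described above
eps <- c(1e-6, 1e-6, 10)
beta_hat <- solveRidgeRegression(x, y, epsilon = eps, family = "binomial")
beta_hat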

Value

A vector containing the solution of the regression problem.
