solveRidgeRegression: Solve ridge regression or logistic regression problems

View source: R/solveRidgeRegression.R

Description

This function solves a linear or logistic regression problem regularized by an L2 or weighted L2 penalty. Unlike lm.ridge or glmnet, it works for any number of predictors.

Usage

solveRidgeRegression(
  x,
  y,
  beta = rep(0, NCOL(x)),
  epsilon = 1e-06,
  family = c("gaussian", "binomial"),
  offset = rep(0, NROW(x))
)

Arguments

x

a matrix of covariates, one sample per row, one covariate per column.

y

a vector of responses (continuous for regression, 0/1 binary for logistic regression)

beta

an initial solution where optimization starts (null vector by default)

epsilon

a scalar or vector of regularization parameters (default 1e-6)

family

a string to choose the type of regression (default family="gaussian")

offset

a vector of offsets (default null vector)

Details

When family="gaussian", we solve the ridge regression problem that finds the \beta that minimizes:

||y - x \beta||^2 + \epsilon||\beta||^2/2 .
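Setting the gradient of this objective to zero gives the normal equations (2 x'x + \epsilon I)\beta = 2 x'y. The sketch below (in Python/NumPy rather than R, and not the package's own iterative solver) illustrates that closed-form minimizer for the gaussian case:

```python
import numpy as np

def ridge_gaussian(x, y, epsilon=1e-6):
    # Exact minimizer of ||y - x b||^2 + epsilon * ||b||^2 / 2:
    # stationarity gives (2 x'x + epsilon I) b = 2 x'y.
    p = x.shape[1]
    return np.linalg.solve(2.0 * x.T @ x + epsilon * np.eye(p), 2.0 * x.T @ y)

# Synthetic illustration (hypothetical data, not from the package).
rng = np.random.default_rng(0)
x = rng.standard_normal((50, 3))
beta_true = np.array([1.0, -2.0, 0.5])
y = x @ beta_true + 0.01 * rng.standard_normal(50)
beta = ridge_gaussian(x, y)
```

With a tiny epsilon the solution is essentially the least-squares fit; larger epsilon shrinks the coefficients toward zero.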

When family="binomial" we solve the ridge logistic regression problem

min \sum_i [ -y_i (x\beta)_i + \log(1 + \exp((x\beta)_i)) ] + \epsilon||\beta||^2/2 .

When epsilon is a vector of the same length as beta, the penalty is the weighted L2 norm \sum_i \epsilon_i \beta_i^2 / 2.
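The binomial case with a weighted penalty can be sketched as follows, again in Python/NumPy rather than R, and using plain gradient descent rather than the package's own optimizer; a vector epsilon gives each coefficient its own penalty weight:

```python
import numpy as np

def ridge_logistic(x, y, epsilon, steps=2000, lr=0.5):
    # Minimize sum_i[ -y_i (x b)_i + log(1 + exp((x b)_i)) ]
    #          + sum_j eps_j b_j^2 / 2  by gradient descent (sketch only).
    n, p = x.shape
    beta = np.zeros(p)
    eps = np.broadcast_to(np.asarray(epsilon, dtype=float), (p,))
    for _ in range(steps):
        prob = 1.0 / (1.0 + np.exp(-(x @ beta)))  # sigmoid of linear predictor
        grad = x.T @ (prob - y) + eps * beta      # loss gradient + penalty gradient
        beta = beta - lr * grad / n               # step scaled by sample size
    return beta

# Synthetic illustration: a much heavier penalty on the second
# coefficient shrinks it more than the first.
rng = np.random.default_rng(1)
x = rng.standard_normal((80, 2))
logits = x @ np.array([1.5, 1.5])
y = (rng.random(80) < 1.0 / (1.0 + np.exp(-logits))).astype(float)
beta = ridge_logistic(x, y, epsilon=np.array([0.1, 50.0]))
```

The weighted penalty is what makes per-coefficient regularization possible, e.g. leaving an intercept effectively unpenalized by giving it a near-zero epsilon.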

Value

A vector, the solution of the regression problem.


drisso/zinbwave documentation built on March 18, 2024, 5:13 p.m.