View source: R/solveRidgeRegression.R
solveRidgeRegression | R Documentation
This function solves a regression or logistic regression problem regularized by an L2 or weighted L2 penalty. Contrary to lm.ridge or glmnet, it works for any number of predictors.
solveRidgeRegression(
x,
y,
beta = rep(0, NCOL(x)),
epsilon = 1e-06,
family = c("gaussian", "binomial"),
offset = rep(0, NROW(x))
)
x: a matrix of covariates, one sample per row, one covariate per column.
y: a vector of responses (continuous for regression, 0/1 binary for logistic regression).
beta: an initial solution where optimization starts (null vector by default).
epsilon: a scalar or vector of regularization parameters (default 1e-06).
family: a string to choose the type of regression (default "gaussian").
offset: a vector of offsets (default null vector).
When family="gaussian", we solve the ridge regression problem that finds the \beta minimizing:
||y - x\beta||^2 + \epsilon ||\beta||^2 / 2 .
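In the gaussian case the minimizer has a closed form, which a short R sketch can illustrate. This is not the package's implementation, and the helper name ridge_closed_form is made up here for illustration:

```r
# Setting the gradient of ||y - x beta||^2 + epsilon ||beta||^2 / 2,
# i.e. 2 x'(x beta - y) + epsilon beta, to zero gives the linear system
# (2 x'x + epsilon I) beta = 2 x'y.
ridge_closed_form <- function(x, y, epsilon = 1e-6) {
  p <- NCOL(x)
  solve(2 * crossprod(x) + diag(epsilon, nrow = p), 2 * crossprod(x, y))
}

# Small synthetic check: recover coefficients from noisy data.
set.seed(1)
x <- matrix(rnorm(20), nrow = 10)
y <- drop(x %*% c(1, -2)) + rnorm(10, sd = 0.1)
beta_hat <- ridge_closed_form(x, y)
```

With the default tiny epsilon, beta_hat is essentially the ordinary least-squares solution; larger epsilon shrinks the coefficients toward zero.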
When family="binomial", we solve the ridge logistic regression problem:
min_\beta \sum_i [ -y_i (x\beta)_i + log(1 + exp((x\beta)_i)) ] + \epsilon ||\beta||^2 / 2 .
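The binomial objective has no closed form; a hedged sketch minimizes it with stats::optim and an analytic gradient. The optimizer choice (BFGS) is an assumption for illustration, not necessarily what the package uses:

```r
# Hedged sketch of the penalized logistic loss; zinbwave's actual
# optimizer may differ.
logistic_ridge <- function(x, y, epsilon = 1e-6) {
  obj <- function(beta) {
    eta <- drop(x %*% beta)
    sum(-y * eta + log1p(exp(eta))) + epsilon * sum(beta^2) / 2
  }
  grad <- function(beta) {
    eta <- drop(x %*% beta)
    # d/dbeta of the loss: x'(sigmoid(eta) - y) + epsilon * beta
    drop(crossprod(x, plogis(eta) - y)) + epsilon * beta
  }
  optim(rep(0, NCOL(x)), obj, grad, method = "BFGS")$par
}

set.seed(2)
x <- matrix(rnorm(100), nrow = 50)
y <- rbinom(50, 1, plogis(drop(x %*% c(1, -1))))
beta_hat <- logistic_ridge(x, y)
```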
When epsilon is a vector of size equal to the size of beta, the penalty is a weighted L2 norm \sum_i \epsilon_i \beta_i^2 / 2 .
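In the gaussian closed form, the weighted penalty simply replaces \epsilon I by diag(\epsilon). A minimal sketch (the helper name weighted_ridge is hypothetical):

```r
# Hedged sketch: a vector epsilon makes the penalty sum_i epsilon_i beta_i^2 / 2,
# so epsilon * I becomes diag(epsilon) in the normal equations.
weighted_ridge <- function(x, y, epsilon) {
  p <- NCOL(x)
  solve(2 * crossprod(x) + diag(epsilon, nrow = p), 2 * crossprod(x, y))
}

set.seed(3)
x <- matrix(rnorm(30), nrow = 10)
y <- drop(x %*% c(1, 0, -1)) + rnorm(10, sd = 0.1)
# Penalize the second coefficient much more heavily than the others.
beta_hat <- weighted_ridge(x, y, epsilon = c(1e-6, 100, 1e-6))
```

This lets individual coefficients be shrunk by different amounts, e.g. leaving some nearly unpenalized while strongly regularizing others.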
A vector solution of the regression problem.