View source: R/solveRidgeRegression.R
Description

This function solves a regression or logistic regression problem regularized by an L2 or weighted L2 penalty. Unlike lm.ridge or glmnet, it works for any number of predictors.
Usage
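A sketch of the call, inferred from the Arguments section below; the default values shown (in particular for epsilon) are assumptions, not quotes from the package:

    ## Hypothetical signature; the defaults for beta and offset follow the
    ## "null vector by default" wording below, and epsilon's default is assumed.
    solveRidgeRegression(x, y, beta = rep(0, NCOL(x)), epsilon = 1e-06,
                         family = c("gaussian", "binomial"),
                         offset = rep(0, NROW(x)))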
Arguments

x        a matrix of covariates, one sample per row, one covariate per column.
y        a vector of responses (continuous for regression, 0/1 binary for logistic regression).
beta     an initial solution from which the optimization starts (null vector by default).
epsilon  a scalar or vector of regularization parameters.
family   a string selecting the type of regression, either "gaussian" or "binomial".
offset   a vector of offsets (null vector by default).
Details

When family="gaussian", we solve the ridge regression problem that finds the β minimizing

    ||y - xβ||^2 + ε ||β||^2 / 2 .

When family="binomial", we solve the ridge logistic regression problem

    min_β ∑_i [ -y_i (xβ)_i + log(1 + exp((xβ)_i)) ] + ε ||β||^2 / 2 .
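For family="gaussian" the objective above has a closed-form minimizer, which makes a convenient sanity check. The sketch below is only an illustration of that objective (it uses a hypothetical helper name and ignores offset), not this function's actual optimization routine:

    ## Closed-form minimizer of ||y - x b||^2 + epsilon * ||b||^2 / 2:
    ## setting the gradient to zero gives (x'x + (epsilon/2) I) b = x'y.
    ridge_gaussian <- function(x, y, epsilon) {
      solve(crossprod(x) + diag(epsilon / 2, ncol(x)), crossprod(x, y))
    }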
When epsilon is a vector of the same size as beta, the penalty is the weighted L2 norm ∑_i ε_i β_i^2 / 2.
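Under the same assumptions as the sketch above, a vector epsilon simply replaces (ε/2) I with diag(ε)/2 in the normal equations:

    ## Weighted-L2 sketch: the penalty sum_i epsilon_i beta_i^2 / 2
    ## contributes diag(epsilon) / 2 to the normal equations.
    ridge_gaussian_weighted <- function(x, y, epsilon) {
      stopifnot(length(epsilon) == ncol(x))
      solve(crossprod(x) + diag(epsilon / 2, nrow = ncol(x)), crossprod(x, y))
    }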
Value

A vector solution of the regression problem.
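A minimal, hypothetical example on simulated data; it assumes the function is accessible as solveRidgeRegression in the current session (it may be internal to its package) and uses an arbitrary epsilon:

    set.seed(1)
    x <- matrix(rnorm(100 * 5), nrow = 100)   # 100 samples, 5 covariates
    y <- rnorm(100)                            # continuous response, so family = "gaussian"
    ## Hypothetical call; argument names follow the Arguments section above.
    fit <- solveRidgeRegression(x, y, epsilon = 1, family = "gaussian")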