ridge: Ridge regression with penalty parameter selection


View source: R/ridge.R

Description

Fit ridge regression models and select the penalty parameter by estimating the respective prediction error via (repeated) K-fold cross-validation, (repeated) random splitting (also known as random subsampling or Monte Carlo cross-validation), or the bootstrap.
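As a sketch of the three estimation schemes, the splits argument accepts control objects from the perry package: foldControl for (repeated) K-fold cross-validation, splitControl for repeated random splitting, and bootControl for the bootstrap. The simulated data below are purely illustrative.

```r
library("perryExamples")

## simulated data for illustration
set.seed(1234)
x <- matrix(rnorm(100 * 5), nrow = 100)
y <- drop(x %*% rep(1, 5)) + rnorm(100)
lambda <- 10^seq(2, -2, length.out = 20)

## (repeated) K-fold cross-validation
fitCV <- ridge(x, y, lambda = lambda,
               splits = foldControl(K = 5, R = 10), seed = 1234)

## repeated random splitting (Monte Carlo cross-validation)
fitRS <- ridge(x, y, lambda = lambda,
               splits = splitControl(m = 25, R = 10), seed = 1234)

## bootstrap estimation of the prediction error
fitBoot <- ridge(x, y, lambda = lambda,
                 splits = bootControl(R = 10), seed = 1234)
```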

Usage

ridge(
  x,
  y,
  lambda,
  standardize = TRUE,
  intercept = TRUE,
  splits = foldControl(),
  cost = rmspe,
  selectBest = c("hastie", "min"),
  seFactor = 1,
  ncores = 1,
  cl = NULL,
  seed = NULL,
  ...
)

ridge.fit(x, y, lambda, standardize = TRUE, intercept = TRUE, ...)

Arguments

x

a numeric matrix containing the predictor variables.

y

a numeric vector containing the response variable.

lambda

a numeric vector of non-negative values to be used as the penalty parameter.

standardize

a logical indicating whether the predictor variables should be standardized to have unit variance (the default is TRUE).

intercept

a logical indicating whether a constant term should be included in the model (the default is TRUE).

splits

an object giving data splits to be used for prediction error estimation (see perryTuning).

cost

a cost function measuring prediction loss (see perryTuning for some requirements). The default is to use the root mean squared prediction error (see cost).

selectBest, seFactor

arguments specifying a criterion for selecting the best model (see perryTuning). The default is to use a one-standard-error rule.

ncores, cl

arguments for parallel computing (see perryTuning).

seed

optional initial seed for the random number generator (see .Random.seed and perryTuning).

...

for ridge, additional arguments to be passed to the prediction loss function cost. For ridge.fit, additional arguments are currently ignored.
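To illustrate the selection criterion: selectBest = "hastie" together with seFactor = 1 applies a one-standard-error rule, preferring a more parsimonious model whose estimated prediction error is within one standard error of the minimum, whereas selectBest = "min" simply picks the value of lambda with the smallest estimated prediction error. A brief sketch, assuming x, y, and a lambda grid are already defined:

```r
## pick the lambda minimizing the estimated prediction error
fitMin <- ridge(x, y, lambda = lambda, splits = foldControl(K = 5),
                selectBest = "min", seed = 1234)

## one-standard-error rule (the default)
fitSE <- ridge(x, y, lambda = lambda, splits = foldControl(K = 5),
               selectBest = "hastie", seFactor = 1, seed = 1234)
```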

Value

For ridge, an object of class "perryTuning" (see perryTuning). It contains information on the prediction error criterion and includes the final model, fit with the optimal tuning parameter, as component finalModel.

For ridge.fit, an object of class ridge with the following components:

lambda

a numeric vector containing the values of the penalty parameter.

coefficients

a numeric vector or matrix containing the coefficient estimates.

fitted.values

a numeric vector or matrix containing the fitted values.

residuals

a numeric vector or matrix containing the residuals.

standardize

a logical indicating whether the predictor variables were standardized to have unit variance.

intercept

a logical indicating whether the model includes a constant term.

muX

a numeric vector containing the means of the predictors.

sigmaX

a numeric vector containing the standard deviations of the predictors.

muY

a numeric value giving the mean of the response.

call

the matched function call.
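The components listed above can be inspected directly when ridge.fit is called without penalty selection. A sketch, assuming x and y as above; when lambda contains more than one value, the coefficient, fitted-value, and residual components are matrices with one column per penalty value:

```r
## fit ridge regressions for a small grid of penalty values
fit <- ridge.fit(x, y, lambda = c(0.1, 1, 10))

## coefficient estimates: one column per value of lambda
fit$coefficients

## fitted values and residuals have the same column layout
head(fit$fitted.values)
head(fit$residuals)

## standardization information used internally
fit$muX     # predictor means
fit$sigmaX  # predictor standard deviations
fit$muY     # response mean
```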

Author(s)

Andreas Alfons

References

Hoerl, A.E. and Kennard, R.W. (1970) Ridge regression: biased estimation for nonorthogonal problems. Technometrics, 12(1), 55–67.

See Also

perryTuning

Examples

## load data
data("Bundesliga")
Bundesliga <- Bundesliga[, -(1:2)]
f <- log(MarketValue) ~ Age + I(Age^2) + .
mf <- model.frame(f, data=Bundesliga)
x <- model.matrix(terms(mf), mf)[, -1]
y <- model.response(mf)

## set up repeated random splits
splits <- splitControl(m = 40, R = 10)

## select optimal penalty parameter
lambda <- seq(600, 0, length.out = 50)
fit <- ridge(x, y, lambda = lambda, splits = splits, seed = 2014)
fit

## plot prediction error results
plot(fit, method = "line")
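The same selection can also be carried out with repeated K-fold cross-validation instead of random splitting; the model refit with the optimal penalty is then available as the finalModel component. A sketch building on the example above:

```r
## repeated 5-fold cross-validation instead of random splitting
splitsCV <- foldControl(K = 5, R = 10)
fitCV <- ridge(x, y, lambda = lambda, splits = splitsCV, seed = 2014)

## coefficients of the final model with the optimal penalty parameter
fitCV$finalModel$coefficients
```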

perryExamples documentation built on Nov. 3, 2021, 5:07 p.m.