ridge.cv    R Documentation
This function computes the optimal ridge regression model based on cross-validation.
ridge.cv(X, y, lambda = NULL, scale = TRUE, k = 10, plot.it = FALSE,
         groups = NULL, method.cor = "pearson", compute.jackknife = TRUE)
X
matrix of input observations. The rows of X contain the samples, the columns of X contain the observed variables.
y
vector of responses. The length of y must equal the number of rows of X.
lambda
Vector of penalty terms.
scale
Scale the columns of X? Default is scale=TRUE.
k
Number of splits in k-fold cross-validation. Default is k=10.
plot.it
Plot the cross-validation error as a function of lambda? Default is FALSE.
groups
an optional vector with the same length as y. It encodes a partitioning of the data into distinct subgroups. If groups is provided, the argument k is ignored and the cross-validation is based on this partitioning.
method.cor
How should the correlation to the response be computed? Default is "pearson".
compute.jackknife
Logical. If TRUE, the regression coefficients on each of the cross-validation splits are stored. Default is TRUE.
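To make the role of k and lambda concrete, here is a minimal sketch of the k-fold scheme that a ridge cross-validation follows: for each split and each candidate lambda, fit ridge regression on the training part and record the test mean squared error. All names here (folds, cv_mse, lambda_opt) are illustrative and not part of the package; the ridge fit is the plain closed-form solution without intercept, for brevity.

```r
# Hedged sketch of k-fold ridge cross-validation (illustrative names only).
set.seed(1)
n <- 50; p <- 5
X <- matrix(rnorm(n * p), ncol = p)
y <- rnorm(n)
lambda <- c(0.1, 1, 10)   # candidate penalty terms
k <- 5                    # number of splits
folds <- sample(rep(seq_len(k), length.out = n))   # random partition into k groups
cv_mse <- matrix(NA, nrow = k, ncol = length(lambda))  # rows = splits, cols = lambda
for (i in seq_len(k)) {
  train <- folds != i
  Xtr <- X[train, , drop = FALSE]; ytr <- y[train]
  Xte <- X[!train, , drop = FALSE]; yte <- y[!train]
  for (j in seq_along(lambda)) {
    # ridge solution (no intercept, for brevity): (X'X + lambda I)^{-1} X'y
    beta <- solve(crossprod(Xtr) + lambda[j] * diag(p), crossprod(Xtr, ytr))
    cv_mse[i, j] <- mean((yte - Xte %*% beta)^2)
  }
}
cv_error <- colMeans(cv_mse)               # per-lambda error, averaged over splits
lambda_opt <- lambda[which.min(cv_error)]  # penalty with smallest averaged error
```

The matrix cv_mse mirrors the shape of the returned cv.error.matrix: one row per cross-validation split, one column per candidate penalty.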
Based on the regression coefficients (coefficients.jackknife) computed on the cross-validation splits, we can estimate their mean and their variance using the jackknife. We remark that under a fixed design and the assumption of normally distributed y-values, we can also derive the true distribution of the regression coefficients.
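A jackknife summary of this kind can be sketched as follows. The function below is illustrative (not part of the package): it takes an array shaped like coefficients.jackknife (variables x lambda values x splits), fixes one lambda index, and returns the jackknife estimates of the coefficient means and variances.

```r
# Hedged sketch: jackknife mean and variance of the split-wise coefficients.
# coef.array is assumed to have dimension ncol(X) x length(lambda) x k,
# as described for coefficients.jackknife below; j picks one lambda index.
jackknife.summary <- function(coef.array, j = 1) {
  B <- coef.array[, j, ]                    # p x k matrix of split-wise coefficients
  k <- ncol(B)
  m <- rowMeans(B)                          # jackknife estimate of the mean
  v <- ((k - 1) / k) * rowSums((B - m)^2)   # jackknife variance estimate
  list(mean = m, var = v)
}
```

For example, jackknife.summary(ridge.object$coefficients.jackknife, j) summarizes the coefficients at the j-th penalty value, assuming ridge.cv was called with compute.jackknife=TRUE.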
cv.error.matrix
matrix of cross-validated errors based on mean squared error. A row corresponds to one cross-validation split.
cv.error
vector of cross-validated errors based on mean squared error.
lambda.opt
optimal value of lambda, based on mean squared error.
intercept
intercept of the optimal model, based on mean squared error.
coefficients
vector of regression coefficients of the optimal model, based on mean squared error.
cor.error.matrix
matrix of cross-validated errors based on correlation. A row corresponds to one cross-validation split.
cor.error
vector of cross-validated errors based on correlation.
lambda.opt.cor
optimal value of lambda, based on correlation.
intercept.cor
intercept of the optimal model, based on correlation.
coefficients.cor
vector of regression coefficients of the optimal model, based on correlation.
coefficients.jackknife
Array of the regression coefficients on each of the cross-validation splits, returned if compute.jackknife=TRUE. The dimension is ncol(X) x length(lambda) x k.
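The intercept and coefficients components combine in the usual linear-model way. A minimal sketch, using a hypothetical stand-in list in place of a fitted ridge.cv object:

```r
# Illustrative stand-in for the MSE-optimal components returned by ridge.cv;
# with a real fit you would use ridge.object$intercept and
# ridge.object$coefficients instead.
fit <- list(intercept = 0.5, coefficients = c(1, -2))
X <- matrix(c(1, 0,
              0, 1,
              1, 1), ncol = 2, byrow = TRUE)
yhat <- fit$intercept + drop(X %*% fit$coefficients)  # fitted values
```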
Nicole Kraemer
pls.cv, pcr.cv, benchmark.regression
n <- 100  # number of observations
p <- 60   # number of variables
X <- matrix(rnorm(n * p), ncol = p)
y <- rnorm(n)
ridge.object <- ridge.cv(X, y)