pls.cv | R Documentation
This function computes the optimal model parameters using cross-validation.
pls.cv(X, y, k = 10, groups = NULL, m = ncol(X), use.kernel = FALSE,
       compute.covariance = FALSE, method.cor = "pearson")
X: matrix of predictor observations.

y: vector of response observations. The length of y equals the number of rows of X.

k: number of cross-validation splits. Default is 10.

groups: an optional vector with the same length as y; it encodes a partitioning of the data into distinct subgroups. If groups is supplied, k is ignored and the cross-validation splits are defined by this partitioning. Default is NULL.

m: maximal number of Partial Least Squares components. Default is m = ncol(X).

use.kernel: Use the kernel representation? Default is use.kernel = FALSE.

compute.covariance: If TRUE, the covariance of the cv-optimal regression coefficients is computed. Default is FALSE.

method.cor: How should the correlation to the response be computed? Default is "pearson".
The data are centered and scaled to unit variance prior to the PLS algorithm. It is possible to estimate the covariance matrix of the cv-optimal regression coefficients (compute.covariance = TRUE). Currently, this is only implemented for use.kernel = FALSE.
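The centering/scaling and the k-fold splitting described above can be sketched in base R as follows. This is an illustrative stand-in only: a plain least-squares fit replaces the PLS fit, and the fold construction is an assumption, not the package's actual implementation.

```r
set.seed(1)
n <- 50; p <- 5; k <- 10
X <- matrix(rnorm(n * p), ncol = p)
y <- rnorm(n)

# center and scale the predictors to unit variance, as described above
Xs <- scale(X, center = TRUE, scale = TRUE)

# assign each observation to one of k cross-validation folds
folds <- sample(rep(1:k, length.out = n))

# one entry per split: fit on k-1 folds, record the MSE on the held-out fold
dat <- data.frame(y = y, Xs)
cv.error <- sapply(1:k, function(i) {
  held.out <- folds == i
  fit <- lm(y ~ ., data = dat[!held.out, ])
  pred <- predict(fit, newdata = dat[held.out, ])
  mean((y[held.out] - pred)^2)
})
```

Averaging cv.error over the splits then gives a single cross-validated error per candidate model, analogous to the cv.error component returned by pls.cv.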
cv.error.matrix: matrix of cross-validated errors based on mean squared error. A row corresponds to one cross-validation split.

cv.error: vector of cross-validated errors based on mean squared error.

m.opt: optimal number of components based on mean squared error.

intercept: intercept of the optimal model, based on mean squared error.

coefficients: vector of regression coefficients of the optimal model, based on mean squared error.

cor.error.matrix: matrix of cross-validated errors based on correlation. A row corresponds to one cross-validation split.

cor.error: vector of cross-validated errors based on correlation.

m.opt.cor: optimal number of components based on correlation.

intercept.cor: intercept of the optimal model, based on correlation.

coefficients.cor: vector of regression coefficients of the optimal model, based on correlation.

covariance: if compute.covariance = TRUE, the covariance of the cv-optimal regression coefficients.
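The relation between cv.error.matrix, cv.error and m.opt can be illustrated with a small toy matrix. This sketches the assumed bookkeeping: column j is taken to correspond to j - 1 components, so that the first column is the intercept-only model.

```r
# toy cross-validated error matrix: 2 splits (rows) x 3 candidate models (columns)
cv.error.matrix <- matrix(c(1.2, 0.8, 0.9,
                            1.1, 0.7, 1.0),
                          nrow = 2, byrow = TRUE)

# average the per-split errors over the rows
cv.error <- colMeans(cv.error.matrix)

# pick the column with minimal error; subtract 1 because the
# first column is assumed to be the model with 0 components
m.opt <- which.min(cv.error) - 1
```

Here cv.error is c(1.15, 0.75, 0.95), so the minimum sits in the second column and m.opt is 1.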
Nicole Kraemer, Mikio L. Braun
Kraemer, N., Sugiyama, M. (2011). "The Degrees of Freedom of Partial Least Squares Regression". Journal of the American Statistical Association, 106(494). https://www.tandfonline.com/doi/abs/10.1198/jasa.2011.tm10107

Kraemer, N., Braun, M.L. (2007). "Kernelizing PLS, Degrees of Freedom, and Efficient Model Selection". Proceedings of the 24th International Conference on Machine Learning, Omni Press, 441-448.
pls.model, pls.ic
n <- 50   # number of observations
p <- 5    # number of variables
X <- matrix(rnorm(n * p), ncol = p)
y <- rnorm(n)

# compute linear PLS
pls.object <- pls.cv(X, y, m = ncol(X))

# define a random partitioning
groups <- sample(c("a", "b", "c"), n, replace = TRUE)
pls.object1 <- pls.cv(X, y, groups = groups)