plsreg1 | R Documentation
The function plsreg1 performs Partial Least Squares Regression for the univariate case (i.e. one response variable).
plsreg1(predictors, response, comps = 2, crosval = TRUE)
predictors: A numeric matrix or data frame with the predictor variables (which may contain missing data).
response: A numeric vector for the response variable. No missing data allowed.
comps: The number of extracted PLS components (2 by default).
crosval: Logical indicating whether cross-validation should be performed (TRUE by default).
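As a minimal sketch of a call, assuming the plsdepot package is installed and using the built-in mtcars data purely for illustration (mtcars is not part of this documentation):
library(plsdepot)
# predictors: all numeric columns except mpg; response: mpg as a numeric vector
fit <- plsreg1(predictors = mtcars[, 2:11], response = mtcars$mpg)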
The minimum number of PLS components (comps) to be extracted is 2. The data is scaled to standardized values (mean = 0, variance = 1). The argument crosval gives the option to perform cross-validation; its behaviour depends on how comps is specified. When comps = NULL, the number of components is selected by cross-validation. When a number of components is specified, cross-validation results are calculated for each component.
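For instance, a hedged sketch of the two ways comps interacts with cross-validation (again using mtcars as a stand-in dataset, with plsdepot loaded as above; the last line infers the number of retained components from the score matrix, which is an assumption about its shape):
# let cross-validation choose the number of components
fit_cv  <- plsreg1(mtcars[, 2:11], mtcars$mpg, comps = NULL, crosval = TRUE)
# fix the number of components; cross-validation results are still reported per component
fit_fix <- plsreg1(mtcars[, 2:11], mtcars$mpg, comps = 4, crosval = TRUE)
# number of components actually retained
ncol(fit_cv$x.scores)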
An object of class "plsreg1", basically a list with the following elements:
x.scores: PLS components (also known as T-components).
x.loads: Loadings of the predictor variables.
y.scores: Scores of the response variable (also known as U-components).
y.loads: Loadings of the response variable.
cor.xyt: Correlations between the variables and the PLS components.
raw.wgs: Weights to calculate the PLS scores with the deflated matrices of predictor variables.
mod.wgs: Modified weights to calculate the PLS scores with the matrix of predictor variables.
std.coefs: Vector of standardized regression coefficients.
reg.coefs: Vector of regression coefficients (used with the original data scale).
R2: Vector of PLS R-squared.
R2Xy: Explained variance of the variables by the PLS components.
y.pred: Vector of predicted values.
resid: Vector of residuals.
T2: Table of Hotelling T2 values (used to detect atypical observations).
Q2: Table with the cross-validation results, including PRESS, RSS, Q2, and cumulated Q2. Only available when crosval = TRUE.
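A short sketch of how the returned elements might be inspected, using the element names documented above; the fit object is assumed to come from a call like the earlier mtcars sketch:
fit <- plsreg1(mtcars[, 2:11], mtcars$mpg, comps = 3)
fit$R2            # R-squared contribution of each PLS component
fit$std.coefs     # standardized regression coefficients
fit$reg.coefs     # regression coefficients on the original data scale
head(fit$y.pred)  # predicted values
head(fit$resid)   # residuals
fit$T2            # Hotelling T2 table for flagging atypical observations
fit$Q2            # cross-validation table (PRESS, RSS, Q2, cumulated Q2), since crosval = TRUE by default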
Gaston Sanchez
Geladi, P., and Kowalski, B. (1986) Partial Least Squares Regression: A Tutorial. Analytica Chimica Acta, 185, pp. 1-17.
Tenenhaus, M. (1998) La Régression PLS: Théorie et Pratique. Éditions TECHNIP, Paris.
Tenenhaus, M., Gauchi, J.-P., and Menardo, C. (1995) Régression PLS et applications. Revue de statistique appliquée, 43, pp. 7-63.
plot.plsreg1, plsreg2.
## Not run:
## example of PLSR1 with the vehicles dataset
# predictand variable: price of vehicles
data(vehicles)
# apply plsreg1 extracting 2 components (no cross-validation)
pls1_one = plsreg1(vehicles[,1:12], vehicles[,13,drop=FALSE], comps=2, crosval=FALSE)
# apply plsreg1 with selection of components by cross-validation
pls1_two = plsreg1(vehicles[,1:12], vehicles[,13,drop=FALSE], comps=NULL, crosval=TRUE)
# apply plsreg1 extracting 5 components with cross-validation
pls1_three = plsreg1(vehicles[,1:12], vehicles[,13,drop=FALSE], comps=5, crosval=TRUE)
# plot variables
plot(pls1_one)
## End(Not run)
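As a hedged follow-up to the example above, the fitted values and residuals stored in the returned object could be compared against the observed response (using pls1_one from the run above; the plotting choices here are illustrative only):
# observed vs predicted prices
plot(vehicles[, 13], pls1_one$y.pred,
     xlab = "observed price", ylab = "predicted price")
abline(a = 0, b = 1)
# quick look at the residuals
summary(pls1_one$resid)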