Description
Implements the LS-PLS (least squares – partial least squares) method described in, for instance, Jørgensen, K., Segtnan, V. H., Thyholt, K., Næs, T. (2004) "A Comparison of Methods for Analysing Regression Models with Both Spectral and Designed Variables", Journal of Chemometrics, 18(10), 451–464, <doi:10.1002/cem.890>.
Details

LS-PLS (least squares–partial least squares) models are written in the form

Y = X β + T1 γ1 + ... + Tk γk + E,

where each term Ti consists of one or more matrices Zi1, ..., Zil separated by colons (:), i.e., Zi1:Zi2:...:Zil. Multi-response models are possible, in which case Y should be a matrix.
The model is fitted from left to right. First, Y is fitted to X using least squares (LS) regression, and the residuals are calculated. For each i, the matrices Zi1, ..., Zil are orthogonalised against the variables used in the regression so far (when i = 1, this means X). The residuals from the LS regression are used as the response in PLS regressions with the orthogonalised matrices as predictors (one PLS regression for each matrix), and the desired number of PLS components from each matrix is included among the LS prediction variables. The LS regression is then refitted with the new variables, and new residuals are calculated.
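The LS and orthogonalisation steps can be sketched in a few lines of base R. This is a schematic illustration with made-up data, not the package's implementation:

```r
set.seed(1)

## Toy data: response y, design matrix X and one spectral matrix Z
y <- rnorm(10)
X <- cbind(1, matrix(rnorm(20), nrow = 10))   # includes an intercept column
Z <- matrix(rnorm(30), nrow = 10)

## Step 1: LS regression of y on X; keep the residuals
ls_res <- resid(lm(y ~ X - 1))

## Step 2: orthogonalise Z against X by taking, column by column, the
## residuals of regressing Z on X (lm accepts a matrix response)
Z_orth <- resid(lm(Z ~ X - 1))

## Z_orth is now orthogonal to the columns of X ...
max(abs(crossprod(X, Z_orth)))   # numerically zero

## ... and ls_res would serve as the response in a PLS regression on Z_orth
```

The PLS regression itself (extracting components from Z_orth with ls_res as response) is then carried out by the package.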
The function to fit LS-PLS models is lspls. A typical call to fit the model

Y = X β + Z γ + V1:V2 η + W θ + E

would be

    mod <- lspls(y ~ X + Z + V1:V2 + W, ncomp = list(3, c(2, 1), 2),
                 data = mydata)
The first argument is the formula describing the model. X is fitted first, using LS. Then PLS scores from Z (orthogonalised) are added. Then PLS scores from V1 and V2 are added (simultaneously), and finally PLS scores from W. The next argument, ncomp, specifies the number of components to use from each PLS: 3 Z score vectors, 2 V1 score vectors, 1 V2 score vector and 2 W score vectors. Finally, mydata should be a data frame with matrices y, X, Z, V1, V2 and W (for single-response models, y can be a vector).
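Because such a data frame holds entire matrices as single columns, it is easiest to build it column by column. The following base-R sketch (with made-up variable sizes) shows one way:

```r
set.seed(1)
n <- 10

## Start from the response, then attach each matrix as one column.
## Assigning a matrix with `$<-` keeps it as a single matrix column.
mydata <- data.frame(y = rnorm(n))
mydata$X  <- matrix(rnorm(n * 2), n)   # design variables
mydata$Z  <- matrix(rnorm(n * 5), n)   # e.g. spectral measurements
mydata$V1 <- matrix(rnorm(n * 4), n)
mydata$V2 <- matrix(rnorm(n * 4), n)
mydata$W  <- matrix(rnorm(n * 3), n)

dim(mydata$Z)   # a 10 x 5 matrix stored as one data-frame column
```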
Currently, score plots and loading plots of fitted models are implemented. plot(mod, "scores") gives score plots for each PLS regression, and plot(mod, "loadings") gives loading plots.
There is a predict method to predict response or score values from new data:

    predict(mod, newdata = mynewdata)

(This predicts response values. Use type = "scores" to get scores.) Also, the standard functions resid and fitted can be used to extract the residuals and fitted values.
To determine the number of components to use from each matrix, one can use cross-validation with lsplsCv.
In lsplsCv, ncomp gives the maximal number of components to test. The argument segments specifies the number of segments to use. One can specify the type of segments to use (random (default), consecutive or interleaved) with the argument segment.type. Alternatively, one can supply the segments explicitly with segments. See lsplsCv for details.
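The three segment types can be illustrated with plain base R. This is only a schematic sketch; lsplsCv constructs its segments internally, so none of this code is needed in practice:

```r
set.seed(1)
n <- 10   # observations
k <- 5    # segments

## random: a permutation of the observations, cut into k groups
seg_random <- split(sample(n), rep(1:k, length.out = n))

## consecutive: observations 1-2, 3-4, ... stay together
seg_consec <- split(1:n, rep(1:k, each = ceiling(n / k))[1:n])

## interleaved: every k-th observation lands in the same segment
seg_inter  <- split(1:n, rep(1:k, length.out = n))

seg_inter[[1]]   # observations 1 and 6
```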
One can plot cross-validated RMSEP values with plot(cvmod). (Similarly, plot(cvmod, "MSEP") plots MSEP values.) This makes it easier to determine the optimal number of components for each PLS. See plot.lsplsCv for details. To calculate the RMSEP or MSEP values explicitly, one can use the functions RMSEP and MSEP.
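MSEP is the mean squared error of the cross-validated predictions, and RMSEP is its square root. With made-up numbers (the package's RMSEP and MSEP functions compute these per number of components):

```r
obs  <- c(1.0, 2.0, 3.0, 4.0)
pred <- c(1.1, 1.9, 3.2, 3.7)   # hypothetical cross-validated predictions

msep  <- mean((pred - obs)^2)   # mean squared error of prediction
rmsep <- sqrt(msep)             # root mean squared error of prediction

rmsep   # about 0.194
```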
Author(s)

Bjørn-Helge Mevik [aut, cre]

Maintainer: Bjørn-Helge Mevik <b-h@mevik.net>
References

Jørgensen, K., Segtnan, V. H., Thyholt, K., Næs, T. (2004) A Comparison of Methods for Analysing Regression Models with Both Spectral and Designed Variables. Journal of Chemometrics, 18(10), 451–464.

Jørgensen, K., Mevik, B.-H., Næs, T. Combining Designed Experiments with Several Blocks of Spectroscopic Data. (Submitted)

Mevik, B.-H., Jørgensen, K., Måge, I., Næs, T. LS-PLS: Combining Categorical Design Variables with Blocks of Spectroscopic Measurements. (Submitted)
See Also

lspls, lsplsCv, plot.lspls, plot.lsplsCv
Examples

## FIXME