do.pls | R Documentation
Description

Given two data sets, Partial Least Squares (PLS) aims at maximizing the cross-covariance of latent variables for each data matrix; it can therefore be considered a supervised method. Since there are two input matrices, do.pls generates two sets of outputs. Though PLS is widely used for regression problems, here it is employed in a dimension reduction setting. On the algorithmic side, projection vectors are extracted under an eigen-decomposition formulation combined with recursive Gram-Schmidt orthogonalization, so the problem size matters only up to the original dimensionality. For more details, see the Wikipedia entry on PLS.
Usage

do.pls(data1, data2, ndim = 2)
Arguments

data1    an (n × N) data matrix whose rows are observations
data2    an (n × M) data matrix whose rows are observations
ndim     an integer-valued target dimension
Value

a named list containing:

Y1    an (n × ndim) matrix of projected observations from data1.

Y2    an (n × ndim) matrix of projected observations from data2.

an (N × ndim) matrix whose columns are loadings for data1.

an (M × ndim) matrix whose columns are loadings for data2.

a list containing information for out-of-sample prediction for data1.

a list containing information for out-of-sample prediction for data2.

a vector of eigenvalues from the iterative decomposition.
Author(s)

Kisung You
References

Wold H (1975). "Path Models with Latent Variables: The NIPALS Approach."

Rosipal R, Krämer N (2006). "Overview and Recent Advances in Partial Least Squares."
See Also

do.cca
Examples

## generate 2 normal data matrices
mat1 = matrix(rnorm(100*12), nrow=100)+10  # 12-dim normal
mat2 = matrix(rnorm(100*6),  nrow=100)-10  # 6-dim normal

## project onto 2 dimensional space for each data
output = do.pls(mat1, mat2, ndim=2)

## visualize
opar <- par(no.readonly=TRUE)
par(mfrow=c(1,2))
plot(output$Y1, main="proj(mat1)")
plot(output$Y2, main="proj(mat2)")
par(opar)
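As a standalone illustration of the idea in the Description (a minimal sketch, not the package's implementation), the first pair of PLS loading vectors maximizes the cross-covariance between the two data sets, and can be obtained from the leading singular vectors of the cross-covariance matrix of the column-centered data:

```r
## sketch: first PLS component pair via SVD of the cross-covariance matrix
## (base R only; variable names here are illustrative, not from the package)
set.seed(1)
X1 <- matrix(rnorm(100*12), nrow=100)   # (n x N) data
X2 <- matrix(rnorm(100*6),  nrow=100)   # (n x M) data

X1c <- scale(X1, center=TRUE, scale=FALSE)   # column-center each matrix
X2c <- scale(X2, center=TRUE, scale=FALSE)

C  <- crossprod(X1c, X2c)     # (N x M) cross-covariance, up to a 1/(n-1) factor
sv <- svd(C, nu=1, nv=1)      # leading singular vector pair
w1 <- sv$u                    # unit-norm loading vector for X1
w2 <- sv$v                    # unit-norm loading vector for X2

t1 <- X1c %*% w1              # latent scores for X1
t2 <- X2c %*% w2              # latent scores for X2
## sum(t1 * t2) equals the leading singular value of C, i.e. the maximal
## (unnormalized) cross-covariance over unit-norm loading pairs
```

Subsequent components would be extracted the same way after deflating against the directions already found, which is where the recursive Gram-Schmidt orthogonalization mentioned above comes in.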