gplssvd: Generalized partial least squares singular value decomposition


View source: R/gplssvd.R

Description

gplssvd takes in left (XLW, YLW) and right (XRW, YRW) constraints, usually diagonal matrices but any positive semi-definite matrices are fine, that are applied to the data matrices X and Y, respectively. The right constraints for each data matrix are used for the orthogonality conditions.
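One way to read this (a conceptual sketch of the weighted cross-product formulation, not the package's internal code; psd_sqrt and gplssvd_sketch below are hypothetical names): with positive semi-definite weights, gplssvd can be thought of as the plain SVD of the cross-product of the two weighted data matrices.

# Conceptual sketch only; assumes all four weight matrices are positive semi-definite.
psd_sqrt <- function(W) {
  ew <- eigen(W, symmetric = TRUE)
  ew$vectors %*% diag(sqrt(pmax(ew$values, 0)), nrow(W)) %*% t(ew$vectors)
}

gplssvd_sketch <- function(X, Y, XLW, YLW, XRW, YRW) {
  X_tilde <- psd_sqrt(XLW) %*% X %*% psd_sqrt(XRW)  # row- and column-weighted X
  Y_tilde <- psd_sqrt(YLW) %*% Y %*% psd_sqrt(YRW)  # row- and column-weighted Y
  svd(crossprod(X_tilde, Y_tilde))                  # SVD of t(X_tilde) %*% Y_tilde
}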

Usage

gplssvd(X, Y, XLW, YLW, XRW, YRW, k = 0, tol = .Machine$double.eps)

Arguments

X

a data matrix

Y

a data matrix

XLW

X's Left Weights – the constraints applied to the left side (rows) of the X matrix and thus left singular vectors.

YLW

Y's Left Weights – the constraints applied to the left side (rows) of the Y matrix and thus left singular vectors.

XRW

X's Right Weights – the constraints applied to the right side (columns) of the X matrix and thus right singular vectors.

YRW

Y's Right Weights – the constraints applied to the right side (columns) of the Y matrix and thus right singular vectors.

k

Total number of components to return, though the full variance will still be returned (see d_full). If 0, the full set of components is returned.

tol

Default is .Machine$double.eps. A tolerance level for eliminating components with effectively zero (small variance), negative, or imaginary eigenvalues/singular values.
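As a small illustration of k and d_full (a hedged sketch; X and Y are the scaled wine matrices from the Examples below, and res_k2 is a hypothetical name):

res_k2 <- gplssvd(X, Y, k = 2)
length(res_k2$d)       # 2: only the retained singular values
length(res_k2$d_full)  # all singular values above tol are still returned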

Value

A list with twelve elements:

d_full

A vector containing the singular values of the decomposed matrix above the tolerance threshold (based on the eigenvalues).

l_full

A vector containing the eigenvalues of the decomposed matrix above the tolerance threshold (tol).

d

A vector of length min(length(d_full), k) containing the retained singular values.

l

A vector of length min(length(l_full), k) containing the retained eigenvalues.

u

Left (rows) singular vectors. Dimensions are ncol(X) by k.

p

Left (rows) generalized singular vectors.

fi

Left (rows) component scores.

lx

Latent variable scores for the rows of X.

v

Right (columns) singular vectors.

q

Right (columns) generalized singular vectors.

fj

Right (columns) component scores.

ly

Latent variable scores for the rows of Y.
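A hedged way to inspect these outputs (pls.res comes from the Examples below; the near-diagonal cross-product of lx and ly follows from the usual PLS-correlation definition of latent variables and is an assumption here, not a documented guarantee):

dim(pls.res$lx)                              # nrow(X) by the number of retained components
round(crossprod(pls.res$lx, pls.res$ly), 8)  # expected to be (near) diagonal,
pls.res$d                                    # with the singular values on the diagonal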

Author(s)

Derek Beaton

See Also

tolerance_svd, geigen, and gsvd

Examples

 # Three "two-table" technique examples
 library(GSVD)   # provides gplssvd() and the wine data
 data(wine)
 X <- scale(wine$objective)
 Y <- scale(wine$subjective)

 ## Partial least squares (correlation)
 pls.res <- gplssvd(X, Y)
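
 ### Hedged sanity check (an assumption about the method, not quoted from the
 ### package documentation): with no constraints, the PLS singular values
 ### should match those of the plain SVD of the cross-product t(X) %*% Y.
 ### plain_svd is a name used only for this check.
 plain_svd <- svd(crossprod(X, Y))
 max(abs(unname(pls.res$d_full) - plain_svd$d[seq_along(pls.res$d_full)]))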


 ## Canonical correlation analysis (CCA)
 ### NOTE:
 #### This is not "traditional" CCA because of the generalized inverse.
 #### However, the results are the same as standard CCA when the data are not rank deficient,
 #### and this particular version uses tricks to minimize memory & computation.
 cca.res <- gplssvd(
     X = MASS::ginv(t(X)),
     Y = MASS::ginv(t(Y)),
     XRW=crossprod(X),
     YRW=crossprod(Y)
 )
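
 ### Hedged check of the orthogonality conditions noted in the Description
 ### (this follows from the definition of generalized singular vectors and
 ### assumes the wine data are not rank deficient; it is not a documented test):
 ### t(p) %*% XRW %*% p and t(q) %*% YRW %*% q should be (near) identity.
 round(t(cca.res$p) %*% crossprod(X) %*% cca.res$p, 8)
 round(t(cca.res$q) %*% crossprod(Y) %*% cca.res$q, 8)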


 ## Reduced rank regression (RRR) a.k.a. redundancy analysis (RDA)
 ### NOTE:
 #### This is not "traditional" RRR because of the generalized inverse.
 #### However, the results are the same as standard RRR when the data are not rank deficient.
 rrr.res <- gplssvd(
     X = MASS::ginv(t(X)),
     Y = Y,
     XRW=crossprod(X)
 )
