Tucker Factor Analysis

Description

Given a 3-way array X = array(x,dim=c(I,J,K)), the 3-way Tucker model can be written as

X[i,j,k] = sum sum sum A[i,p]*B[j,q]*C[k,r]*G[p,q,r] + E[i,j,k]

where A = matrix(a,I,P) are the Mode A (first mode) weights, B = matrix(b,J,Q) are the Mode B (second mode) weights, C = matrix(c,K,R) are the Mode C (third mode) weights, G = array(g,dim=c(P,Q,R)) is the 3-way core array, and E = array(e,dim=c(I,J,K)) is the 3-way residual array. The summations are for p = seq(1,P), q = seq(1,Q), and r = seq(1,R).
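For intuition, the model part of X can be computed from A, B, C, and G by matricizing the core along Mode A, which is the same construction used in the Examples below. A minimal base R sketch with illustrative dimensions (the values 50, 20, 5 and 3, 2, 3 simply mirror the 3-way Example):

# minimal base R sketch of the 3-way Tucker structure (illustrative dimensions)
I <- 50; J <- 20; K <- 5
P <- 3; Q <- 2; R <- 3
A <- matrix(rnorm(I*P), I, P)
B <- matrix(rnorm(J*Q), J, Q)
C <- matrix(rnorm(K*R), K, R)
G <- array(rnorm(P*Q*R), dim=c(P,Q,R))
# mode-A matricized form of X[i,j,k] = sum sum sum A[i,p]*B[j,q]*C[k,r]*G[p,q,r]
Xfit <- array(tcrossprod(A %*% matrix(G, P, Q*R), kronecker(C, B)), dim=c(I,J,K))
# spot-check the triple sum for entry [1,1,1] (matches Xfit[1,1,1] up to rounding)
sum(outer(outer(A[1,], B[1,]), C[1,]) * G)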

Given a 4-way array X = array(x,dim=c(I,J,K,L)), the 4-way Tucker model can be written as

X[i,j,k,l] = sum sum sum sum A[i,p]*B[j,q]*C[k,r]*D[l,s]*G[p,q,r,s] + E[i,j,k,l]

where D = matrix(d,L,S) are the Mode D (fourth mode) weights, G = array(g,dim=c(P,Q,R,S)) is the 4-way core array, E = array(e,dim=c(I,J,K,L)) is the 4-way residual array, and the other terms can be interpreted as previously described. The summations are for p = seq(1,P), q = seq(1,Q), r = seq(1,R), and s = seq(1,S).
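The fitted part of a 4-way array is built analogously, with the core matricized along Mode A and the remaining weights combined as kronecker(D, kronecker(C, B)); this is the same construction used in the 4-way Example below. A minimal sketch, continuing the notation above with illustrative dimensions:

# 4-way analogue (illustrative dimensions mirroring the 4-way Example)
I <- 30; J <- 10; K <- 8; L <- 10
P <- 2; Q <- 3; R <- 4; S <- 3
A <- matrix(rnorm(I*P), I, P); B <- matrix(rnorm(J*Q), J, Q)
C <- matrix(rnorm(K*R), K, R); D <- matrix(rnorm(L*S), L, S)
G <- array(rnorm(P*Q*R*S), dim=c(P,Q,R,S))
Xfit <- array(tcrossprod(A %*% matrix(G, P, Q*R*S),
                         kronecker(D, kronecker(C, B))), dim=c(I,J,K,L))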

Weight matrices are estimated using an alternating least squares algorithm.
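For orientation, one cycle of such an update for the 3-way model can be sketched in base R as below. This is a schematic HOOI-style cycle (update each mode's weights from the matricized array projected onto the other modes, then recompute the core), not necessarily the exact implementation used by tucker:

# schematic sketch of one ALS cycle for the 3-way model (not the package's internal code)
als_cycle <- function(X, A, B, C) {
  d <- dim(X)
  # Mode A update: mode-A matricization of X projected onto current B and C
  Xa <- matrix(X, d[1], d[2]*d[3])
  A <- svd(Xa %*% kronecker(C, B), nu=ncol(A))$u
  # Mode B update: mode-B matricization of X projected onto current A and C
  Xb <- matrix(aperm(X, c(2,1,3)), d[2], d[1]*d[3])
  B <- svd(Xb %*% kronecker(C, A), nu=ncol(B))$u
  # Mode C update: mode-C matricization of X projected onto current A and B
  Xc <- matrix(aperm(X, c(3,1,2)), d[3], d[1]*d[2])
  C <- svd(Xc %*% kronecker(B, A), nu=ncol(C))$u
  # core array: project X onto all three (columnwise orthonormal) weight matrices
  G <- array(crossprod(A, Xa) %*% kronecker(C, B), dim=c(ncol(A), ncol(B), ncol(C)))
  list(A=A, B=B, C=C, G=G)
}
# cycles are repeated until the change in R^2 is at most ctol (see Note)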

Usage

tucker(X,nfac,nstart=10,Afixed=NULL,
       Bfixed=NULL,Cfixed=NULL,Dfixed=NULL,
       Bstart=NULL,Cstart=NULL,Dstart=NULL,
       maxit=500,ctol=10^-4,parallel=FALSE,
       cl=NULL,output=c("best","all"))

Arguments

X

Three-way data array with dim=c(I,J,K) or four-way data array with dim=c(I,J,K,L).

nfac

Number of factors in each mode, given as a vector of length 3 or 4 (e.g., nfac=c(3,2,3) in the Examples).

nstart

Number of random starts.

Afixed

Fixed Mode A weights. Only used to fit a model with fixed weights in Mode A.

Bfixed

Fixed Mode B weights. Only used to fit a model with fixed weights in Mode B.

Cfixed

Fixed Mode C weights. Only used to fit a model with fixed weights in Mode C.

Dfixed

Fixed Mode D weights. Only used to fit a model with fixed weights in Mode D.

Bstart

Starting Mode B weights for ALS algorithm. Default uses random weights.

Cstart

Starting Mode C weights for ALS algorithm. Default uses random weights.

Dstart

Starting Mode D weights for ALS algorithm. Default uses random weights.

maxit

Maximum number of iterations.

ctol

Convergence tolerance.

parallel

Logical indicating if parLapply should be used. See Examples.

cl

Cluster created by makeCluster. Only used when parallel=TRUE.

output

Output the best solution (default) or output all nstart solutions.

Value

If output="best", returns an object of class "tucker" with the following elements:

A

Mode A weight matrix.

B

Mode B weight matrix.

C

Mode C weight matrix.

D

Mode D weight matrix.

G

Core array.

Rsq

R-squared value.

GCV

Generalized Cross-Validation.

edf

Effective degrees of freedom.

iter

Number of iterations.

cflag

Convergence flag.

Otherwise returns a list of length nstart where each element is an object of class "tucker".

Warnings

The ALS algorithm can perform poorly if the number of factors nfac is set too large.

Input matrices in Afixed, Bfixed, Cfixed, Dfixed, Bstart, Cstart, and Dstart must be columnwise orthonormal.
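Columnwise orthonormal matrices of the appropriate size can be obtained, as in the Examples, from the left singular vectors of a random matrix; a small illustration (the 20 x 2 size is arbitrary):

# e.g., a columnwise orthonormal 20 x 2 matrix suitable for Bstart
Bstart <- svd(matrix(rnorm(20*2), 20, 2), nu=2)$u
crossprod(Bstart)   # approximately the 2 x 2 identity matrix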

Note

Default use is 10 random starts (nstart=10) with 500 maximum iterations of the ALS algorithm for each start (maxit=500) using a convergence tolerance of 10^-4 (ctol=10^-4). The algorithm is determined to have converged once the change in R^2 is less than or equal to ctol.

Output cflag gives convergence information: cflag=0 if ALS algorithm converged normally, and cflag=1 if maximum iteration limit was reached before convergence.
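For example, a usage sketch (assuming X and nf are defined as in the 3-way Example below) that checks cflag and refits with a larger iteration limit:

# assumes X and nf from the 3-way Example below
tuck <- tucker(X, nfac=nf, nstart=10)
if (tuck$cflag == 1) {
  # iteration limit reached before the change in R^2 fell below ctol
  tuck <- tucker(X, nfac=nf, maxit=2000)
}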

Author(s)

Nathaniel E. Helwig <helwig@umn.edu>

References

Kroonenberg, P. M., & de Leeuw, J. (1980). Principal component analysis of three-mode data by means of alternating least squares algorithms. Psychometrika, 45, 69-97.

Tucker, L. R. (1966). Some mathematical notes on three-mode factor analysis. Psychometrika, 31, 279-311.

Examples

##########   3-way example   ##########

# create random data array with Tucker structure
set.seed(3)
mydim <- c(50,20,5)
nf <- c(3,2,3)
Amat <- svd(matrix(rnorm(mydim[1]*nf[1]),mydim[1],nf[1]),nu=nf[1])$u
Bmat <- svd(matrix(rnorm(mydim[2]*nf[2]),mydim[2],nf[2]),nu=nf[2])$u
Cmat <- svd(matrix(rnorm(mydim[3]*nf[3]),mydim[3],nf[3]),nu=nf[3])$u
Gmat <- array(rnorm(prod(nf)),dim=nf)
Xmat <- array(tcrossprod(Amat%*%matrix(Gmat,nf[1],nf[2]*nf[3]),kronecker(Cmat,Bmat)),dim=mydim)
Emat <- array(rnorm(prod(mydim)),dim=mydim)
Emat <- nscale(Emat,0,sumsq(Xmat))   # SNR=1
X <- Xmat + Emat

# fit Tucker model
tuck <- tucker(X,nfac=nf,nstart=1)
tuck$Rsq

# check solution
Xhat <- fitted(tuck)
sum((Xmat-Xhat)^2)/prod(mydim)

# reorder mode="A"
tuck$A[1:4,]
tuck$G
tuck <- reorder(tuck, neworder=c(3,1,2), mode="A")
tuck$A[1:4,]
tuck$G
Xhat <- fitted(tuck)
sum((Xmat-Xhat)^2)/prod(mydim)

# reorder mode="B"
tuck$B[1:4,]
tuck$G
tuck <- reorder(tuck, neworder=2:1, mode="B")
tuck$B[1:4,]
tuck$G
Xhat <- fitted(tuck)
sum((Xmat-Xhat)^2)/prod(mydim)

# resign mode="C"
tuck$C[1:4,]
tuck <- resign(tuck, mode="C")
tuck$C[1:4,]
Xhat <- fitted(tuck)
sum((Xmat-Xhat)^2)/prod(mydim)


##########   4-way example   ##########

# create random data array with Tucker structure
set.seed(4)
mydim <- c(30,10,8,10)
nf <- c(2,3,4,3)
Amat <- svd(matrix(rnorm(mydim[1]*nf[1]),mydim[1],nf[1]),nu=nf[1])$u
Bmat <- svd(matrix(rnorm(mydim[2]*nf[2]),mydim[2],nf[2]),nu=nf[2])$u
Cmat <- svd(matrix(rnorm(mydim[3]*nf[3]),mydim[3],nf[3]),nu=nf[3])$u
Dmat <- svd(matrix(rnorm(mydim[4]*nf[4]),mydim[4],nf[4]),nu=nf[4])$u
Gmat <- array(rnorm(prod(nf)),dim=nf)
Xmat <- array(tcrossprod(Amat%*%matrix(Gmat,nf[1],prod(nf[2:4])),
                      kronecker(Dmat,kronecker(Cmat,Bmat))),dim=mydim)
Emat <- array(rnorm(prod(mydim)),dim=mydim)
Emat <- nscale(Emat,0,sumsq(Xmat))   # SNR=1
X <- Xmat + Emat

# fit Tucker model
tuck <- tucker(X,nfac=nf,nstart=1)
tuck$Rsq

# check solution
Xhat <- fitted(tuck)
sum((Xmat-Xhat)^2)/prod(mydim)


## Not run: 

##########   parallel computation   ##########

# create random data array with Tucker structure
set.seed(3)
mydim <- c(50,20,5)
nf <- c(3,2,3)
Amat <- svd(matrix(rnorm(mydim[1]*nf[1]),mydim[1],nf[1]),nu=nf[1])$u
Bmat <- svd(matrix(rnorm(mydim[2]*nf[2]),mydim[2],nf[2]),nu=nf[2])$u
Cmat <- svd(matrix(rnorm(mydim[3]*nf[3]),mydim[3],nf[3]),nu=nf[3])$u
Gmat <- array(rnorm(prod(nf)),dim=nf)
Xmat <- array(tcrossprod(Amat%*%matrix(Gmat,nf[1],nf[2]*nf[3]),kronecker(Cmat,Bmat)),dim=mydim)
Emat <- array(rnorm(prod(mydim)),dim=mydim)
Emat <- nscale(Emat,0,sumsq(Xmat))   # SNR=1
X <- Xmat + Emat

# fit Tucker model (10 random starts -- sequential computation)
set.seed(1)
system.time({tuck <- tucker(X,nfac=nf)})
tuck$Rsq

# fit Tucker model (10 random starts -- parallel computation)
set.seed(1)
cl <- makeCluster(detectCores())
ce <- clusterEvalQ(cl,library(multiway))
system.time({tuck <- tucker(X,nfac=nf,parallel=TRUE,cl=cl)})
tuck$Rsq
stopCluster(cl)

## End(Not run)