tucker-methods    R Documentation
Description

The Tucker decomposition of a tensor. Approximates a K-mode tensor by the n-mode product of a core tensor (with modes specified by ranks) with orthogonal factor matrices. If there is no truncation in one of the modes, this is the same as the MPCA (mpca). If there is no truncation in any of the modes (i.e. ranks = dim(darr)), this is the same as the HOSVD (hosvd). This is an iterative algorithm with two possible stopping conditions: either the relative error in Frobenius norm drops below tol, or max_iter iterations are reached. For more details on the Tucker decomposition, consult Kolda and Bader (2009).
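A brief sketch of the special cases mentioned above (illustrative only: it assumes the hosvd generic listed under See Also accepts a single DelayedArray argument, and the factor matrices are only determined up to sign, so results agree only up to such ambiguities):

library("DelayedRandomArray")
darr <- RandomUnifArray(c(2, 3, 4))

## no truncation in any mode: same subspaces as the HOSVD of darr
tk_full <- tucker(darr, ranks = dim(darr))
hv <- hosvd(darr)

## no truncation in the last mode only: corresponds to an MPCA that
## compresses the first two modes (see mpca)
tk_part <- tucker(darr, ranks = c(1, 2, 4))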
Usage

tucker(darr, ranks=NULL, max_iter=25, tol=1e-05)

## S4 method for signature 'DelayedArray'
tucker(darr, ranks, max_iter, tol)
Arguments

darr        Tensor (DelayedArray) with K modes
ranks       a vector of the modes of the output core tensor
max_iter    maximum number of iterations if error stays above tol
tol         relative Frobenius norm error tolerance
Details

This function is an extension of tucker to DelayedArray objects. It uses the Alternating Least Squares (ALS) estimation procedure, also known as Higher-Order Orthogonal Iteration (HOOI), initialized with a (truncated) HOSVD. A progress bar is included to help monitor operations on large tensors.
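A rough base-R sketch of the HOOI procedure on an ordinary array may help clarify the iteration. This is illustrative only: the helper names unfold, ttm, and hooi are made up here, and the package's DelayedArray implementation (block processing, progress bar, exact stopping rule) differs.

## mode-n unfolding: mode 'mode' becomes the rows, all other modes the columns
unfold <- function(x, mode) {
  perm <- c(mode, setdiff(seq_along(dim(x)), mode))
  matrix(aperm(x, perm), nrow = dim(x)[mode])
}

## n-mode product: multiply matrix 'mat' into mode 'mode' of array 'x'
ttm <- function(x, mat, mode) {
  dims <- dim(x)
  perm <- c(mode, setdiff(seq_along(dims), mode))
  y <- mat %*% unfold(x, mode)
  out <- array(y, c(nrow(mat), dims[perm][-1]))
  aperm(out, order(perm))
}

hooi <- function(x, ranks, max_iter = 25, tol = 1e-5) {
  K <- length(dim(x))
  ## truncated-HOSVD initialisation: leading left singular vectors of each unfolding
  U <- lapply(seq_len(K), function(n) svd(unfold(x, n), nu = ranks[n])$u)
  fnorm_x <- sqrt(sum(x^2))
  for (iter in seq_len(max_iter)) {
    for (n in seq_len(K)) {
      ## project onto all factor subspaces except mode n, then refresh U[[n]]
      y <- x
      for (m in setdiff(seq_len(K), n)) y <- ttm(y, t(U[[m]]), m)
      U[[n]] <- svd(unfold(y, n), nu = ranks[n])$u
    }
    ## core tensor and relative Frobenius residual
    ## (uses ||X - est||^2 = ||X||^2 - ||Z||^2 for orthonormal factors)
    Z <- x
    for (m in seq_len(K)) Z <- ttm(Z, t(U[[m]]), m)
    resid <- sqrt(max(fnorm_x^2 - sum(Z^2), 0)) / fnorm_x
    if (resid < tol) break  ## stopping rule, simplified relative to the package
  }
  list(Z = Z, U = U, resid = resid)
}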
Value

a list containing the following:

Z              the core tensor, with modes specified by ranks
U              a list of orthogonal factor matrices, one for each mode, with the number of columns of the matrices given by ranks
conv           whether or not resid < tol by the last iteration
est            estimate of darr after compression
norm_percent   the percent of Frobenius norm explained by the approximation
fnorm_resid    the Frobenius norm of the error fnorm(est - darr)
all_resids     vector containing the Frobenius norm of error for all the iterations
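For instance, the components can be inspected as in this usage sketch (the exact classes of Z, U, and est depend on the package and are assumed here to support dim()):

library("DelayedRandomArray")
darr <- RandomUnifArray(c(2, 3, 4))
result <- tucker(darr, ranks = c(1, 2, 3))

dim(result$Z)           ## core tensor with modes 1 x 2 x 3
lapply(result$U, dim)   ## factor matrices of sizes 2 x 1, 3 x 2, 4 x 3
result$conv             ## TRUE if resid < tol by the last iteration
result$norm_percent     ## percent of Frobenius norm explained
result$fnorm_resid      ## Frobenius norm of est - darr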
Note

The length of ranks must match the number of modes of darr (i.e. length(dim(darr))).
References

T. Kolda and B. Bader, "Tensor Decompositions and Applications", SIAM Review, 2009.
See Also

hosvd, mpca
library("DelayedRandomArray")
darr <- RandomUnifArray(c(2,3,4))
tucker(darr, ranks=c(1,2,3))