fpca.ssvd    R Documentation
Description

Implements the algorithm of Huang, Shen, Buja (2008) for finding smooth right singular vectors of a matrix X containing (contaminated) evaluations of functional random variables on a regular, equidistant grid. If the number of smooth SVs to extract is not specified, the function hazards a guess for the appropriate number based on the asymptotically optimal truncation threshold under the assumption of a low-rank matrix contaminated with i.i.d. Gaussian noise with unknown variance, derived in Donoho, Gavish (2013). Please note that this thresholding rule should be regarded as experimental for functional PCA, and it will typically not work well if you have more observations than grid points.
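As a quick illustration (not part of the original documentation; the refund package name and the simulated data below are assumptions), leaving npc at its default NA triggers the automatic choice of the number of components, while supplying npc fixes it:

library(refund)   # assuming fpca.ssvd is provided by the refund package
set.seed(1)
grid <- seq(0, 1, length.out = 50)
Ysim <- outer(rnorm(30), sin(2 * pi * grid)) +      # rank-1 smooth signal
  matrix(rnorm(30 * 50, sd = 0.1), 30, 50)          # plus i.i.d. Gaussian noise
fit_auto  <- fpca.ssvd(Ysim)            # npc = NA: rank guessed by the threshold rule
fit_fixed <- fpca.ssvd(Ysim, npc = 1)   # rank fixed by the user
fit_auto$npc                            # number of smooth components extracted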
Usage

fpca.ssvd(
  Y = NULL,
  ydata = NULL,
  argvals = NULL,
  npc = NA,
  center = TRUE,
  maxiter = 15,
  tol = 1e-04,
  diffpen = 3,
  gridsearch = TRUE,
  alphagrid = 1.5^(-20:40),
  lower.alpha = 1e-05,
  upper.alpha = 1e+07,
  verbose = FALSE,
  integration = "trapezoidal"
)
Arguments

Y: data matrix (rows: observations; columns: grid of evaluation points)

ydata: a data frame of irregularly observed functions; not implemented for this method, which requires evaluations on a regular grid

argvals: the argument values of the function evaluations in Y; defaults to an equidistant grid (see Details)

npc: how many smooth SVs to try to extract; if NA (the default), the number is chosen by the thresholding rule of Donoho, Gavish (2013) described above

center: center Y so that its column means are zero? Defaults to TRUE.

maxiter: how many iterations of the power algorithm to perform at most (defaults to 15)

tol: convergence tolerance for the power algorithm (defaults to 1e-4)

diffpen: difference penalty order controlling the desired smoothness of the right singular vectors; defaults to 3 (i.e., deviations from local quadratic polynomials)

gridsearch: search for the smoothing parameter over the grid alphagrid (TRUE, the default) or by numerical optimization between lower.alpha and upper.alpha (FALSE)?

alphagrid: grid of smoothing parameter values for the grid search

lower.alpha: lower limit for the smoothing parameter if gridsearch = FALSE

upper.alpha: upper limit for the smoothing parameter if gridsearch = FALSE

verbose: generate a graphical summary of progress and diagnostic messages? Defaults to FALSE.

integration: ignored, see Details.
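A hedged sketch of a call with non-default settings (the simulated data and the particular values chosen here are illustrative, not from the original documentation): a lower-order difference penalty, and the smoothing parameter selected by numerical optimization between lower.alpha and upper.alpha instead of the grid search over alphagrid.

library(refund)   # assumed package, as above
set.seed(2)
grid <- seq(0, 1, length.out = 80)
Ysim <- outer(rnorm(40), cos(pi * grid)) +
  matrix(rnorm(40 * 80, sd = 0.2), 40, 80)
fit <- fpca.ssvd(Ysim, argvals = grid, npc = 1,
                 diffpen = 2,          # penalize deviations from local linear fits
                 gridsearch = FALSE,   # optimize the smoothing parameter numerically
                 lower.alpha = 1e-3, upper.alpha = 1e3,
                 maxiter = 25, tol = 1e-5)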
Details

Note that fpca.ssvd computes smoothed orthonormal eigenvectors of the supplied function evaluations (and associated scores), not (!) evaluations of the smoothed orthonormal eigenfunctions. The smoothed orthonormal eigenvectors are then rescaled by the length of the domain defined by argvals so that their quadratic integral is approximately equal to one (instead of their crossproduct being equal to one), which makes them approximate the behavior of smooth eigenfunctions. If argvals is not equidistant, fpca.ssvd will simply return the smoothed eigenvectors without rescaling, with a warning.
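A small illustrative check of the rescaling described above (the simulated data and the crude Riemann approximation of the integral are assumptions; only the stated behavior of efunctions is taken from the text): on an equidistant grid, the squared values of a returned eigenvector summed and multiplied by the grid spacing should be roughly one, while its plain crossproduct is not.

library(refund)   # assumed package, as above
set.seed(3)
grid <- seq(-1, 1, length.out = 60)      # equidistant grid, domain of length 2
Ysim <- outer(rnorm(50), grid^2 - 1/3) +
  matrix(rnorm(50 * 60, sd = 0.1), 50, 60)
fit <- fpca.ssvd(Ysim, argvals = grid, npc = 1)
## crude Riemann approximation of the integral of the squared first eigenvector:
sum(fit$efunctions[, 1]^2) * diff(grid)[1]   # roughly 1, per the rescaling above
crossprod(fit$efunctions[, 1])               # not 1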
Value

An fpca object like that returned from fpca.sc, with entries Yhat, the smoothed trajectories; Y, the observed data; scores, the estimated FPC loadings; mu, the column means of Y (or a vector of zeroes if !center); efunctions, the estimated smooth FPCs (note that these are orthonormal vectors, not evaluations of orthonormal functions if argvals is not equidistant); evalues, their associated eigenvalues; and npc, the number of smooth components that were extracted.
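A sketch of working with the returned object (the component names are as listed above; the data and the fitted object are hypothetical):

library(refund)   # assumed package, as above
set.seed(4)
grid <- seq(0, 1, length.out = 40)
Ysim <- outer(rnorm(25), sin(pi * grid)) +
  matrix(rnorm(25 * 40, sd = 0.15), 25, 40)
fit <- fpca.ssvd(Ysim, argvals = grid, npc = 1)
fit$npc                     # number of smooth components extracted
dim(fit$scores)             # one column of FPC loadings per component
fit$evalues                 # eigenvalues associated with the components
head(fit$mu)                # column means of Ysim (zeros if center = FALSE)
matplot(grid, fit$efunctions, type = "l", ylab = "",
        main = "estimated smooth FPCs")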
Author(s)

Fabian Scheipl
References

Huang, J. Z., Shen, H., and Buja, A. (2008). Functional principal components analysis via penalized rank one approximation. Electronic Journal of Statistics, 2, 678-695.

Donoho, D. L., and Gavish, M. (2013). The Optimal Hard Threshold for Singular Values is 4/sqrt(3). arXiv preprint arXiv:1305.5870, https://arxiv.org/abs/1305.5870.
See Also

fpca.sc and fpca.face for FPCA based on smoothing a covariance estimate; fpca2s for a faster SVD-based approach.
Examples

library(refund)  # assuming fpca.ssvd is provided by the refund package

## as in Sec. 6.2 of Huang, Shen, Buja (2008):
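## simulate n curves on an m-point grid: a rank-2 smooth signal U D t(V)
## plus i.i.d. Gaussian noise with standard deviation s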
set.seed(2678695)
n <- 101
m <- 101
s1 <- 20
s2 <- 10
s <- 4
t <- seq(-1, 1, length.out = m)
v1 <- t + sin(pi*t)
v2 <- cos(3*pi*t)
V <- cbind(v1/sqrt(sum(v1^2)), v2/sqrt(sum(v2^2)))
U <- matrix(rnorm(n*2), n, 2)
D <- diag(c(s1^2, s2^2))
eps <- matrix(rnorm(m*n, sd=s), n, m)
Y <- U%*%D%*%t(V) + eps
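## npc is left unspecified, so the number of components is chosen by the
## thresholding rule described above; verbose = TRUE requests a graphical
## summary of progress and diagnostic messages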
smoothSV <- fpca.ssvd(Y, verbose=TRUE)
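## 2 x 2 display: true vs. estimated FPCs (top row),
## true vs. estimated smooth trajectories (bottom row)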
layout(t(matrix(1:4, nrow = 2)))
clrs <- sapply(rainbow(n), function(c)
do.call(rgb, as.list(c(col2rgb(c)/255, .1))))
matplot(V, type="l", lty=1, col=1:2, xlab="",
main="FPCs: true", bty="n")
matplot(smoothSV$efunctions, type="l", lty=1, col=1:5, xlab="",
main="FPCs: estimate", bty="n")
matplot(1:m, t(U%*%D%*%t(V)), type="l", lty=1, col=clrs, xlab="", ylab="",
main="true smooth Y", bty="n")
matplot(1:m, t(smoothSV$Yhat), xlab="", ylab="",
type="l", lty=1,col=clrs, main="estimated smooth Y", bty="n")