Description

This function implements the blockwise coordinate descent algorithm to compute the nested reduced-rank regression estimator with given rank values (r, rx, ry).
Arguments

Y: response matrix of dimension n-by-(jy*d).
X: design matrix of dimension n-by-(jx*p).
Ag0: an initial estimator of matrix U.
Bg0: an initial estimator of matrix V.
rini, r: rank of the local reduced-rank structure.
rx: number of latent predictors.
ry: number of latent responses.
jx: number of basis functions used to expand the functional predictor.
jy: number of basis functions used to expand the functional response.
p: number of predictors.
d: number of responses.
n: sample size.
maxiter: the maximum number of iterations of the blockwise coordinate descent algorithm. Default is 100.
conv: the tolerance level used to control convergence of the blockwise coordinate descent algorithm. Default is 1e-4.
method: 'RRR' (default): no additional ridge penalty; 'RRS': add an additional ridge penalty.
lambda: the tuning parameter controlling the amount of ridge penalization; used only when method = 'RRS'.
Details

The nested reduced-rank regression (NRRR) is proposed to solve a multivariate functional linear regression problem in which both the response and the predictor are multivariate and functional, i.e., Y(t) = (y_1(t),...,y_d(t))^T and X(s) = (x_1(s),...,x_p(s))^T. To control the complexity of the problem, NRRR imposes a nested reduced-rank structure on the regression surface C(s,t). First, a global dimension reduction exploits the correlation among the components of the multivariate response and the multivariate predictor: matrices U (d-by-ry) and V (p-by-rx) provide weights that form ry latent functional responses and rx latent functional predictors, respectively, so dimension reduction is achieved whenever ry < d or rx < p. Then, a local dimension reduction restricts the latent regression surface C^*(s,t) to be of low rank. After basis expansion and truncation, and with a proper rearrangement of the columns and rows of the resulting data matrices and coefficient matrices, we obtain the nested reduced-rank problem:

\min_{C} || Y - XC ||_F^2, s.t. C = (I_{jx} \otimes V) BA^T (I_{jy} \otimes U)^T,

where BA^T is a full-rank decomposition controlling the local rank and jx, jy are the numbers of basis functions. Beyond the functional setup, this structure also applies to other scenarios, including multivariate time series autoregression and tensor-on-tensor regression. The problem is non-convex and has no explicit solution, so a blockwise coordinate descent algorithm is used to find a local solution. Convergence is assessed by the change in the objective value between two consecutive iterations.
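To make the nested structure concrete, the constraint above can be assembled directly with Kronecker products. The sketch below is illustrative only (it is not code from the package, and all dimensions and matrices are made up); it builds a coefficient matrix C of the constrained form and confirms its shape and rank bound.

```r
# Build a coefficient matrix with the nested reduced-rank structure
# C = (I_jx %x% V) B A^T (I_jy %x% U)^T; all dimensions here are hypothetical.
set.seed(1)
jx <- 3; jy <- 2; p <- 4; d <- 5; rx <- 2; ry <- 2; r <- 2
U <- qr.Q(qr(matrix(rnorm(d * ry), d, ry)))  # d-by-ry weights: latent responses
V <- qr.Q(qr(matrix(rnorm(p * rx), p, rx)))  # p-by-rx weights: latent predictors
B <- matrix(rnorm(jx * rx * r), jx * rx, r)  # local factor B, (jx*rx)-by-r
A <- matrix(rnorm(jy * ry * r), jy * ry, r)  # local factor A, (jy*ry)-by-r
C <- (diag(jx) %x% V) %*% B %*% t(A) %*% t(diag(jy) %x% U)
dim(C)  # (jx*p)-by-(jy*d), i.e., 12-by-10 here
```

Note that the rank of C is at most r regardless of jx, jy, p, and d, which is what makes the local reduced-rank constraint effective after basis expansion.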
Value

The function returns a list containing:

Ag: the estimated U.
Bg: the estimated V.
Al: the estimated A.
Bl: the estimated B.
C: the estimated coefficient matrix C.
df: degrees of freedom of the model.
sse: sum of squared errors.
ic: a vector containing the values of BIC, BICP, AIC, and GCV.
iter: the number of iterations needed to converge.
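The criteria in `ic` are functions of `sse`, `df`, and `n`. As a rough sketch only, using standard textbook forms (the exact definitions used by NRRR.est, in particular BICP, are not shown in this page and may differ):

```r
# Generic forms of information criteria computed from a model's sse and df.
# n, sse, and df below are made-up values, and these formulas are the standard
# textbook versions, not necessarily those used by NRRR.est.
n <- 100; sse <- 250; df <- 30
bic <- n * log(sse / n) + log(n) * df
aic <- n * log(sse / n) + 2 * df
gcv <- n * sse / (n - df)^2          # generalized cross-validation score
c(BIC = bic, AIC = aic, GCV = gcv)
```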
References

Liu, X., Ma, S., & Chen, K. (2020). Multivariate Functional Regression via Nested Reduced-Rank Regularization. arXiv preprint.
Examples

library(NRRR)
simDat <- NRRR.sim(n = 100, ns = 200, nt = 200, r = 5, rx = 3, ry = 3,
jx = 15, jy = 15, p = 10, d = 6, s2n = 1, rho_X = 0.5,
rho_E = 0, Sigma = "CorrAR")
fit_init <- with(simDat, NRRR.est(Y = Yest, X = Xest,
Ag0 = NULL, Bg0 = NULL, rini = 5, r = 5,
rx = 3, ry = 3, jx = 15, jy = 15,
p = 10, d = 6, n = 100))
fit_init$Ag