Description

Implementation of robust sparse principal component analysis (robust SPCA), using variable projection as an optimization strategy.
Usage

robspca(X, k, alpha, beta, gamma, center, scale, max_iter, tol, verbose)
Arguments

X          array_like; a real (n, p) input matrix (or data frame) to be decomposed.

k          integer; target rank, i.e., the number of sparse components to be computed.

alpha      float; sparsity-controlling parameter for the l1 (LASSO) penalty; larger values yield sparser loadings (illustrated in the sketch after this list).

beta       float; amount of l2 (ridge) shrinkage applied to the loadings.

gamma      float; sparsity-controlling parameter for the outlier matrix S; smaller values allow more entries of S to become nonzero.

center     bool; whether the variables should be mean-centered.

scale      bool; whether the variables should be scaled to unit variance.

max_iter   integer; maximum number of iterations to perform.

tol        float; stopping tolerance for the convergence criterion.

verbose    bool; whether progress should be printed.
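The effect of alpha can be sketched as follows. This is a minimal, illustrative example only; it assumes the sparsepca package is loaded and that robspca accepts the arguments listed above, and the parameter values are arbitrary.

library(sparsepca)

# Toy data: 100 observations of 10 variables.
set.seed(1)
X <- matrix(rnorm(100 * 10), nrow = 100, ncol = 10)

# Fit with a weak and with a strong l1 penalty (values chosen for illustration only).
fit_weak   <- robspca(X, k = 3, alpha = 1e-4, beta = 1e-4, verbose = FALSE)
fit_strong <- robspca(X, k = 3, alpha = 1e-1, beta = 1e-4, verbose = FALSE)

# A stronger penalty typically leaves fewer nonzero loadings.
sum(fit_weak$loadings != 0)
sum(fit_strong$loadings != 0)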
Details

Sparse principal component analysis (SPCA) is a modern variant of PCA. Specifically, SPCA attempts to find sparse weight vectors (loadings), i.e., weight vectors with only a few 'active' (nonzero) values. This approach improves the interpretability of the model, because the principal components are formed as a linear combination of only a few of the original variables. Further, SPCA avoids overfitting in high-dimensional settings where the number of variables p is greater than the number of observations n.
Such a parsimonious model is obtained by introducing prior information in the form of sparsity-promoting regularizers. More concretely, given an (n, p) data matrix X, robust SPCA attempts to minimize the following objective function:
f(A, B) = \frac{1}{2} \| X - X B A^\top - S \|^2_F + \psi(B) + \gamma \|S\|_1
where B is the sparse weight (loadings) matrix and A is an orthonormal matrix. ψ denotes a sparsity-inducing regularizer such as the LASSO (l1 norm) or the elastic net (a combination of the l1 and l2 norms). The matrix S captures grossly corrupted outliers in the data.
The principal components Z are formed as
Z = X B
and the data can be approximately rotated back as
Xtilde = Z t(A)
The print and summary methods can be used to present the results in a nice format.
Value

robspca returns a list containing the following components:
loadings        array_like; sparse loadings (weight) matrix B; (p, k) dimensional.

transform       array_like; the approximated inverse transform (the matrix A) used to rotate the scores back to the original space; (p, k) dimensional.

scores          array_like; the principal component scores Z; (n, k) dimensional.

sparse          array_like; the sparse error matrix S capturing grossly corrupted outliers; (n, p) dimensional.

eigenvalues     array_like; the approximated eigenvalues; (k) dimensional.

center, scale   array_like; the centering and scaling factors used (if any).
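These components tie back to the formulas under Details: the scores are Z = X B, and the data are approximately recovered as Z t(A) plus the outlier matrix S. The following is a minimal sketch under the assumption that scores, transform and sparse are returned exactly as listed above; the data and parameter values are arbitrary.

library(sparsepca)

set.seed(42)
X   <- matrix(rnorm(50 * 8), nrow = 50, ncol = 8)
out <- robspca(X, k = 2, center = TRUE, scale = FALSE, verbose = FALSE)

Z      <- out$scores                  # principal component scores, Z = X B
Xtilde <- Z %*% t(out$transform)      # low-rank reconstruction, Xtilde = Z t(A)
Xhat   <- Xtilde + out$sparse         # add back the estimated outlier matrix S

# Xhat should approximate the centered data used in the fit.
max(abs(Xhat - scale(X, center = TRUE, scale = FALSE)))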
Author(s)

N. Benjamin Erichson, Peng Zheng, and Sasha Aravkin
References

[1] N. B. Erichson, P. Zheng, K. Manohar, S. Brunton, J. N. Kutz, and A. Y. Aravkin. "Sparse Principal Component Analysis via Variable Projection." Submitted to IEEE Journal of Selected Topics in Signal Processing (2018). Available at arXiv: https://arxiv.org/abs/1804.00341.
Examples

# Create artificial data
m <- 10000
V1 <- rnorm(m, 0, 290)
V2 <- rnorm(m, 0, 300)
V3 <- -0.1*V1 + 0.1*V2 + rnorm(m,0,100)
X <- cbind(V1,V1,V1,V1, V2,V2,V2,V2, V3,V3)
X <- X + matrix(rnorm(length(X),0,1), ncol = ncol(X), nrow = nrow(X))
# Compute SPCA
out <- robspca(X, k=3, alpha=1e-3, beta=1e-5, gamma=5, center = TRUE, scale = FALSE, verbose=0)
print(out)
summary(out)
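As a follow-up to the example above, one can contrast the sparse loadings with the dense loadings of ordinary PCA. This is a hedged sketch; it reuses X and out from the example and relies only on base R's prcomp.

# Ordinary PCA loadings are generally dense ...
pca <- prcomp(X, center = TRUE, scale. = FALSE)
round(pca$rotation[, 1:3], 3)

# ... whereas the robust SPCA loadings contain many exact zeros.
round(out$loadings, 3)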