svd.als: fit a low-rank SVD to a complete matrix

Description
Fit a low-rank SVD to a complete matrix by alternating orthogonal ridge regression. Special sparse-matrix classes are available for very large matrices, including "SparseplusLowRank" versions for row- and column-centered sparse matrices.
Usage

svd.als(x, rank.max = 2, lambda = 0, thresh = 1e-05, maxit = 100,
  trace.it = FALSE, warm.start = NULL, final.svd = TRUE)

Arguments
x: An m by n matrix. Large matrices can be in "sparseMatrix" format, as well as "SparseplusLowRank". The latter arise after centering sparse matrices, for example with biScale.
rank.max: The maximum rank for the solution. This is also the dimension of the left and right matrices of orthogonal singular vectors. rank.max should be no bigger than min(dim(x)).
lambda: The regularization parameter.
thresh: Convergence threshold, measured as the relative change in the Frobenius norm between two successive estimates.
maxit: Maximum number of iterations.
trace.it: With trace.it=TRUE, convergence progress is reported.
warm.start: An svd object can be supplied as a warm start. If the solution requested has higher rank than the warm start, the additional subspace is initialized with random Gaussians and then orthogonalized with respect to the rest (see the sketch after this argument list).
final.svd: Although in theory this algorithm converges to the solution of a nuclear-norm regularized low-rank matrix approximation problem, with potentially some singular values equal to zero, in practice only near-zeros are achieved. With final.svd=TRUE, this final step does one more iteration with lambda set to zero, followed by soft-thresholding of the singular values, so that exact zeros are delivered.
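The following is a small usage sketch of warm.start (not from the original page): a lower-rank fit is reused to initialize a higher-rank one, so only the extra subspace needs the random initialization described above. The data and parameter values here are illustrative.

library(softImpute)
set.seed(1)
x = matrix(rnorm(100 * 50), 100, 50)
fit10 = svd.als(x, rank.max = 10, lambda = 5)
# request a higher rank, warm-started from the rank-10 solution;
# only the 5 extra dimensions are initialized at random and orthogonalized
fit15 = svd.als(x, rank.max = 15, lambda = 5, warm.start = fit10)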
Details

This algorithm solves the problem

    min ||X - M||_F^2 + λ ||M||_*

subject to rank(M) ≤ r, where ||M||_* is the nuclear norm of M (the sum of its singular values). It achieves this by solving the related problem

    min ||X - AB'||_F^2 + (λ/2)(||A||_F^2 + ||B||_F^2)

subject to rank(A) = rank(B) ≤ r. The solution is a rank-restricted, soft-thresholded SVD of X.
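The alternating ridge-regression idea can be sketched as follows for a fully observed X. This is an illustrative sketch under stated assumptions, not the package implementation: it assumes lambda > 0 and that the criterion is scaled so that the fitted singular values are those of X soft-thresholded at lambda, as the Examples below show, and it uses plain dense linear algebra.

# illustrative sketch of alternating ridge regressions (not the softImpute code);
# at convergence A %*% t(B) has singular values pmax(svd(X)$d - lambda, 0),
# restricted to rank r
als_sketch = function(X, r, lambda, maxit = 100, thresh = 1e-5) {
  B = matrix(rnorm(ncol(X) * r), ncol(X), r)                   # random start
  A = X %*% B %*% solve(crossprod(B) + lambda * diag(r))
  for (it in seq_len(maxit)) {
    Aold = A
    B = t(X) %*% A %*% solve(crossprod(A) + lambda * diag(r))  # ridge step for B
    A = X %*% B %*% solve(crossprod(B) + lambda * diag(r))     # ridge step for A
    if (norm(A - Aold, "F") < thresh * norm(Aold, "F")) break  # relative change
  }
  s = svd(A %*% t(B))                                          # svd of the fit
  list(u = s$u[, 1:r, drop = FALSE], d = s$d[1:r], v = s$v[, 1:r, drop = FALSE])
}

With the x and lambda of the Examples below, als_sketch(x, 25, 50)$d should be close to pmax(svd(x)$d[1:25] - 50, 0).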
Value

An svd object is returned, with components "u", "d", and "v".

u: An m by rank.max matrix with the left orthogonal singular vectors.

d: A vector of length rank.max of soft-thresholded singular values.

v: An n by rank.max matrix with the right orthogonal singular vectors.
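As a usage note (not from the original page), the fitted low-rank matrix can be reconstructed from these three components; here fit is assumed to be the result of a call such as the one in the Examples.

# d * t(v) scales the rows of t(v), so this equals u %*% diag(d) %*% t(v)
Mhat = fit$u %*% (fit$d * t(fit$v))
dim(Mhat)   # m by n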
Author(s)

Trevor Hastie and Rahul Mazumder

Maintainer: Trevor Hastie <hastie@stanford.edu>
References

Rahul Mazumder, Trevor Hastie and Rob Tibshirani (2010). Spectral Regularization Algorithms for Learning Large Incomplete Matrices. Journal of Machine Learning Research 11, 2287-2322. https://web.stanford.edu/~hastie/Papers/mazumder10a.pdf
See Also

biScale, softImpute, Incomplete, lambda0, impute, complete
Examples

# create a matrix and run the algorithm
library(softImpute)
set.seed(101)
n=100
p=50
J=25
np=n*p
# noisy rank-25 matrix
x=matrix(rnorm(n*J),n,J)%*%matrix(rnorm(J*p),J,p)+matrix(rnorm(np),n,p)/5
fit=svd.als(x,rank=25,lambda=50)
# the fitted singular values match the soft-thresholded singular values of x
fit$d
pmax(svd(x)$d-50,0)
# now create a sparse matrix and do the same
nnz=trunc(np*.3)
inz=sample(seq(np),nnz,replace=FALSE)
i=row(x)[inz]
j=col(x)[inz]
x=rnorm(nnz)
xS=sparseMatrix(x=x,i=i,j=j)
fit2=svd.als(xS,rank=20,lambda=7)
# again compare with the soft-thresholded singular values
fit2$d
pmax(svd(as.matrix(xS))$d-7,0)
Example output
Loading required package: Matrix
Loaded softImpute 1.4
[1] 71.321854 68.329217 62.521930 53.109888 47.263661 34.334098 31.475900
[8] 29.007029 24.555327 18.094476 16.027366 13.750123 7.635721 1.869545
[15] 0.000000
[1] 71.321854 68.329217 62.521930 53.109888 47.263661 34.334098 31.475900
[8] 29.007029 24.555327 18.094476 16.027366 13.750123 7.635721 1.869545
[15] 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000
[22] 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000
[29] 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000
[36] 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000
[43] 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000
[50] 0.000000
[1] 2.9007188 2.2274769 1.9361988 1.4107149 1.3201686 0.8740902 0.6968810
[8] 0.5930401 0.3545692 0.1514878 0.0000000 0.0000000 0.0000000 0.0000000
[15] 0.0000000 0.0000000 0.0000000 0.0000000 0.0000000 0.0000000
[1] 2.9007188 2.2274769 1.9361988 1.4107149 1.3201686 0.8740902 0.6968810
[8] 0.5930401 0.3545692 0.1514878 0.0000000 0.0000000 0.0000000 0.0000000
[15] 0.0000000 0.0000000 0.0000000 0.0000000 0.0000000 0.0000000 0.0000000
[22] 0.0000000 0.0000000 0.0000000 0.0000000 0.0000000 0.0000000 0.0000000
[29] 0.0000000 0.0000000 0.0000000 0.0000000 0.0000000 0.0000000 0.0000000
[36] 0.0000000 0.0000000 0.0000000 0.0000000 0.0000000 0.0000000 0.0000000
[43] 0.0000000 0.0000000 0.0000000 0.0000000 0.0000000 0.0000000 0.0000000
[50] 0.0000000
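As an optional follow-up (not part of the original example), the agreement shown above can be checked programmatically, continuing from the objects created in the Examples; the tolerance is an assumption, chosen loosely above the convergence threshold.

# the leading soft-thresholded singular values should agree with fit2$d
all.equal(fit2$d, pmax(svd(as.matrix(xS))$d-7,0)[seq_along(fit2$d)], tolerance=1e-3)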