Description

Implements matrix decomposition by the stochastic gradient descent optimization popularized by Simon Funk to minimize the error on the known values.
Usage

funkSVD(x, k = 10, gamma = 0.015, lambda = 0.001,
  min_improvement = 1e-06, min_epochs = 50, max_epochs = 200,
  verbose = FALSE)
Arguments

x                a matrix, potentially containing NAs.
k                number of features (i.e., rank of the approximation).
gamma            regularization term.
lambda           learning rate.
min_improvement  required minimum improvement per iteration.
min_epochs       minimum number of iterations per feature.
max_epochs       maximum number of iterations per feature.
verbose          show progress.
Details

Funk SVD decomposes a matrix (with missing values) into two components, U and V. The singular values are folded into these matrices, and the approximation of the original matrix can be obtained by R = UV'.

The predict function of this implementation folds in new data rows by estimating their u vectors using gradient descent and then calculating the reconstructed complete matrix r for these users via r = uV'.
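The feature-by-feature stochastic gradient descent that this kind of decomposition uses can be sketched in plain R. This is a minimal illustration under assumed settings (the helper name funk_sgd_sketch, the toy matrix, and all constants are hypothetical), not the package's implementation; the names lambda (learning rate) and gamma (regularization) follow the arguments above:

```r
## Minimal sketch of Funk-style SGD training (illustration only, not the
## package code). lambda = learning rate, gamma = regularization term.
funk_sgd_sketch <- function(x, k = 2, lambda = 0.01, gamma = 0.02, epochs = 500) {
  obs <- which(!is.na(x), arr.ind = TRUE)       # indices of the known values
  U <- matrix(0.1, nrow(x), k)
  V <- matrix(0.1, ncol(x), k)
  for (f in seq_len(k)) {                       # Funk-style: one feature at a time
    for (e in seq_len(epochs)) {
      for (o in seq_len(nrow(obs))) {
        i <- obs[o, 1]; j <- obs[o, 2]
        err <- x[i, j] - sum(U[i, ] * V[j, ])   # error on a known value only
        u <- U[i, f]; v <- V[j, f]
        U[i, f] <- u + lambda * (err * v - gamma * u)
        V[j, f] <- v + lambda * (err * u - gamma * v)
      }
    }
  }
  list(U = U, V = V)
}

## Toy matrix with missing entries; column 3 is entirely unknown.
m <- matrix(c(5, 3, NA, 1,
              4, NA, NA, 1,
              1, 1, NA, 5), nrow = 3, byrow = TRUE)
fit <- funk_sgd_sketch(m)
r <- tcrossprod(fit$U, fit$V)                   # reconstruct R = UV'
```

After training, the reconstruction r is close to m on the known entries, while the entirely unobserved column is filled in from the learned factors.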
Value

An object of class "funkSVD" with components:

U           the U matrix.
V           the V matrix.
parameters  a list with parameter values.
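The fold-in performed for new rows, described above, can be sketched with V held fixed: estimate u for the new row by gradient descent on its known entries, then reconstruct the complete row as r = uV'. A minimal sketch (the helper fold_in_sketch and the toy factor matrix are illustrative assumptions, not the package code):

```r
## Fold in one new row: V stays fixed, only u is learned (illustration only).
fold_in_sketch <- function(newrow, V, lambda = 0.01, gamma = 0.02, epochs = 1000) {
  u <- rep(0.1, ncol(V))
  known <- which(!is.na(newrow))
  for (e in seq_len(epochs)) {
    for (j in known) {
      err <- newrow[j] - sum(u * V[j, ])   # error on a known entry
      u <- u + lambda * (err * V[j, ] - gamma * u)
    }
  }
  drop(u %*% t(V))                          # reconstructed complete row r = uV'
}

V <- matrix(c(1, 0,
              0, 1,
              1, 1), nrow = 3, byrow = TRUE)  # 3 items, k = 2 (toy factors)
r <- fold_in_sketch(c(2, 3, NA), V)           # the third rating is predicted
```

The two known ratings are reproduced (up to the slight shrinkage introduced by gamma), and the missing third entry is predicted from the learned u and the fixed V.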
Note

The code is based on the implementation in package rrecsys by Ludovik Coba and Markus Zanker.
References

Y. Koren, R. Bell, and C. Volinsky. Matrix Factorization Techniques for Recommender Systems. IEEE Computer, pp. 42-49, August 2009.
Examples

### this takes a while to run
## Not run:
data("Jester5k")
train <- as(Jester5k[1:100], "matrix")
fsvd <- funkSVD(train, verbose = TRUE)
### reconstruct the rating matrix as R = UV'
### and calculate the root mean square error on the known ratings
r <- tcrossprod(fsvd$U, fsvd$V)
rmse(train, r)
### fold in new users for matrix completion
test <- as(Jester5k[101:105], "matrix")
p <- predict(fsvd, test, verbose = TRUE)
rmse(test, p)
## End(Not run)
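The rmse() helper called in the example is not shown in this excerpt; a simple version that computes the root mean square error over the known (non-NA) ratings, consistent with the example's comments, might look like this (an assumption, the original helper may differ):

```r
## Hypothetical RMSE helper over the known (non-NA) entries; the helper
## actually used in the example above is not shown in this excerpt.
rmse <- function(true, pred) {
  known <- !is.na(true)
  sqrt(mean((true[known] - pred[known])^2))
}
```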