View source: R/sparse_coding.R
sparse_coding | R Documentation |
An implementation of Sparse Coding with Dictionary Learning. Given a dataset, this will decompose the dataset into a sparse combination of a few dictionary elements, where the dictionary is learned during computation; a dictionary can be reused for future sparse coding of new points.
sparse_coding(
atoms = NA,
initial_dictionary = NA,
input_model = NA,
lambda1 = NA,
lambda2 = NA,
max_iterations = NA,
newton_tolerance = NA,
normalize = FALSE,
objective_tolerance = NA,
seed = NA,
test = NA,
training = NA,
verbose = getOption("mlpack.verbose", FALSE)
)
atoms |
Number of atoms in the dictionary. Default value "15" (integer). |
initial_dictionary |
Optional initial dictionary matrix (numeric matrix). |
input_model |
Input sparse coding model (SparseCoding). |
lambda1 |
Sparse coding l1-norm regularization parameter. Default value "0" (numeric). |
lambda2 |
Sparse coding l2-norm regularization parameter. Default value "0" (numeric). |
max_iterations |
Maximum number of iterations for sparse coding (0 indicates no limit). Default value "0" (integer). |
newton_tolerance |
Tolerance for convergence of Newton method. Default value "1e-06" (numeric). |
normalize |
If set, the input data matrix will be normalized before coding. Default value "FALSE" (logical). |
objective_tolerance |
Tolerance for convergence of the objective function. Default value "0.01" (numeric). |
seed |
Random seed. If 0, 'std::time(NULL)' is used. Default value "0" (integer). |
test |
Optional matrix to be encoded by trained model (numeric matrix). |
training |
Matrix of training data (X) (numeric matrix). |
verbose |
Display informational messages and the full list of parameters and timers at the end of execution. Default value "getOption("mlpack.verbose", FALSE)" (logical). |
An implementation of Sparse Coding with Dictionary Learning, which achieves sparsity via an l1-norm regularizer on the codes (LASSO) or an (l1+l2)-norm regularizer on the codes (the Elastic Net). Given a dense data matrix X with d dimensions and n points, sparse coding seeks to find a dense dictionary matrix D with k atoms in d dimensions, and a sparse coding matrix Z with n points in k dimensions.
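In this notation (points as rows, matching the Z * D reconstruction described below), the Elastic Net objective being minimized can be written in a standard form as follows; this is a common textbook formulation, and mlpack's internal constraint handling may differ in detail:

```latex
\min_{D,\,Z}\; \lVert X - Z D \rVert_F^2
  \;+\; \lambda_1 \sum_{i=1}^{n} \lVert z_i \rVert_1
  \;+\; \lambda_2 \sum_{i=1}^{n} \lVert z_i \rVert_2^2
```

where z_i is the i-th row of Z (the sparse code for point i); the atoms (rows of D) are typically constrained to have l2-norm at most 1 so that sparsity is not absorbed into the dictionary scale. Setting lambda2 = 0 recovers the pure LASSO penalty.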
The original data matrix X can then be reconstructed as Z * D. Therefore, this program finds a representation of each point in X as a sparse linear combination of atoms in the dictionary D.
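The reconstruction X = Z * D can be illustrated with plain base R; the matrices below are synthetic stand-ins for illustration, not output of sparse_coding:

```r
# Toy sizes: n = 4 points, d = 5 dimensions, k = 3 atoms.
set.seed(1)
D <- matrix(rnorm(3 * 5), nrow = 3, ncol = 5)   # dictionary: k x d, atoms in rows
Z <- matrix(0, nrow = 4, ncol = 3)              # sparse codes: n x k
# Make Z sparse: exactly one nonzero atom weight per point.
Z[cbind(1:4, sample(3, 4, replace = TRUE))] <- rnorm(4)
X <- Z %*% D                                    # reconstructed data: n x d
dim(X)
```

Each row of X is a linear combination of only the atoms whose weights in the corresponding row of Z are nonzero, which is exactly the sparse representation the program seeks.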
The sparse coding is found with an algorithm which alternates between a dictionary step, which updates the dictionary D, and a sparse coding step, which updates the sparse coding matrix.
Once a dictionary D is found, the sparse coding model may be used to encode other matrices, and saved for future usage.
To run this program, either an input matrix or an already-saved sparse coding model must be specified. An input matrix may be specified with the "training" option, along with the number of atoms in the dictionary (specified with the "atoms" parameter). It is also possible to specify an initial dictionary for the optimization, with the "initial_dictionary" parameter. An input model may be specified with the "input_model" parameter.
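For instance, a random initial dictionary with unit-norm atoms can be built in base R and passed via the "initial_dictionary" parameter. The layout assumed below (one atom per row, with as many columns as the training data has dimensions) is an assumption for illustration; check the orientation of your training matrix:

```r
set.seed(42)
d <- 10   # dimensionality of the training data (assumed)
k <- 15   # number of atoms; must agree with the "atoms" parameter
init_dict <- matrix(rnorm(k * d), nrow = k, ncol = d)
# Normalize each atom (row) to unit l2-norm.
init_dict <- init_dict / sqrt(rowSums(init_dict^2))
# Then (not run): pass it alongside the training data, e.g.
# output <- sparse_coding(training = data, atoms = k,
#                         initial_dictionary = init_dict, lambda1 = 0.1)
```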
A list with several components:
codes |
Matrix to save the output sparse codes of the test matrix ("test") to (numeric matrix). |
dictionary |
Matrix to save the output dictionary to (numeric matrix). |
output_model |
Output for trained sparse coding model (SparseCoding). |
mlpack developers
# As an example, to build a sparse coding model on the dataset "data" using
# 200 atoms and an l1-regularization parameter of 0.1, saving the model into
# "model", use
## Not run:
output <- sparse_coding(training=data, atoms=200, lambda1=0.1)
model <- output$output_model
## End(Not run)
# Then, this model could be used to encode a new matrix, "otherdata", and
# save the output codes to "codes":
## Not run:
output <- sparse_coding(input_model=model, test=otherdata)
codes <- output$codes
## End(Not run)