View source: R/minimizeAutoencoder.R
Description

This function trains a DArch autoencoder network with the conjugate gradient method.
Usage

  minimizeAutoencoder(darch, trainData, targetData,
    cg.length = getParameter(".cg.length"),
    dropout = getParameter(".darch.dropout"),
    dropConnect = getParameter(".darch.dropout.dropConnect"),
    matMult = getParameter(".matMult"), debugMode = getParameter(".debug"),
    ...)
Arguments

darch: An instance of the DArch class.

trainData: The training data matrix.

targetData: The labels for the training data.

cg.length: Number of line searches to perform during conjugate gradient minimization.

dropout: Dropout rate; see the darch documentation.

dropConnect: Whether to use DropConnect instead of dropout; see the darch documentation.

matMult: Matrix multiplication function, internal parameter.

debugMode: Whether debug mode is enabled, internal parameter.

...: Further parameters.
Details

This function is built on code by G. Hinton et al. (http://www.cs.toronto.edu/~hinton/MatlabForSciencePaper.html, last visited 2016-04-30) for the fine-tuning of deep belief nets. The original code is located in the files 'backpropclassify.m', 'CG_MNIST.m' and 'CG_CLASSIFY_INIT.m'.
It implements fine-tuning for a classification network with backpropagation, using a direct translation to R of the minimize function by C. E. Rasmussen (available at http://www.gatsby.ucl.ac.uk/~edward/code/minimize/, last visited 2016-04-30).
minimizeAutoencoder supports dropout, but it does not use the weight update function defined via the darch.weightUpdateFunction parameter of darch, so weight decay, momentum, etc. are not supported.
Value

The trained DArch object.
See Also

Other fine-tuning functions: backpropagation, minimizeClassifier, rpropagation
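Examples

A minimal usage sketch, not the package's official example. It assumes the darch() constructor of the darch package, which accepts minimizeAutoencoder as its fine-tuning function and forwards cg.length to it; layer sizes, epoch counts, and the use of the built-in iris data are illustrative only.

```r
library(darch)

# An autoencoder reconstructs its input, so the inputs double as targets.
x <- as.matrix(scale(iris[, 1:4]))

# Sketch: train a small 4-3-4 autoencoder, fine-tuned with
# minimizeAutoencoder via darch's fine-tuning hook. cg.length
# (number of line searches) is forwarded to minimizeAutoencoder.
model <- darch(x, x,
               layers = c(4, 3, 4),
               darch.isClass = FALSE,
               darch.fineTuneFunction = "minimizeAutoencoder",
               cg.length = 2,
               darch.numEpochs = 5)
```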