Description Usage Arguments Details Value See Also Examples
View source: R/minimizeClassifier.R
Description

This function trains a DArch classifier network with the conjugate
gradient method.
Usage

minimizeClassifier(darch, trainData, targetData,
  cg.length = getParameter(".cg.length"),
  cg.switchLayers = getParameter(".cg.length"),
  dropout = getParameter(".darch.dropout"),
  dropConnect = getParameter(".darch.dropout.dropConnect"),
  matMult = getParameter(".matMult"), debugMode = getParameter(".debug"),
  ...)
Arguments

darch
  An instance of the class DArch.

trainData
  The training data matrix.

targetData
  The labels for the training data.

cg.length
  Number of line searches.

cg.switchLayers
  Indicates when to train the full network instead of only the upper two
  layers.

dropout
  See the darch.dropout parameter of darch.

dropConnect
  See the darch.dropout.dropConnect parameter of darch.

matMult
  Matrix multiplication function, internal parameter.

debugMode
  Whether debug mode is enabled, internal parameter.

...
  Further parameters.
Details

This function is built on the basis of the code from G. Hinton et al.
(http://www.cs.toronto.edu/~hinton/MatlabForSciencePaper.html - last visit
2016-04-30) for the fine-tuning of deep belief nets. The original code is
located in the files 'backpropclassify.m', 'CG_MNIST.m' and
'CG_CLASSIFY_INIT.m'.

It implements the fine-tuning for a classification network with
backpropagation, using a direct translation to R of the minimize function
by C. E. Rasmussen (available at
http://www.gatsby.ucl.ac.uk/~edward/code/minimize/ - last visit 2016-04-30).
The parameter cg.switchLayers switches between two training modes. As in
the original code, the top two layers can be trained alone until the
current epoch reaches cg.switchLayers; afterwards the entire network is
trained.
minimizeClassifier supports dropout, but it does not use the weight update
function defined via the darch.weightUpdateFunction parameter of darch, so
weight decay, momentum etc. are not supported.
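
The following R sketch merely illustrates the two-stage schedule described
above; the epoch loop and the variable names are made up for illustration
(the exact boundary condition may differ in the implementation) and are not
code from the package.

## Illustrative sketch only: train the top two layers alone during the
## early epochs, then the full network once cg.switchLayers is reached.
cg.switchLayers <- 5

for (epoch in 1:10)
{
  layersToTrain <- if (epoch < cg.switchLayers) "top two layers" else "all layers"
  message(sprintf("Epoch %2d: conjugate gradient on %s", epoch, layersToTrain))
}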
Value

The trained DArch object.
See Also

Other fine-tuning functions: backpropagation, minimizeAutoencoder,
rpropagation
Examples
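The example code from the original page is not reproduced here. Below is a
minimal, untested sketch of how minimizeClassifier might be selected as the
fine-tuning function via darch(); apart from cg.length and cg.switchLayers,
the parameter names and values used (layers, darch.unitFunction,
darch.fineTuneFunction, darch.numEpochs) are assumptions about the darch
interface rather than content taken from this page.

## Not run: (untested sketch; parameter names other than cg.length and
## cg.switchLayers are assumed from the darch interface)
library(darch)
data(iris)

model <- darch(Species ~ ., iris,
  layers = c(4, 20, 3),                        # assumed layer sizes for iris
  darch.unitFunction = c("sigmoidUnit", "softmaxUnit"),
  darch.fineTuneFunction = "minimizeClassifier",
  darch.numEpochs = 10,
  cg.length = 2,        # number of line searches per conjugate gradient call
  cg.switchLayers = 5)  # train only the top two layers during early epochs

predictions <- predict(model, newdata = iris)  # assumed predict method
## End(Not run)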