Description

For a given data set (the training set), this function adjusts the neural network's weights and biases so that the network approximates the relationships among the variables present in the training set. The trained network may serve several purposes, e.g. fitting non-linear functions.
Arguments

net: Neural network to train.
P: Training set input values.
T: Training set output values.
Pval: Validation set input values for optional early stopping.
Tval: Validation set output values for optional early stopping.
error.criterium: Criterion used to measure the goodness of fit: "LMS", "LMLS", or "TAO".
Stao: Initial value of the S parameter used by the TAO algorithm.
report: Logical value indicating whether the training function should keep quiet or should instead provide graphical/written information during the training process.
n.shows: Number of times to report (if report is TRUE). The total number of training epochs is n.shows times show.step.
show.step: Number of epochs to train without interruption before the training function is allowed to report.
prob: Vector with the probability of each sample, used to apply resampled training.
n.threads: Number of threads to spawn for the BATCH* training methods. If the value is less than 1, NumberProcessors - 1 threads are spawned. If OpenMP is not available, this argument is ignored.
Value

This function returns a list with two elements: the trained neural network object, with weights and biases adjusted by the adaptive backpropagation with momentum method, and a matrix with the errors obtained during training. If a validation set is provided, the early stopping technique is applied.
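To make the arguments and the returned list concrete, here is a minimal sketch. It assumes this page documents the train() function of the AMORE package and that the companion functions newff() (network constructor) and sim() (network simulation), as well as the returned element names net and Merror, behave as in that package; adjust the names if your setup differs.

library(AMORE)

## Toy regression problem: approximate the non-linear target y = x^2.
P      <- matrix(seq(-1, 1, length.out = 200), ncol = 1)       # training inputs
target <- P^2                                                  # training outputs (the T argument)
Pval   <- matrix(seq(-0.95, 0.95, length.out = 50), ncol = 1)  # validation inputs
Tval   <- Pval^2                                               # validation outputs

## Hypothetical network: 1 input, 3 tansig hidden neurons, 1 linear output,
## trained with adaptive gradient descent with momentum.
net <- newff(n.neurons = c(1, 3, 1),
             learning.rate.global = 1e-2,
             momentum.global      = 0.5,
             error.criterium      = "LMS",
             hidden.layer         = "tansig",
             output.layer         = "purelin",
             method               = "ADAPTgdwm")

## 5 reports x 100 epochs each = 500 training epochs in total.
## Supplying Pval/Tval enables the optional early-stopping check.
result <- train(net, P, target,
                Pval = Pval, Tval = Tval,
                error.criterium = "LMS",
                report = TRUE,
                show.step = 100, n.shows = 5)

trained <- result$net     # trained network (element name assumed)
errors  <- result$Merror  # matrix of training errors (element name assumed)

## Simulate the trained network on the training inputs.
y <- sim(trained, P)

Switching error.criterium to "TAO" would additionally require an initial Stao value, while prob and n.threads only apply to resampled training and to the BATCH* training methods, respectively.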
Author(s)

Manuel Castejón Limas. manuel.castejon@gmail.com
Joaquin Ordieres Meré. j.ordieres@upm.es
Ana González Marcos. ana.gonzalez@unirioja.es
Alpha V. Pernía Espinoza. alpha.pernia@unirioja.es
Francisco Javier Martinez de Pisón. fjmartin@unirioja.es
Fernando Alba Elías. fernando.alba@unavarra.es
References

Pernía Espinoza, A.V., Ordieres Meré, J.B., Martínez de Pisón, F.J., González Marcos, A. TAO-robust backpropagation learning algorithm. Neural Networks, Vol. 18, Issue 2, pp. 191–204, 2005.

Simon Haykin. Neural Networks: A Comprehensive Foundation. Prentice Hall, New Jersey, 2nd edition, 1999. ISBN 0-13-273350-1.