backpropagation_MLP: backpropagation_MLP function


View source: R/MLP.R

Description

A function to perform backpropagation for a multilayer perceptron.

Usage

backpropagation_MLP(MLPNet, loss, truth)

Arguments

MLPNet

output from the function MLP_net, as applied to some data with the given parameters

loss

the loss function, see ?Qloss and ?multinomial

truth

the ground truth: a list of vectors to compare with the output of the feed-forward network

Value

a list object containing the cost and the gradients of the cost with respect to each of the model parameters
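
Conceptually, the returned cost and per-parameter gradients are what backpropagation computes for any feed-forward network. A minimal, self-contained sketch for a one-hidden-layer perceptron with logistic activations and quadratic loss (independent of deepNN; all names here are illustrative, not part of the package):

```r
# Minimal backpropagation sketch: one hidden layer, logistic activations,
# quadratic (squared-error) loss. Returns the cost and the gradient of the
# cost with respect to each weight matrix and bias vector.
sigmoid <- function(z) 1 / (1 + exp(-z))

backprop_sketch <- function(x, y, W1, b1, W2, b2) {
  # forward pass
  z1 <- W1 %*% x + b1; a1 <- sigmoid(z1)
  z2 <- W2 %*% a1 + b2; a2 <- sigmoid(z2)
  cost <- 0.5 * sum((a2 - y)^2)            # quadratic loss

  # backward pass: deltas propagate from the output layer inwards
  d2 <- (a2 - y) * a2 * (1 - a2)           # output-layer delta
  d1 <- (t(W2) %*% d2) * a1 * (1 - a1)     # hidden-layer delta

  list(cost   = cost,
       gradW2 = d2 %*% t(a1), gradb2 = d2,
       gradW1 = d1 %*% t(x),  gradb1 = d1)
}

set.seed(1)
x  <- matrix(rnorm(3), 3, 1); y  <- matrix(c(0, 1), 2, 1)
W1 <- matrix(rnorm(6), 2, 3); b1 <- matrix(0, 2, 1)
W2 <- matrix(rnorm(4), 2, 2); b2 <- matrix(0, 2, 1)
g  <- backprop_sketch(x, y, W1, b1, W2, b2)
g$cost  # scalar cost; g also holds one gradient per parameter
```

backpropagation_MLP plays the analogous role for networks built with MLP_net, with the loss supplied as a function (see ?Qloss and ?multinomial) rather than hard-coded.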

References

  1. Ian Goodfellow, Yoshua Bengio and Aaron Courville. Deep Learning. MIT Press (2016)

  2. Terrence J. Sejnowski. The Deep Learning Revolution (The MIT Press). (2018)

  3. Neural Networks YouTube playlist by 3Blue1Brown: https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi

  4. http://neuralnetworksanddeeplearning.com/

See Also

network, train, backprop_evaluate, MLP_net, backpropagation_MLP, logistic, ReLU, smoothReLU, ident, softmax, Qloss, multinomial, NNgrad_test, weights2list, bias2list, biasInit, memInit, gradInit, addGrad, nnetpar, nbiaspar, addList, no_regularisation, L1_regularisation, L2_regularisation


deepNN documentation built on March 13, 2020, 2:24 a.m.