addGrad: addGrad function

View source: R/miscellaneous.R


addGrad function

Description

A function to add two gradients together, where each gradient is expressed as a nested list.

Usage

addGrad(x, y)

Arguments

x

a gradient list object, as used in network training via backpropagation

y

a gradient list object, as used in network training via backpropagation

Value

a gradient list object with the same structure as the inputs, containing the sum of x and y
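
Because gradients are stored as nested lists, adding two of them amounts to a recursive, element-wise sum of the corresponding numeric components. The following is a minimal sketch of that idea in plain R, not the package's implementation; the toy gradient structure used here is hypothetical.

# Minimal sketch (not the deepNN implementation): recursively add two
# gradients stored as nested lists whose leaves are numeric arrays.
add_nested <- function(x, y) {
  if (is.list(x)) {
    mapply(add_nested, x, y, SIMPLIFY = FALSE)  # descend into matching sub-lists
  } else {
    x + y  # leaves: element-wise addition of vectors/matrices
  }
}

# Hypothetical two-layer gradient objects
g1 <- list(dW = list(matrix(1, 2, 2), matrix(1, 1, 2)), db = list(c(1, 1), 1))
g2 <- list(dW = list(matrix(2, 2, 2), matrix(3, 1, 2)), db = list(c(2, 2), 4))
add_nested(g1, g2)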

References

  1. Ian Goodfellow, Yoshua Bengio and Aaron Courville. Deep Learning. MIT Press (2016).

  2. Terrence J. Sejnowski. The Deep Learning Revolution. MIT Press (2018).

  3. Neural Networks YouTube playlist by 3Blue1Brown: https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi

  4. Michael Nielsen. Neural Networks and Deep Learning: http://neuralnetworksanddeeplearning.com/

See Also

network, train, backprop_evaluate, MLP_net, backpropagation_MLP, logistic, ReLU, smoothReLU, ident, softmax, Qloss, multinomial, NNgrad_test, weights2list, bias2list, biasInit, memInit, gradInit, addGrad, nnetpar, nbiaspar, addList, no_regularisation, L1_regularisation, L2_regularisation
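
In training code, gradient addition of this kind typically appears when per-example gradients are summed over a mini-batch before a weight update. The sketch below illustrates that pattern under stated assumptions: the per-example gradients are hypothetical flat lists, sum_grad is a stand-in for addGrad, and in actual deepNN usage the gradients would come from backprop_evaluate.

# Hedged sketch of the mini-batch accumulation pattern in which a
# function like addGrad is used; sum_grad stands in for addGrad and
# the gradient lists are hypothetical.
sum_grad <- function(x, y) mapply(`+`, x, y, SIMPLIFY = FALSE)

# Four hypothetical per-example gradients for a toy single-layer network
batch_grads <- lapply(1:4, function(i) list(W = matrix(rnorm(6), 2, 3), b = rnorm(2)))

total <- Reduce(sum_grad, batch_grads)                           # accumulated gradient
mean_grad <- lapply(total, function(a) a / length(batch_grads))  # average used for the update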

