MLP_net: MLP_net function

View source: R/MLP.R

MLP_net R Documentation

MLP_net function

Description

A function to define a multilayer perceptron and compute quantities for backpropagation, if needed.

Usage

MLP_net(input, weights, bias, dims, nlayers, activ, back = TRUE, regulariser)

Arguments

input

input data, a list of vectors (i.e. a ragged array)

weights

a list object containing weights for the forward pass, see ?weights2list

bias

a list object containing biases for the forward pass, see ?bias2list

dims

the dimensions of the network, as returned by a call to the function network, see ?network

nlayers

the number of layers, as returned by a call to the function network, see ?network

activ

a list of activation functions, as returned by a call to the function network, see ?network

back

logical, whether to compute quantities for backpropagation (set to FALSE for feed-forward use only)

regulariser

type of regularisation strategy to use, see ?train, ?no_regularisation, ?L1_regularisation, ?L2_regularisation
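Putting the arguments together, a call might look like the following sketch. Only the MLP_net signature above comes from this page; the objects net, w, b and dat, the list elements net$dims, net$nlayers and net$activ, and the call no_regularisation() are assumptions about how network(), weights2list() and bias2list() package their results.

```r
## Hypothetical usage sketch -- object names and list-element names
## (net$dims, net$nlayers, net$activ) are assumptions, not taken from
## this help page; see ?network, ?weights2list, ?bias2list, ?train.
library(deepNN)

## 'net' assumed built by network(), 'w'/'b' by weights2list()/bias2list(),
## 'dat' a list of input vectors as described under 'input' above.
out <- MLP_net(input       = dat,
               weights     = w,
               bias        = b,
               dims        = net$dims,
               nlayers     = net$nlayers,
               activ       = net$activ,
               back        = FALSE,               # feed-forward only
               regulariser = no_regularisation()) # assumed constructor call
```

With back = FALSE the returned list contains only the forward-pass evaluation; setting back = TRUE additionally stores the per-layer quantities needed by backpropagation.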

Value

a list object containing the evaluated forward pass and, if back = TRUE, the quantities needed for backpropagation.
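To make the forward pass concrete, here is a minimal base-R sketch of the computation MLP_net performs for each layer (an affine transform followed by an elementwise activation). This is an illustration only, not the deepNN implementation; the function and object names below (forward_pass, logistic_fn, W, b) are invented for the example.

```r
## Minimal illustration of a multilayer-perceptron forward pass.
## NOT the deepNN implementation -- names here are hypothetical.
logistic_fn <- function(x) 1 / (1 + exp(-x))

forward_pass <- function(input, weights, bias, nlayers, activ) {
  a <- input
  for (l in seq_len(nlayers)) {
    z <- weights[[l]] %*% a + bias[[l]]  # affine transform for layer l
    a <- activ[[l]](z)                   # elementwise activation
  }
  a
}

## Example: 3 inputs -> 4 hidden units -> 2 outputs
set.seed(1)
W <- list(matrix(rnorm(12), 4, 3), matrix(rnorm(8), 2, 4))
b <- list(rnorm(4), rnorm(2))
forward_pass(c(0.5, -1, 2), W, b, nlayers = 2,
             activ = list(logistic_fn, logistic_fn))
```

When quantities for backpropagation are requested, an implementation additionally retains each layer's pre-activation z and activation a, since the backward pass reuses them when computing gradients.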

References

  1. Ian Goodfellow, Yoshua Bengio, Aaron Courville. Deep Learning. MIT Press (2016)

  2. Terrence J. Sejnowski. The Deep Learning Revolution (The MIT Press). (2018)

  3. Neural Networks YouTube playlist by 3Blue1Brown: https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi

  4. http://neuralnetworksanddeeplearning.com/

See Also

network, train, backprop_evaluate, MLP_net, backpropagation_MLP, logistic, ReLU, smoothReLU, ident, softmax, Qloss, multinomial, NNgrad_test, weights2list, bias2list, biasInit, memInit, gradInit, addGrad, nnetpar, nbiaspar, addList, no_regularisation, L1_regularisation, L2_regularisation


deepNN documentation built on Aug. 25, 2023, 5:14 p.m.