mlp: Multi-Layer Perceptron


View source: R/simpleNN.R

Description

mlp generates a multi-layer perceptron (MLP) that you can then train on data.

Usage

mlp(structNet, minibatchSize, activation, initPos = FALSE, initScale = 100)

Arguments

structNet

Vector indicating the number of nodes in each layer of the network. E.g. c(100,70,40,10) would be a network with 100 input nodes, two hidden layers with 70 and 40 neurons respectively, and an output layer with 10 neurons.

minibatchSize

Number of samples used to estimate the gradient in each update.

activation

Activation function for the neural network. Must operate elementwise on a matrix and accept a parameter deriv, a boolean indicating whether the derivative should be calculated instead.

initPos

Boolean indicating whether the weights should be initialised as positive.

initScale

Scalar for initialising the weights, e.g. if it is 100, then the randomly sampled initial weights are scaled by 1/100.
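The activation contract described above can be sketched with a small example. The sigmoid below is purely illustrative (it is not part of mnistr; the package's own mnistr::reLU follows the same shape): it works elementwise on a matrix and returns the derivative when deriv = TRUE.

```r
# Hypothetical activation satisfying the contract: elementwise on
# matrices, with a `deriv` flag that switches to the derivative.
sigmoid <- function(x, deriv = FALSE) {
  s <- 1 / (1 + exp(-x))
  if (deriv) s * (1 - s) else s  # elementwise derivative of the sigmoid
}

m <- matrix(c(0, 1, -1, 2), nrow = 2)
sigmoid(m)               # elementwise activations
sigmoid(m, deriv = TRUE) # elementwise derivatives, same dimensions
```

Any function with this signature can be passed as the activation argument of mlp.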

Value

Environment with the functions to train the network, a forward-propagation function, and a list with all the layers. The forward-propagation function can be used to make predictions.

Examples

testMLP <- mlp(c(10,10,10), 5, mnistr::reLU, TRUE, 1000) # 10-10-10 net, minibatches of 5, positive weights scaled by 1/1000
testMLP$network[[1]]$W$getter() # Check random weights

gumeo/mnistr documentation built on May 17, 2019, 9:27 a.m.