Layer: Fully connected layer for a neural network


View source: R/simpleNN.R

Description

Layer encapsulates all the data needed for a fully connected layer.

Usage

Layer(activation, minibatchSize, sizeP, is_input = FALSE, is_output = FALSE,
  initPos, initScale)

Arguments

activation

Activation function for the layer. It must operate elementwise on a matrix and accept a boolean parameter deriv indicating whether the elementwise derivative should be returned instead of the activation itself; see the sketch at the end of this Arguments section.

minibatchSize

Number of samples used for estimating the gradient.

sizeP

Vector of two values: the number of inputs to this layer and the number of outputs from this layer, ignoring bias values.

is_input

Boolean indicating whether this is the input layer.

is_output

Boolean indicating whether this is the output layer.

initPos

Boolean indicating whether the weights should be initialized to positive values.

initScale

Scalar for initialising the weights, e.g. if it is 100, the randomly sampled initial weights are scaled by 1/100.
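
The activation argument expects a function that works elementwise on a matrix and takes a deriv flag. Below is a minimal sketch of a compatible activation, written as a plain ReLU for illustration (the package's own mnistr::reLU is used in the Examples); the return convention when deriv = TRUE is an assumption, taken here to be the elementwise derivative.

myReLU <- function(x, deriv = FALSE) {
  if (deriv) {
    (x > 0) * 1   # assumed contract: elementwise derivative of the activation
  } else {
    pmax(x, 0)    # elementwise ReLU, preserves matrix dimensions
  }
}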

Value

Environment with functions to set all the internal matrices and a function to forward-propagate through the layer.

Examples

testLayer <- Layer(mnistr::reLU, 3, c(10, 10), FALSE, FALSE, TRUE, 1000)
testLayer$W$getter() # Check random weights
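
For readability, the same call can be written with named arguments matching the Usage signature above (a sketch that only restates the positional call):

testLayer <- Layer(activation = mnistr::reLU, minibatchSize = 3, sizeP = c(10, 10),
                   is_input = FALSE, is_output = FALSE, initPos = TRUE, initScale = 1000)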
