Sequential: Sequential neural networks


Description

Set up a sequential (uni-directional) deep feed-forward neural network. Sequential acts as a wrapper around an ordered list of modules.

Super class

neuralnetr::ClassModule -> Sequential

Public fields

loss

The loss module

modules

The list of current modules

Methods

Public methods

Method new()

Set up the neural network from a list of modules and a loss module.

Usage
Sequential$new(modules, loss)
Arguments
modules

list of modules

loss

the loss module


Method sgd()

Train the neural network using stochastic gradient descent.

Usage
Sequential$sgd(X, Y, iters = 100, lrate = 0.005, verbose = F, seed = 1)
Arguments
X

the input X (an m x b matrix)

Y

the Y (target) input

iters

number of iterations.

lrate

the learning rate.

verbose

whether to print progress during training.

seed

random seed.

Returns

cumulative loss for every iteration.
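As a hedged sketch (not taken from the package docs), the returned per-iteration losses can be used to monitor convergence. This assumes the module constructors and the xor_example dataset shown in the Examples section below:

```r
## Sketch: train on the bundled XOR data with plain SGD and plot the
## cumulative loss returned by sgd() (assumes neuralnetr is attached).
library(neuralnetr)

nn = Sequential$new(list(
  Linear$new(2, 10), ReLU$new(),
  Linear$new(10, 2), SoftMax$new()),
  NLL$new())

data(xor_example)
losses = nn$sgd(xor_example$X, xor_example$Y, iters = 1000, lrate = 0.05)
plot(losses, type = "l", xlab = "iteration", ylab = "cumulative loss")
```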


Method mini_gd()

Train the neural network using minibatch gradient descent.

Usage
Sequential$mini_gd(
  X,
  Y,
  iters = 100,
  lrate = 0.005,
  K = 5,
  verbose = F,
  seed = 1
)
Arguments
X

the input X (an m x b matrix)

Y

the Y (target) input

iters

number of iterations.

lrate

the learning rate.

K

the size of the minibatch.

verbose

whether to print progress during training.

seed

random seed.

Returns

cumulative loss for every iteration.


Method forward()

Compute the predicted output Ypred by a forward pass through all modules.

Usage
Sequential$forward(Xt)
Arguments
Xt

the input at time t


Method backward()

Backpropagate the error through the modules, updating each module's dLdW and dLdW0.

Usage
Sequential$backward(delta)
Arguments
delta

the backpropagated error


Method sgd_step()

Perform one gradient descent step in every module.

Usage
Sequential$sgd_step(lrate)
Arguments
lrate

the learning rate


Method classify()

Predict the class labels of the input.

Usage
Sequential$classify(X)
Arguments
X

input X
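A hedged usage sketch, assuming nn is a Sequential object already trained as in the Examples section and that classify() returns one predicted label per column of X (this return shape is an assumption, not stated in the docs):

```r
## Sketch: inspect predictions on the training data after fitting.
## `nn` is assumed to be a trained Sequential (see Examples).
preds = nn$classify(xor_example$X)
table(preds)  # distribution of predicted labels
```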


Method print_accuarcy()

Print accuracy during training.

Usage
Sequential$print_accuarcy(it, X, Y, cur_loss, every = 250)
Arguments
it

iteration number

X

data X

Y

target Y

cur_loss

the current loss

every

how often (in iterations) feedback should be printed.


Method clone()

The objects of this class are cloneable with this method.

Usage
Sequential$clone(deep = FALSE)
Arguments
deep

Whether to make a deep clone.

Note

Note that delta can refer to dLdA or dLdZ over the course of the backward for loop, depending on the module m.
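The alternation the Note describes can be illustrated with a self-contained base-R sketch (hypothetical toy modules, not the package's real Linear/ReLU classes): delta starts as the loss gradient and, walking the modules in reverse, its meaning flips between dLdA and dLdZ depending on which module's backward() produced it.

```r
## Toy backward pass: each module's backward() maps the incoming delta to
## the delta for the previous module. After a Linear backward the value is
## dLdA of the layer below; after an activation backward it is dLdZ.
modules = list(
  list(name = "Linear", backward = function(delta) delta * 2),
  list(name = "ReLU",   backward = function(delta) delta * 1),
  list(name = "Linear", backward = function(delta) delta * 2)
)

delta = 1  # initial gradient from the loss module (dLdZ)
for (m in rev(modules)) {
  delta = m$backward(delta)  # meaning of delta depends on module m
}
delta
```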

See Also

Other architecture: BatchNorm, Linear

Examples

## Not run: 
## classification problem
nn = Sequential$new(list(
  Linear$new(2, 10), ReLU$new(),
  Linear$new(10, 10), Tanh$new(),
  Linear$new(10, 2), SoftMax$new()),
  NLL$new())

data(xor_example)
X = xor_example$X
Y = xor_example$Y

nn$mini_gd(X, Y, 2500, 0.05, K = 2)

## End(Not run)

frhl/neuralnetr documentation built on Nov. 9, 2020, 2:24 p.m.