Description
Setup a sequential (uni-directional) deep feed forward neural network. The Sequential is used as a wrapper for the modules.
Super class

neuralnetr::ClassModule -> Sequential
Public fields

loss: the loss module.

modules: the list of current modules.
Methods

Method new(): set up the neural network.

Usage: Sequential$new(modules, loss)

Arguments:
  modules: the list of modules.
  loss: the loss module.
Method sgd(): train the neural network using stochastic gradient descent.

Usage: Sequential$sgd(X, Y, iters = 100, lrate = 0.005, verbose = FALSE, seed = 1)

Arguments:
  X: the input matrix (m x b).
  Y: the target matrix.
  iters: the number of iterations.
  lrate: the learning rate.
  verbose: whether to print results during training.
  seed: the random seed.

Returns: the cumulative loss for every iteration.
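A minimal usage sketch. Only Sequential, Linear, and BatchNorm are confirmed by this page; Tanh$new(), Sigmoid$new(), and NLL$new() below are hypothetical placeholders for the package's activation and loss modules, so check the package index for the actual constructor names:

```r
library(neuralnetr)

# Toy data: m = 2 features, b = 4 examples, with examples stored
# as columns (the m x b convention used above).
X <- matrix(c(1, 1,  1, -1,  -1, 1,  -1, -1), nrow = 2)
Y <- matrix(c(1, 0, 0, 1), nrow = 1)

# Hypothetical activation (Tanh, Sigmoid) and loss (NLL) modules.
nn <- Sequential$new(
  modules = list(Linear$new(2, 4), Tanh$new(),
                 Linear$new(4, 1), Sigmoid$new()),
  loss = NLL$new()
)

# Train with stochastic gradient descent; sgd() returns the
# cumulative loss for every iteration.
losses <- nn$sgd(X, Y, iters = 100, lrate = 0.005, verbose = FALSE, seed = 1)
```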
Method mini_gd(): train the neural network using minibatch gradient descent.

Usage: Sequential$mini_gd(X, Y, iters = 100, lrate = 0.005, K = 5, verbose = FALSE, seed = 1)

Arguments:
  X: the input matrix (m x b).
  Y: the target matrix.
  iters: the number of iterations.
  lrate: the learning rate.
  K: the minibatch size.
  verbose: whether to print results during training.
  seed: the random seed.

Returns: the cumulative loss for every iteration.
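Given a network nn built with Sequential$new() and data matrices X (m x b) and Y, a minibatch run is a sketch along these lines (the plot() call is just one way to inspect the returned losses):

```r
# Minibatch gradient descent drawing batches of K = 5 columns;
# mini_gd() returns the cumulative loss for every iteration.
losses <- nn$mini_gd(X, Y, iters = 200, lrate = 0.005, K = 5,
                     verbose = FALSE, seed = 1)

# Inspect the training curve.
plot(losses, type = "l", xlab = "iteration", ylab = "cumulative loss")
```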
Method forward(): compute the prediction Ypred by passing the input forward through each module in turn.

Usage: Sequential$forward(Xt)

Arguments:
  Xt: the input at time t.
Method backward(): backpropagate the error through the modules in reverse order, updating dLdW and dLdW0.

Usage: Sequential$backward(delta)

Arguments:
  delta: the backpropagated error.
Method sgd_step(): perform one gradient descent step in each module.

Usage: Sequential$sgd_step(lrate)

Arguments:
  lrate: the learning rate.
Method classify(): classify the labels of the input.

Usage: Sequential$classify(X)

Arguments:
  X: the input X.
Method print_accuarcy(): print accuracy during training.

Usage: Sequential$print_accuarcy(it, X, Y, cur_loss, every = 250)

Arguments:
  it: the iteration number.
  X: the data X.
  Y: the target Y.
  cur_loss: the current loss.
  every: how often (in iterations) feedback is printed.
Method clone(): the objects of this class are cloneable with this method.

Usage: Sequential$clone(deep = FALSE)

Arguments:
  deep: whether to make a deep clone.
Note

Note that over the course of the for loop in backward(), delta can refer to either dLdA or dLdZ, depending on the module m.
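As a rough sketch (not the package source), the loop the note refers to might look like the following, with each module mapping the gradient of the loss with respect to its output to the gradient with respect to its input:

```r
# Hypothetical re-implementation of the backward pass, for illustration.
backward_sketch <- function(modules, delta) {
  # Walk the modules in reverse. An activation module's backward()
  # converts dLdA into dLdZ; a linear module's backward() converts
  # dLdZ into dLdA of the layer below. Hence the meaning of delta
  # alternates with the module m.
  for (m in rev(modules)) {
    delta <- m$backward(delta)
  }
  delta
}
```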
See Also

Other architecture modules: BatchNorm, Linear