Linear: Linear modules


Description

A linear module implements a linear transformation:

Z = W^T A + W_0

specified by a weight matrix W and a bias vector W0. Each linear module has a forward method that takes in a batch of activations A (from the previous layer) and returns a batch of pre-activations Z.

Each linear module has a backward method that takes in dLdZ and returns dLdA. This module also computes and stores dLdW and dLdW0, the gradients of the loss with respect to the weights and bias.

Super class

neuralnetr::ClassModule -> Linear

Methods


Method new()

Initialize the weights.

Usage
Linear$new(m, n)
Arguments
m

the input dimension m of the module (rows of the activation matrix A).

n

the output dimension n of the module (rows of the pre-activation matrix Z).
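As an illustration of the shapes involved, here is a hand-built weight matrix and bias vector with the names W and W0 taken from the description above. This is a sketch only: the package's actual initialization scheme is an assumption here (small Gaussian weights and a zero bias are a common choice).

```r
# Sketch: build an m x n weight matrix and an n x 1 bias vector by hand.
# The scaling by 1/sqrt(m) is illustrative, not the package's scheme.
m <- 3; n <- 2
set.seed(1)
W  <- matrix(rnorm(m * n) / sqrt(m), nrow = m, ncol = n)  # m x n weights
W0 <- matrix(0, nrow = n, ncol = 1)                       # n x 1 bias
```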


Method forward()

Perform one forward pass.

Usage
Linear$forward(A)
Arguments
A

the input activation matrix A (m x b), one column per example in the batch.

Returns

Z, the pre-activation matrix (n x b).


Method backward()

Perform one backward pass, propagating gradients.

Usage
Linear$backward(dLdZ)
Arguments
dLdZ

the derivative of the loss with respect to Z (n x b)

Returns

dLdA (m x b)
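For Z = W^T A + W0, the chain rule gives dLdA = W dLdZ, dLdW = A t(dLdZ), and dLdW0 as the row sums of dLdZ. A sketch with plain matrices (names mirror the documentation, not the package internals):

```r
# Sketch of the backward pass for a linear module.
m <- 3; n <- 2; b <- 4
W    <- matrix(0.1, m, n)   # m x n weights
A    <- matrix(1, m, b)     # stored input activations (m x b)
dLdZ <- matrix(1, n, b)     # incoming gradient (n x b)

dLdA  <- W %*% dLdZ         # m x b, returned to the previous layer
dLdW  <- A %*% t(dLdZ)      # m x n, stored weight gradient
dLdW0 <- rowSums(dLdZ)      # length-n stored bias gradient
```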


Method sgd_step()

Update the weights using stochastic gradient descent.

Usage
Linear$sgd_step(lrate)
Arguments
lrate

learning rate
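A vanilla SGD step scales the stored gradients by the learning rate and subtracts them from the parameters. A sketch, assuming gradients dLdW and dLdW0 have already been computed by backward():

```r
# Sketch of the SGD update: theta <- theta - lrate * dL/dtheta.
lrate <- 0.01
W     <- matrix(0.1, 3, 2);  dLdW  <- matrix(1, 3, 2)
W0    <- matrix(0, 2, 1);    dLdW0 <- matrix(1, 2, 1)

W  <- W  - lrate * dLdW
W0 <- W0 - lrate * dLdW0
```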


Method clone()

The objects of this class are cloneable with this method.

Usage
Linear$clone(deep = FALSE)
Arguments
deep

Whether to make a deep clone.

See Also

Other architecture: BatchNorm, Sequential


frhl/neuralnetr documentation built on Nov. 9, 2020, 2:24 p.m.