doChainRule: Back-propagation of derivative calculation

Description Usage Arguments Value See Also

Description

The getGradientFunction closure creates a gradient function specific to a given vertex. That gradient function takes as its argument the vector of weights on the edges coming into that vertex from its parents. When this weight vector is varied, the predictions at the observed vertices change, and therefore the output of the cost function changes; the gradient function captures that rate of change. It relies on back-propagation, computing the rate of change for every downstream vertex affected by varying the weight vector. doChainRule performs that back-propagation calculation for a single element of the weight vector.
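The repeated rate-of-change calculation described above is an application of the chain rule. A minimal numeric sketch of that idea, using made-up example functions rather than the package's actual model structure: for a composition cost(node(w)), the derivative with respect to w is cost'(node(w)) * node'(w), which can be checked against a finite-difference approximation.

```r
# Illustrative only: g_fun and f_fun are hypothetical stand-ins for a
# downstream node value and the cost function, not signalgraph internals.
g_fun <- function(w) w^2      # downstream node value as a function of the weight
f_fun <- function(u) exp(u)   # cost as a function of that node value

# Chain rule: d cost / d w = f'(g(w)) * g'(w)
chain_deriv <- function(w) exp(w^2) * (2 * w)

# Check against a central finite-difference approximation
w <- 0.5
h <- 1e-6
numeric_deriv <- (f_fun(g_fun(w + h)) - f_fun(g_fun(w - h))) / (2 * h)
stopifnot(abs(chain_deriv(w) - numeric_deriv) < 1e-4)
```

Back-propagation applies this same rule edge by edge along every path from a weight to the cost, which is what doChainRule does for one element of the weight vector.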

Usage

doChainRule(g, v, e)

Arguments

g

a model

v

vertex index whose incoming weights are the argument for the gradient function

e

edge index of the individual element of the weight vector where the derivative is being calculated

Value

a vector corresponding to the derivative

See Also

gradientDescent, getGradientFunction


robertness/signalgraph documentation built on May 27, 2019, 10:33 a.m.