View source: R/optim.AdaGrad.R
This function uses the AdaGrad algorithm to find the minimum of a (multi-)dimensional mathematical function. The algorithm searches for the steepest descent w.r.t. each dimension separately. The 'eps' factor avoids numerical issues of dividing by 0. The 'step.size' scales the movement in each coordinate direction.
f: a (multi-)dimensional function to be optimized.
x0: the starting point of the optimization.
max.iter: the maximum number of iterations performed in the optimization.
step.size: the step size (sometimes referred to as 'learn-rate') of the optimization.
stop.grad: the stop-criterion for the gradient change.
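The per-coordinate update described above can be sketched as follows. This is an illustrative reimplementation, not the package's actual source: the argument names (f, x0, max.iter, step.size, stop.grad) mirror the documented interface, while the 'eps' default, the finite-difference gradient, and the returned list structure are assumptions.

```r
# Minimal AdaGrad sketch: accumulates squared gradients per coordinate
# and divides each step by the square root of that running sum.
adagrad.sketch <- function(f, x0, max.iter = 100L, step.size = 0.1,
                           stop.grad = 1e-6, eps = 1e-8) {
  # numerical gradient via central differences (assumed helper)
  num.grad <- function(f, x, h = 1e-6) {
    sapply(seq_along(x), function(i) {
      e <- replace(numeric(length(x)), i, h)
      (f(x + e) - f(x - e)) / (2 * h)
    })
  }
  x <- x0
  G <- numeric(length(x0))  # running sum of squared gradients
  for (iter in seq_len(max.iter)) {
    g <- num.grad(f, x)
    if (sqrt(sum(g^2)) < stop.grad) break  # gradient stop-criterion
    G <- G + g^2
    # per-coordinate step; 'eps' guards against division by zero
    x <- x - step.size * g / (sqrt(G) + eps)
  }
  list(par = x, value = f(x), iterations = iter)
}

# Usage: minimize the quadratic f(x) = sum(x^2) from (3, -2)
res <- adagrad.sketch(function(x) sum(x^2), c(3, -2), max.iter = 500)
```

Because the accumulated sum G only grows, the effective step size shrinks over time, which is why AdaGrad decelerates on flat quadratics and may need many iterations at a small 'step.size'.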