adaGrad: Optimize mathematical function using the AdaGrad algorithm


View source: R/optim.AdaGrad.R

Description

This function uses the AdaGrad algorithm to find the minimum of a (multi-) dimensional mathematical function. The algorithm searches for the steepest descent w.r.t. each dimension separately. The 'eps' factor avoids the numerical issue of dividing by 0. The 'step.size' scales the movement in each coordinate direction.
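
The per-dimension update described above can be sketched as follows. This is a minimal illustration assuming a central-difference numerical gradient and an 'eps' of 1e-8; the names numGrad and adaGradSketch are hypothetical and this is not the package's actual implementation:

numGrad <- function(f, x, h = 1e-6) {
  # central-difference gradient, one coordinate at a time
  vapply(seq_along(x), function(i) {
    e <- replace(numeric(length(x)), i, h)
    (f(x + e) - f(x - e)) / (2 * h)
  }, numeric(1))
}

adaGradSketch <- function(f, x0, max.iter = 100, step.size = 0.1,
                          stop.grad = .Machine$double.eps, eps = 1e-8) {
  x <- x0
  G <- numeric(length(x0))  # accumulated squared gradients, per dimension
  for (iter in seq_len(max.iter)) {
    g <- numGrad(f, x)
    if (sqrt(sum(g^2)) < stop.grad) break   # gradient (nearly) zero: stop
    G <- G + g^2                            # per-coordinate accumulation
    x <- x - step.size * g / (sqrt(G) + eps)  # 'eps' guards the division by 0
  }
  x
}

Because G grows monotonically, the effective step size in each dimension shrinks over time, which is the characteristic behavior of AdaGrad.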

Usage

adaGrad(
  f,
  x0,
  max.iter = 100,
  step.size = 0.1,
  stop.grad = .Machine$double.eps
)

Arguments

f

a (multi-) dimensional function to be optimized.

x0

the starting point of the optimization.

max.iter

the maximum number of iterations performed in the optimization.

step.size

the step size (sometimes referred to as the 'learning rate') of the optimization.

stop.grad

the stopping criterion for the gradient: iteration halts once the gradient falls below this value.
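
A short usage sketch on a convex quadratic follows; the exact return structure of adaGrad is not documented here, so treat the call as illustrative:

f <- function(x) sum(x^2)   # minimum at c(0, 0)
adaGrad(f, x0 = c(3, -2), max.iter = 500, step.size = 0.5)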

