Use gradient descent to find local minima


`fp`: function representing the derivative of the function to minimize

`x`: an initial estimate of the minimum

`h`: the step size

`tol`: the error tolerance

`m`: the maximum number of iterations

Gradient descent can be used to find local minima of functions. Starting from the initial guess `x`, it repeatedly steps in the direction opposite the derivative `fp`, scaled by the step size `h`, and returns an approximation of the minimum once the change falls below the error tolerance `tol`. This implementation also stops after `m` iterations.
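The iteration described above can be sketched as follows. This is an illustrative Python version, not the package's R implementation; the parameter names mirror the documented arguments, and the default values shown are assumptions for the example only.

```python
def gradient_descent(fp, x, h=1e-3, tol=1e-4, m=1000):
    """Approximate a local minimum given the derivative fp.

    Defaults for h, tol, and m are illustrative assumptions.
    """
    for _ in range(m):
        x_new = x - h * fp(x)        # step against the derivative
        if abs(x_new - x) < tol:     # change is below tolerance: done
            return x_new
        x = x_new
    return x                         # stop after m iterations

# Example: minimize f(x) = (x - 3)^2, whose derivative is 2 * (x - 3).
approx = gradient_descent(lambda x: 2 * (x - 3), x=0.0, h=0.1, tol=1e-8)
# approx is close to 3, the true minimum
```

Note that too large a step size `h` can overshoot and diverge, while too small a value needs many iterations; `tol` and `m` together bound how long the loop runs.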

Returns the `x` value of the minimum found.

Other optimization functions: `bisection`, `goldsect`, `hillclimbing`, `newton`, `sa`, `secant`

