steep_descent | R Documentation
Function minimization by steepest descent.
steep_descent(x0, f, g = NULL, info = FALSE,
              maxiter = 100, tol = .Machine$double.eps^(1/2))
x0: start value.
f: function to be minimized.
g: gradient function of f.
info: logical; shall information be printed on every iteration?
maxiter: maximum number of iterations.
tol: relative tolerance, to be used as stopping rule.
Steepest descent is a line search method that, at each iteration, moves along the direction of the negative gradient, i.e. the locally steepest downhill direction.
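The sketch below illustrates the general scheme with a simple backtracking (Armijo) line search; it is an assumption about how such a method typically proceeds, not the internal implementation of steep_descent.

## minimal steepest-descent sketch with backtracking line search
## (illustration of the general technique only)
sd_sketch <- function(x0, f, g, maxiter = 100, tol = 1e-8) {
    x <- x0
    niter <- 0
    for (k in 1:maxiter) {
        niter <- k
        d <- -g(x)                          # steepest-descent direction
        if (sqrt(sum(d^2)) < tol) break     # gradient nearly zero: converged
        a <- 1                              # trial step length
        ## backtrack: halve the step until sufficient decrease (Armijo rule)
        while (f(x + a * d) > f(x) - 0.5 * a * sum(d^2) &&
               a > .Machine$double.eps) {
            a <- a / 2
        }
        x <- x + a * d
    }
    list(xmin = x, fmin = f(x), niter = niter)
}
## e.g. sd_sketch(rep(1, 10), function(x) sum(x^2), function(x) 2 * x)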
List with the following components:
xmin: minimum solution found.
fmin: value of f at the minimum.
niter: number of iterations performed.
Based on Matlab code as described in the book “Applied Numerical Analysis Using Matlab” by L. V. Fausett.
Nocedal, J., and S. J. Wright (2006). Numerical Optimization. Second Edition, Springer-Verlag, New York, pp. 22 ff.
See also: fletcher_powell.
## Rosenbrock function: its flat valley makes it infeasible for a
## steepest descent approach.
# rosenbrock <- function(x) {
# n <- length(x)
# x1 <- x[2:n]
# x2 <- x[1:(n-1)]
# sum(100*(x1-x2^2)^2 + (1-x2)^2)
# }
# steep_descent(c(1, 1), rosenbrock)
# Warning message:
# In steep_descent(c(1, 1), rosenbrock) :
#   Maximum number of iterations reached -- not converged.
## Sphere function
sph <- function(x) sum(x^2)
steep_descent(rep(1, 10), sph)
# $xmin
# [1] 0 0 0 0 0 0 0 0 0 0
# $fmin
# [1] 0
# $niter
# [1] 2
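The returned list can be used directly; continuing the sphere example, the components documented above are accessed as follows.

res <- steep_descent(rep(1, 10), sph)
res$xmin    # minimizer found
res$fmin    # function value at the minimizer
res$niter   # number of iterations performed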