linesch_sw: strong Wolfe line search


Description

Strong Wolfe line search with cubic interpolation. It can be used in BFGS by setting strongwolfe = 1, but for nonsmooth functions this is not recommended; use the weak Wolfe line search instead. It is recommended for use with CG, where the strong Wolfe condition is needed for the convergence analysis.

Usage

linesch_sw(fn, gr, x0, d, f0 = fn(x0), grad0 = gr(x0), c1 = 0, c2 = 0.5, fvalquit = -Inf, prtlevel = 0)

Arguments

fn

A function to be minimized. fn(x) takes as input a vector of parameters over which minimization is to take place. fn() returns a scalar.

gr

A function to return the gradient for fn(x).

x0

initial point

d

search direction

f0

fn(x0)

grad0

gr(x0)

c1

Wolfe parameter for the sufficient decrease condition

c2

Wolfe parameter for the condition on the directional derivative (curvature condition)

fvalquit

quit if f gets below this value.

prtlevel

prints messages if this is 1
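
The parameters c1 and c2 control the two conditions that the returned step length is tested against. The sketch below is not part of the package and the helper name wolfe_ok is purely illustrative; it writes the conditions out explicitly, using the strong, absolute-value form of the curvature condition as given in Nocedal and Wright:

# Illustrative sketch: the two Wolfe conditions for a candidate step alpha.
# g0d is the directional derivative at x0; it must be negative for d to be
# a descent direction.
wolfe_ok <- function(fn, gr, x0, d, alpha, c1 = 0, c2 = 0.5) {
  f0  <- fn(x0)
  g0d <- sum(gr(x0) * d)
  x   <- x0 + alpha * d
  suff_decrease <- fn(x) <= f0 + c1 * alpha * g0d         # governed by c1
  curvature     <- abs(sum(gr(x) * d)) <= c2 * abs(g0d)   # governed by c2
  c(sufficient_decrease = suff_decrease, curvature = curvature)
}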

Value

returns a list containing the following fields:

alpha

steplength satisfying Wolfe conditions

x

x0 + alpha*d

f

f(x0 + alpha d)

grad

(grad f)(x0 + alpha d)

fail

0 if both Wolfe conditions are satisfied or f(alpha) < fvalquit; 1 if one or both Wolfe conditions are not satisfied but an interval was found bracketing a point where both are satisfied; -1 if no such interval was found (the function may be unbounded below)

nsteps

number of steps taken in lszoom

Author(s)

Abhirup Mallik, Hans Werner Borchers

References

Jorge Nocedal and Stephen J. Wright (2006). Numerical Optimization, 2nd edition. Springer.

Examples

fr <- function(x) {   ## Rosenbrock Banana function
  x1 <- x[1]
  x2 <- x[2]
  100 * (x2 - x1 * x1)^2 + (1 - x1)^2
}
grr <- function(x) { ## Gradient of 'fr'
  x1 <- x[1]
  x2 <- x[2]
  c(-400 * x1 * (x2 - x1 * x1) - 2 * (1 - x1),
    200 *      (x2 - x1 * x1))
}

res <- linesch_sw(fr, grr, c(-1.2, 1), c(1, 1))
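
The returned list can then be inspected through the fields documented in the Value section (a usage sketch, assuming only those fields):

res$alpha   # step length satisfying the Wolfe conditions
res$x       # x0 + alpha*d
res$f       # function value at the accepted point
res$grad    # gradient at the accepted point
res$fail    # 0 on success; 1 or -1 otherwise (see Value)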
