Description

An implementation of the Hooke-Jeeves algorithm for derivative-free optimization.
This is a slight adaptation of hjk() from package dfoptim.

Usage

hjkMpfr(par, fn, control = list(), ...)
Arguments

par
Starting vector of parameter values. The initial vector may lie on the boundary.

fn
Nonlinear objective function that is to be optimized: a scalar function that takes a real vector as argument and returns a scalar, the value of the function at that point.

control
A list of control parameters; see Details below.

...
Additional arguments passed to fn.
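For illustration, a minimal sketch of how extra named arguments are forwarded through '...' to fn (the objective fsq and the target vector a are made up for this example; Rmpfr must be installed):

library(Rmpfr)
fsq <- function(x, a) sum((x - a)^2)   # 'a' arrives via '...', not via 'par'
r <- hjkMpfr(mpfr(c(0, 0), precBits = 128), fsq, a = c(1, 2))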
Details

Argument control is a list specifying changes to default values of the algorithm's control parameters.
Note that parameter names may be abbreviated as long as they are unique.
The list items are as follows:
tol
Convergence tolerance. Iteration is terminated when the step length of the main loop becomes smaller than tol. This does not imply that the optimum is found with the same accuracy. Default is 1e-6.

maxfeval
Maximum number of objective function evaluations allowed. Default is Inf, i.e., no restriction at all.

maximize
A logical indicating whether the objective function is to be maximized (TRUE) or minimized (FALSE). Default is FALSE.

target
A real number restricting the absolute function value. The procedure stops if this value is exceeded. Default is Inf, i.e., no restriction.

info
A logical indicating whether the step number, number of function calls, best function value, and the first component of the solution vector will be printed to the console. Default is FALSE.
If the minimization process threatens to go into an infinite loop, set either maxfeval or target.
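For example (a minimal sketch, assuming Rmpfr is loaded; the quadratic objective is made up), to minimize with a tighter tolerance, a cap on function evaluations, and progress printed to the console:

ctrl <- list(tol = 1e-10, maxfeval = 5000, info = TRUE)
str(hjkMpfr(mpfr(c(0, 0), precBits = 128), function(x) sum(x^2),
            control = ctrl))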
Value

A list with the following components:

par
Best estimate of the parameter vector found by the algorithm.

value
Value of the objective function at termination.

convergence
Logical indicating convergence (TRUE) or not (FALSE).

feval
Number of times the objective fn was evaluated.

niter
Number of iterations ("steps") in the main loop.
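A short sketch of inspecting these components (the objective function here is made up for illustration):

res <- hjkMpfr(mpfr(c(0, 0), precBits = 128), function(x) sum((x - 1)^2))
res$par     # best parameter estimate (an mpfr vector)
res$value   # objective value at res$par
res$feval   # number of function evaluations used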
Note

This algorithm is based on the Matlab code of Prof. C. T. Kelley, given in his book "Iterative Methods for Optimization". It has been implemented for package dfoptim with the permission of Prof. Kelley.

This version does not (yet) implement a cache for storing function values that have already been computed, as searching the cache makes it slower.
Author(s)

Hans W. Borchers; for Rmpfr: John Nash, June 2012. Modifications by Martin Maechler.
References

C. T. Kelley (1999). Iterative Methods for Optimization. SIAM.

Quarteroni, Sacco, and Saleri (2007). Numerical Mathematics. Springer.
See Also

Standard R's optim; optimizeR provides one-dimensional minimization methods that work with mpfr-class numbers.
Examples

## simple smooth example:
ff <- function(x) sum((x - c(2:4))^2)
str(rr <- hjkMpfr(rep(mpfr(0, 128), 3), ff, control = list(info = TRUE)))
## Hooke-Jeeves solves high-dim. Rosenbrock function {but slowly!}
rosenbrock <- function(x) {
    n <- length(x)
    sum(100*((x1 <- x[1:(n-1)])^2 - x[2:n])^2 + (x1 - 1)^2)
}
par0 <- rep(0, 10)
str(rb.db <- hjkMpfr(rep(0, 10), rosenbrock, control = list(info = TRUE)))
## rosenbrock() is quite slow with mpfr-numbers:
str(rb.M. <- hjkMpfr(mpfr(numeric(10), prec = 128), rosenbrock,
                     control = list(tol = 1e-8, info = TRUE)))
## Hooke-Jeeves does not work well on non-smooth functions
nsf <- function(x) {
    f1 <- x[1]^2 + x[2]^2
    f2 <- x[1]^2 + x[2]^2 + 10 * (-4*x[1] - x[2] + 4)
    f3 <- x[1]^2 + x[2]^2 + 10 * (-x[1] - 2*x[2] + 6)
    max(f1, f2, f3)
}
par0 <- c(1, 1)             # true min 7.2 at (1.2, 2.4)
h.d <- hjkMpfr(par0, nsf)   # fmin = 8 at xmin = (2, 2)
## and this is not at all better (but slower!):
h.M <- hjkMpfr(mpfr(c(1, 1), 128), nsf, control = list(tol = 1e-15))
## --> demo(hjkMpfr) # -> Fletcher's chebyquad function  m = n -- residuals
