These functions have been removed from the package temporarily, pending a permanent fix.
Function minimization using the GNU Scientific Library (GSL), reference
manual section 35. These functions are declared in the header file
gsl_multimin.h.
Several algorithms for finding (local) minima of functions of one or more variables are provided. All of the algorithms operate locally, in the sense that they maintain a best guess and require the function to be continuous. Apart from the Nelder-Mead algorithm, these algorithms also use the derivative.
multimin(..., prec=0.0001)
multimin.init(x, f, df=NA, fdf=NA, method=NA, step.size=NA, tol=NA)
multimin.iterate(state)
multimin.restart(state)
multimin.fminimizer.size(state)

...: In function multimin, further arguments passed on to multimin.init.

x: A starting point. These algorithms are faster with better initial guesses.

f: The function to minimize. This function must take a single (vector) argument.

df: The derivative (gradient) of f.

fdf: A function that evaluates both f and its derivative df.

method: The algorithm to use; the example below uses "conjugate-fr" (the Fletcher-Reeves conjugate gradient method). See Details for the algorithms provided.

step.size: This step size guides the algorithm in picking a good distance between points in its search.

tol: This parameter is relevant for gradient-based methods. It controls how much the gradient should flatten out in each line search. More specifically, let u(t) = f(x + st) be the function restricted to the search ray. Then a point t is tolerable if u'(t) < tol * u'(0). Higher values give more lax line searches. This parameter trades off searching intensively in the outer loop (finding search directions) against the inner loop (finding a good point in a particular direction).

prec: The stopping-rule precision parameter. For the derivative-based methods, a solution is good enough if the norm of the gradient is smaller than prec.

state: This stores all information relating to the progress of the optimization problem.
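As a hypothetical sketch of how these arguments fit together (the return shape of fdf, a list with components f and df, is an assumption, as is the exact call below; compare the Examples section):

```r
# A simple quadratic objective in two variables, minimized at (1, 2).
f   <- function(x) sum((x - c(1, 2))^2)
df  <- function(x) 2 * (x - c(1, 2))            # its gradient
fdf <- function(x) list(f = f(x), df = df(x))   # both at once (assumed shape)

# One-shot call: further arguments such as method are handed on to
# multimin.init, while prec sets the stopping precision.
state <- multimin(c(0, 0), f, df, method = "conjugate-fr", prec = 1e-6)
state$x   # should be close to c(1, 2)
```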
There are two ways to call multimin. The simple way is to call multimin directly. A more flexible way is to call multimin.init first, and then repeatedly call multimin.iterate until the guess is good enough. In addition, multimin.restart can be used with the second approach to discard accumulated information (such as curvature information) if that information turns out to be unhelpful. This is roughly equivalent to calling multimin.init with the starting point set to the current best guess.
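A sketch of this fine-control pattern (the convergence test on state$df and the periodic restart are illustrative choices, not part of the interface):

```r
# A toy objective for illustration.
f  <- function(x) sum(x^2)
df <- function(x) 2 * x

state <- multimin.init(c(3, 4), f, df, method = "conjugate-fr")
for (i in 1:1000) {
  state <- multimin.iterate(state)
  # Stop once the gradient is nearly flat (illustrative stopping rule).
  if (sqrt(sum(state$df^2)) < 1e-6) break
  # If progress stalls, discard accumulated curvature information and
  # restart from the current best guess.
  if (i %% 100 == 0) state <- multimin.restart(state)
}
state$x
```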
All of the derivative-based methods consist of iterations that pick a descent direction and conduct a line search for a better point along the ray in that direction from the current point. The Fletcher-Reeves and Polak-Ribiere conjugate gradient algorithms maintain a vector that summarizes the curvature at that point. These are useful for high-dimensional problems (e.g. more than 100 dimensions) because they do not use matrices, which become expensive to keep track of. The Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm is better for low-dimensional problems, since it also maintains an approximation of the Hessian of the function, which gives better curvature information. The steepest-descent algorithm is a naive algorithm that does not use any curvature information. The Nelder-Mead algorithm does not use derivatives.
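The descent-direction-plus-line-search structure can be sketched in plain R. This is a toy re-implementation of steepest descent for illustration only, not the GSL code; the fixed backtracking line search is an assumption:

```r
steepest_descent <- function(f, df, x, step = 1, tol = 1e-8, maxit = 1000) {
  for (i in 1:maxit) {
    g <- df(x)
    if (sqrt(sum(g^2)) < tol) break      # gradient-norm stopping rule
    s <- step
    # Backtracking line search along the descent direction -g.
    while (f(x - s * g) >= f(x) && s > .Machine$double.eps) s <- s / 2
    x <- x - s * g
  }
  x
}

# Converges to c(1, 1) for this simple quadratic.
steepest_descent(function(x) sum((x - 1)^2), function(x) 2 * (x - 1), c(0, 0))
```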
All of these functions return a state variable, which consists of the following items:

internal.state: Bureaucratic stuff for communicating with GSL.

x: The current best guess of the optimal solution.

f: The value of the function at the best guess.

df: The derivative of the function at the best guess.

is.fdf: TRUE if the algorithm is using a derivative.

code: The GSL return code from the last iteration.
The source code for the functions documented here conditionalizes on WIN32; under Windows there is a slight memory leak.
Andrew Clausen [email protected]
http://www.gnu.org/software/gsl
optim and nlm are the standard optimization functions in R.
deriv and D are the standard symbolic differentiation functions in R. Ryacas provides more extensive differentiation support using Yet Another Computer Algebra System.
numericDeriv is the standard numerical differentiation function in R. GSL can also do numerical differentiation, but no one has written an R interface yet.
multimin requires the objective function to have a single (vector) argument. unlist and relist are useful for converting between more convenient forms.
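For example, an objective written in terms of several named parameters can be adapted with the base R utilities unlist and relist (a sketch; the parameter names are made up for illustration):

```r
# An objective that naturally takes a list of named parameters.
loss <- function(p) (p$a - 1)^2 + (p$b - 2)^2

skeleton <- list(a = 0, b = 0)   # starting point, in its convenient form
x0 <- unlist(skeleton)           # flatten to the vector multimin expects

# Wrap it as a single-vector-argument function.
f <- function(x) loss(relist(x, skeleton))
f(c(1, 2))   # 0
```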
# The Rosenbrock function:
x0 <- c(-1.2, 1)
f <- function(x) (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
df <- function(x) c(-2*(1 - x[1]) + 100 * 2 * (x[2] - x[1]^2) * (-2*x[1]),
                    100 * 2 * (x[2] - x[1]^2))
# The simple way to call multimin.
state <- multimin(x0, f, df)
print(state$x)
# The fine-control way to call multimin.
state <- multimin.init(x0, f, df, method="conjugate-fr")
for (i in 1:200)
    state <- multimin.iterate(state)
print(state$x)
