Description
Performs a quasi-Newton line-search optimization with BFGS updates of the inverse Hessian. See Nocedal and Wright (2006), Chapter 3, for details.
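The method described above can be sketched in a few lines. The following is a minimal, illustrative Python/NumPy version of a BFGS quasi-Newton iteration with a backtracking (Armijo) line search; it is not the package's implementation, and all names and defaults here are assumptions made for the sketch.

```python
import numpy as np

def bfgs(obj, start, gobj=None, maxit=100, tol=1e-8):
    """Illustrative BFGS sketch: maintains an approximation B to the
    inverse Hessian, updated with the BFGS formula after each step."""
    if gobj is None:
        # fall back on forward finite differences if no gradient is given
        def gobj(x, h=1e-7):
            f0, g = obj(x), np.empty_like(x)
            for i in range(len(x)):
                e = np.zeros_like(x)
                e[i] = h
                g[i] = (obj(x + e) - f0) / h
            return g

    x = np.asarray(start, dtype=float)
    n = len(x)
    B = np.eye(n)                    # inverse-Hessian approximation
    g = gobj(x)
    for it in range(maxit):
        if np.linalg.norm(g) < tol:
            return dict(estimate=x, value=obj(x), g=g, conv=True, niter=it)
        p = -B @ g                   # quasi-Newton search direction
        # backtracking line search enforcing the Armijo condition
        t, f0 = 1.0, obj(x)
        while obj(x + t * p) > f0 + 1e-4 * t * (g @ p):
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = gobj(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:               # curvature condition; skip update otherwise
            rho = 1.0 / sy
            I = np.eye(n)
            B = (I - rho * np.outer(s, y)) @ B @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return dict(estimate=x, value=obj(x), g=g, conv=False, niter=maxit)
```

On a quadratic with an exact gradient this converges in very few iterations, since a single BFGS update recovers the true curvature.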
Arguments
obj: function to be minimised; takes the parameter vector as its first argument
gobj: (optional) gradient function of obj; finite differencing is used otherwise
funit: scaling for function values; try to pick this so that values of obj vary between -1 and 1
units: a vector of the same length as start giving coordinate units; try to pick units so that parameter values in each coordinate direction vary between -1 and 1
maxit: maximum number of iterations
tol: tolerance for the stopping criteria
stepmax: maximum relative step length; the default of 1 means no steps longer than the full Newton step
maxsubsteps: maximum number of substeps in the line search
save: if TRUE, parameter values, function values, and gradients are saved at every iteration
verbose: if TRUE, the function value and then the parameter value are printed at each iteration
digits: number of digits to print when verbose = TRUE
...: other named arguments passed to obj and gobj (if given)
start: vector of starting parameter values
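When gobj is omitted, the gradient is obtained by finite differencing, and the units argument above exists so that difference steps can be scaled sensibly per coordinate. A central-difference sketch of that idea in Python (names and the rel_step default are assumptions, not the package's internals):

```python
import numpy as np

def fd_gradient(obj, x, units=None, rel_step=1e-6):
    """Central finite-difference gradient with per-coordinate step scaling.

    Coordinates with larger natural scales (larger `units` entries) get
    proportionally larger difference steps. Illustrative sketch only."""
    x = np.asarray(x, dtype=float)
    units = np.ones_like(x) if units is None else np.asarray(units, dtype=float)
    g = np.empty_like(x)
    for i in range(len(x)):
        h = rel_step * units[i]
        e = np.zeros_like(x)
        e[i] = h
        # central difference: O(h^2) accurate, two evaluations per coordinate
        g[i] = (obj(x + e) - obj(x - e)) / (2 * h)
    return g
```

Central differences cost twice as many function evaluations as forward differences but are an order of magnitude more accurate, which matters near the optimum where the gradient is small.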
Value

a list with elements:
estimate: a vector of the optimal parameter estimates
value: value of the objective function at the optimum
g: gradient vector at the optimum
H: Hessian at the optimum, computed by finite differences
conv: TRUE if the convergence criteria were satisfied; otherwise the algorithm may not have converged on an optimum
niter: number of iterations taken
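The returned H is a finite-difference Hessian. One common way to obtain it, sketched here in Python, is to differentiate the gradient function numerically and symmetrize the result; this is an assumption about the approach, not the package's exact code:

```python
import numpy as np

def fd_hessian(gobj, x, rel_step=1e-5):
    """Finite-difference Hessian built from a gradient function.

    Each column is a forward difference of the gradient; averaging with
    the transpose restores the symmetry lost to truncation and rounding
    error. Illustrative sketch only."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    H = np.empty((n, n))
    g0 = np.asarray(gobj(x), dtype=float)
    for i in range(n):
        e = np.zeros(n)
        e[i] = rel_step
        H[:, i] = (np.asarray(gobj(x + e), dtype=float) - g0) / rel_step
    return 0.5 * (H + H.T)   # enforce symmetry
```

For a quadratic objective the gradient is linear, so this recovers the exact Hessian up to floating-point error.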