Optimization method.
Description

Conjugate gradient optimization routine. A translation and slight
modification of the Matlab code minimize.m by Carl Edward Rasmussen.
Usage

conj_grad(par, fn, gr, c1 = c2 / 2, c2 = 0.1, max_iter = 1000,
          max_line_fn = Inf, red = 1, max_alpha_mult = 10,
          abstol = .Machine$double.eps, reltol = sqrt(.Machine$double.eps),
          line_search = "r", ortho_restart = FALSE, nu = 0.1, prplus = FALSE,
          eps = .Machine$double.eps, verbose = FALSE, debug = FALSE, ...)

Arguments

par
Vector of initial numeric parameters.
fn 
Function to optimize. Should take a vector of the length of par and
return a scalar.
gr 
Gradient of fn. Should take a vector of the length of par and return a
vector of the same length.
c1 
Constant used in sufficient decrease condition. Should take a value between 0 and 1. 
c2 
Constant used in curvature condition. Should take a value between c1 and 1.
max_iter 
Maximum number of iterations to carry out the optimization. 
max_line_fn 
Maximum number of evaluations of fn and gr to carry out during each line
search.
red 
Scalar to determine initial step size guess. 
max_alpha_mult 
Maximum scale factor to use when guessing the initial step size for the next iteration. 
abstol 
Absolute tolerance. If not NULL, the optimization stops early when the
reduction in the function value falls below this value.
reltol 
Relative tolerance. If not NULL, the optimization stops early when the
relative reduction in the function value falls below this value.
line_search 
Line search type. Can be one of "r" (the Rasmussen line search translated
from the Matlab code) or "mt" (the More-Thuente line search).
You may also assign a line search function directly to this parameter. See 'Details' for more information. 
ortho_restart 
If TRUE, restart the search direction to steepest descent whenever
successive conjugate gradient directions are not sufficiently orthogonal.
nu 
If the dot product of the old and new conjugate gradient direction
(normalized with respect to inner product of the new direction) exceeds
this value, then the two directions are considered nonorthogonal and the
search direction is reset to steepest descent. Only used if ortho_restart
is TRUE.
prplus 
If TRUE, use the "PR+" update: reset the Polak-Ribiere conjugate gradient
coefficient to zero whenever it would be negative, restarting the search
direction from steepest descent.
eps 
Epsilon for avoiding numerical issues. 
verbose 
If TRUE, log information about the progress of the optimization.
debug 
If TRUE, log extra diagnostic information about the optimization.
... 
Other parameters to pass to the fn and gr functions.
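The c1 and c2 arguments parameterize the Wolfe conditions that a line
search enforces. As a rough standalone sketch of what those two conditions
check (illustrative only; wolfe_ok is an invented name, not part of the
package), for a step of size alpha along a search direction d:

```r
# Illustrative sketch of the strong Wolfe conditions that c1 and c2
# control (not part of the rconjgrad API). f is the objective, g its
# gradient, x the current point, d the search direction.
wolfe_ok <- function(f, g, x, d, alpha, c1 = 0.05, c2 = 0.1) {
  f0 <- f(x)
  d0 <- sum(g(x) * d)  # directional derivative at the start of the search
  fa <- f(x + alpha * d)
  da <- sum(g(x + alpha * d) * d)
  sufficient_decrease <- fa <= f0 + c1 * alpha * d0  # controlled by c1
  curvature <- abs(da) <= c2 * abs(d0)               # controlled by c2
  sufficient_decrease && curvature
}

# On f(x) = x^2 from x = 1, a step of 0.5 along steepest descent lands
# exactly at the minimizer, so both conditions hold
f <- function(x) x ^ 2
g <- function(x) 2 * x
wolfe_ok(f, g, x = 1, d = -g(1), alpha = 0.5)  # TRUE
```

A valid Wolfe pair requires 0 < c1 < c2 < 1, which is why the default
c1 = c2 / 2 tracks whatever c2 you choose.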
Details

The line_search parameter takes one of two string values indicating the
type of line search function desired: "r", the original line search from
the Matlab code, or "mt", the modified More-Thuente line search algorithm
implemented in MINPACK.
Alternatively, you may provide your own line search function. Doing so
requires some knowledge of internal interfaces, which are described in the
package help documentation, accessible via:

package?rconjgrad

Factory functions that generate a suitable line search function for the
existing line search methods are more_thuente and rasmussen.
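To give a flavor of the factory pattern that more_thuente and rasmussen
follow, here is a toy factory using plain Armijo backtracking. The
signature and body are invented for illustration; the interface a custom
line search must actually satisfy is described in the package help page.

```r
# Toy line search factory (illustrative only; not the interface
# rconjgrad expects). The factory captures c1 and the backtracking
# ratio rho, and returns a line search function configured with them.
make_backtracking <- function(c1 = 1e-4, rho = 0.5) {
  function(f, x, d, f0, d0, alpha = 1) {
    # Shrink alpha until the Armijo sufficient decrease condition holds
    while (f(x + alpha * d) > f0 + c1 * alpha * d0) {
      alpha <- rho * alpha
    }
    alpha
  }
}

ls_fn <- make_backtracking(c1 = 1e-4)
f <- function(x) sum(x ^ 2)
g <- function(x) 2 * x
x <- c(3, 4)
d <- -g(x)  # steepest descent direction
ls_fn(f, x, d, f0 = f(x), d0 = sum(g(x) * d))  # 0.5
```

The closure pattern keeps the optimizer's calling code free of
line-search-specific configuration, which is why c1 and c2 appear both in
the factory call and in the conj_grad call in the examples below.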
Value

List containing:

par
Optimized parameters.

value
Return value of fn for the optimized parameters.

values
Vector of return values of fn at each iteration of the optimization.

iter
Number of iterations the optimization took place over.

counts
Sublist of two values, giving the number of evaluations of fn and gr,
respectively.

nresets
Number of times the optimizer was reset to steepest descent.
Examples

# The venerable Rosenbrock Banana function
rosenbrock_banana <- list(
  fr = function(x) {
    x1 <- x[1]
    x2 <- x[2]
    100 * (x2 - x1 * x1) ^ 2 + (1 - x1) ^ 2
  },
  grr = function(x) {
    x1 <- x[1]
    x2 <- x[2]
    c(-400 * x1 * (x2 - x1 * x1) - 2 * (1 - x1),
      200 * (x2 - x1 * x1))
  })

# Default is to use Rasmussen line search with c1 = 0.05 and c2 = 0.1
res <- conj_grad(par = c(-1.2, 1),
                 fn = rosenbrock_banana$fr, gr = rosenbrock_banana$grr)

# Turning down eps, abstol and reltol to compare with Matlab result at
# http://learning.eng.cam.ac.uk/carl/code/minimize/
# But must also put an upper limit on the number of line function evaluations
# because numerical errors prevent convergence (so don't do this normally!)
res <- conj_grad(par = c(0, 0),
                 fn = rosenbrock_banana$fr, gr = rosenbrock_banana$grr,
                 eps = .Machine$double.xmin, reltol = .Machine$double.xmin,
                 abstol = .Machine$double.xmin, max_line_fn = 20)
res$par    # c(1, 1)
res$value  # 1.232595e-32
res$iter   # 19 iterations
res$counts # c(79, 79): 79 fn and 79 gr evaluations

# Use More-Thuente line search with typical CG Wolfe parameters mentioned
# in Nocedal and Wright's book on numerical optimization.
res <- conj_grad(par = c(-1.2, 1),
                 fn = rosenbrock_banana$fr, gr = rosenbrock_banana$grr,
                 line_search = "mt",
                 c1 = 1e-4, c2 = 0.1)

## Not run:
# Can pass a function to line_search if you want to write your own
# line search function. This example is the same as the previous one, but
# uses the More-Thuente factory function.
# Yes, you do have to specify c1 and c2 in two separate places. Sorry.
res <- conj_grad(par = c(-1.2, 1),
                 fn = rosenbrock_banana$fr, gr = rosenbrock_banana$grr,
                 line_search = more_thuente(c1 = 1e-4, c2 = 0.1),
                 c1 = 1e-4, c2 = 0.1)
## End(Not run)
