mla: An Algorithm for Least-squares Curve Fitting


View source: R/mla.R

Description

This algorithm provides a numerical solution to the problem of minimizing a function. It is more efficient than Gauss-Newton-like algorithms when started from points very far from the final minimum. A new convergence test (RDM) is implemented in addition to the usual stopping criteria: the algorithm stops when the gradient is small enough in the metric of the parameters, i.e. when the quantity G' H^-1 G is below a tolerance.

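As an illustrative aside (not part of the package), the gradient-based stopping quantity G' H^-1 G can be written out directly for a simple quadratic; it shrinks towards zero as the current point approaches the minimum:

```r
## Illustration only: the stopping quantity g' H^-1 g for
## f(b) = 4*(b1 - 5)^2 + (b2 - 6)^2, whose minimum is at (5, 6).
fn <- function(b) 4 * (b[1] - 5)^2 + (b[2] - 6)^2
gr <- function(b) c(8 * (b[1] - 5), 2 * (b[2] - 6))  # analytic gradient
H  <- diag(c(8, 2))                                  # constant Hessian of fn
b  <- c(5.001, 6.001)                                # a point near the minimum
crit <- drop(t(gr(b)) %*% solve(H) %*% gr(b))        # close to zero here
```

When `crit` falls below a tolerance, the gradient is "small in the metric of the parameters" and the algorithm can stop.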
Usage

mla(
  par,
  fn,
  gr = NULL,
  ...,
  hessian = FALSE,
  control = list(),
  verbose = FALSE
)

Arguments

par

A vector containing the initial values for the parameters.

fn

The function to be minimized (or maximized), whose first argument is the vector of parameters over which minimization is to take place. It should return a scalar result.

gr

A function returning the gradient of fn at a given point. If missing, a finite-difference approximation is used.

...

Further arguments to be passed to fn and gr.

hessian

Logical. If TRUE, the Hessian matrix at the solution is returned. Default value is FALSE.

control

A list of control parameters. See ‘Details’.

verbose

If TRUE, a report (parameter values, function value, convergence criterion, ...) is printed at each iteration. Default value is FALSE.

Details

Convergence criteria are strict, as they are based on the derivatives of the log-likelihood in addition to parameter and log-likelihood stability. In some cases the program may not converge before reaching the maximum number of iterations, fixed at 500. The user should then check that the parameter estimates at the last iteration are not on the boundaries of the parameter space; if they are, the identifiability of the model should be assessed. Otherwise, the program should be run again with other initial values, a higher maximum number of iterations, or less strict convergence tolerances. The control argument is a list that can supply any of the following components:
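When convergence is not reached, the advice above is to rerun with a larger iteration budget. A hedged sketch of such a call follows; the control component name maxiter is an assumption for illustration (the list of control components did not survive in this copy of the page), so check the package documentation for the actual names:

```r
## Hypothetical: raise the iteration cap via the control list.
## The component name "maxiter" is assumed, not confirmed by this page.
fn  <- function(b) 4 * (b[1] - 5)^2 + (b[2] - 6)^2
fit <- mla(par = c(100, -100), fn = fn, control = list(maxiter = 1000))
```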

Value

A list with the following elements:

Author(s)

Alessandro Gasparini (alessandro.gasparini@ki.se)

Daniel Commenges

Melanie Prague

Amadou Diakite

References

Donald W. Marquardt (1963). An algorithm for least-squares estimation of nonlinear parameters. Journal of the Society for Industrial and Applied Mathematics, 11(2):431–441.

Daniel Commenges, H. Jacqmin-Gadda, C. Proust, J. Guedj (2006). A Newton-like algorithm for likelihood maximization: the robust-variance scoring algorithm. arXiv:math/0610402v2.

Examples

### 1
### initial values
par <- c(8, 9)
### your function
fn <- function(b) {
  return(4 * (b[1] - 5)^2 + (b[2] - 6)^2)
}
## Call
test1 <- mla(par = par, fn = fn)
test1

### 2
### initial values
b <- c(3, -1, 0, 1)
### your function
f2 <- function(b) {
  return((b[1] + 10 * b[2])^2 + 5 * (b[3] - b[4])^2 + (b[2] - 2 * b[3])^4 + 10 * (b[1] - b[4])^4)
}

## Call
test2 <- mla(par = b, fn = f2)
test2
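A further sketch, not from the original help page: Example 1 revisited with an analytic gradient supplied through gr, which avoids the finite-difference approximation of the gradient:

```r
### 3 (illustrative, not in the original examples)
### same quadratic as Example 1, now with an analytic gradient
f3 <- function(b) 4 * (b[1] - 5)^2 + (b[2] - 6)^2
g3 <- function(b) c(8 * (b[1] - 5), 2 * (b[2] - 6))
## Call
test3 <- mla(par = c(8, 9), fn = f3, gr = g3)
test3
```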

ellessenne/mla documentation built on March 29, 2020, 12:20 p.m.