Generate a matrix of function derivative information.


`func`
a function for which the first (vector) argument is used as a parameter vector.

`x`
the parameter vector passed as the first argument to `func`.

`method`
the method to use for the approximation; currently only `"Richardson"` is supported (see Details).

`method.args`
arguments passed to the method. See `grad`.

`...`
any additional arguments passed to `func`.

The derivatives are calculated numerically using Richardson improvement.
Methods "simple" and "complex" are not supported in this function.
The "Richardson" method calculates a numerical approximation of the first
and second derivatives of `func` at the point `x`.
For a scalar-valued function these are the gradient vector and
Hessian matrix (see `grad` and `hessian`).
For a vector-valued function the first derivative is the Jacobian matrix
(see `jacobian`).
For the Richardson method
```
method.args=list(eps=1e-4, d=0.0001, zero.tol=sqrt(.Machine$double.eps/7e-7),
                 r=4, v=2)
```
is set as the default.
See `grad` for more details on the Richardson extrapolation parameters.
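As a sketch of overriding these defaults (assuming the `numDeriv` package, where `genD` lives, is installed), only the entries named in `method.args` change; the rest keep the default values above. The example function here is invented for illustration:

```r
# Sketch: call genD with a smaller step d and more Richardson iterations r.
# Assumes the numDeriv package is installed; method.args entries that are
# not supplied keep the defaults shown above.
library(numDeriv)

f <- function(x) c(sum(sin(x)), prod(x))   # vector-valued example function
res <- genD(f, x = c(1, 2, 3),
            method.args = list(d = 1e-5, r = 6))
res$D   # 2 rows (length of f's result), 3 + 3*4/2 = 9 columns
```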

A simple approximation to the first-order derivative with respect
to *x_i* is

*f'_i(x) = (f(x_1, …, x_i + d, …, x_n) − f(x_1, …, x_i − d, …, x_n)) / (2 d)*
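This central difference can be transcribed directly into a few lines of base R. This is a sketch of the formula itself, not numDeriv's implementation, and `fd_first` is a name invented here:

```r
# Central-difference approximation of the i-th first partial derivative,
# transcribed from the formula above. fd_first is an illustrative name,
# not part of numDeriv.
fd_first <- function(func, x, i, d = 1e-4) {
  e <- replace(numeric(length(x)), i, d)  # step of size d in coordinate i
  (func(x + e) - func(x - e)) / (2 * d)
}

f <- function(x) x[1]^2 * x[2]   # f(x1, x2) = x1^2 * x2
fd_first(f, c(2, 3), i = 1)      # analytic value: 2 * x1 * x2 = 12
```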

A simple approximation to the second-order derivative with respect
to *x_i* is

*f''_i(x) = (f(x_1, …, x_i + d, …, x_n) − 2 f(x_1, …, x_n) + f(x_1, …, x_i − d, …, x_n)) / d^2*

The second-order derivative with respect to *x_i*, *x_j* is

*f''_{i,j}(x) = (f(x_1, …, x_i + d, …, x_j + d, …, x_n) − 2 f(x_1, …, x_n) + f(x_1, …, x_i − d, …, x_j − d, …, x_n)) / (2 d^2) − (f''_i(x) + f''_j(x)) / 2*
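Transcribed into base R, the two second-derivative formulas look like this (again a sketch with invented names, not numDeriv's code):

```r
# Pure second derivative in coordinate i, from the formula above.
fd_second <- function(func, x, i, d = 1e-4) {
  e <- replace(numeric(length(x)), i, d)
  (func(x + e) - 2 * func(x) + func(x - e)) / d^2
}

# Mixed second derivative in coordinates i and j, from the formula above.
fd_cross <- function(func, x, i, j, d = 1e-4) {
  ei <- replace(numeric(length(x)), i, d)
  ej <- replace(numeric(length(x)), j, d)
  (func(x + ei + ej) - 2 * func(x) + func(x - ei - ej)) / (2 * d^2) -
    (fd_second(func, x, i, d) + fd_second(func, x, j, d)) / 2
}

f <- function(x) x[1]^2 * x[2]
fd_second(f, c(2, 3), i = 1)     # analytic: 2 * x2 = 6
fd_cross(f, c(2, 3), 1, 2)       # analytic: 2 * x1 = 4
```

Note the loss of precision from cancellation when dividing by `d^2`; this is exactly why Richardson extrapolation shrinks `d` over several iterations rather than using one tiny step.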

Richardson's extrapolation is based on these formulas, with the `d`
being reduced in the extrapolation iterations. In the code, `d` is
scaled to accommodate parameters of different magnitudes.

`genD` does `1 + r (N^2 + N)` evaluations of the function
`f`, where `N` is the length of `x`.
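The evaluation count can be checked with a counting wrapper. This is a sketch that assumes the `numDeriv` package is installed and the default `r = 4`:

```r
# Count how many times genD evaluates the target function. With the
# default r = 4 and N = length(x) = 2, the count above gives
# 1 + 4 * (2^2 + 2) = 25. Assumes the numDeriv package is installed.
library(numDeriv)

calls <- 0L
f <- function(x) { calls <<- calls + 1L; sum(sin(x)) }
invisible(genD(f, c(1, 2)))
calls
```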

A list with elements as follows:

`D`
a matrix of first- and second-order partial derivatives organized in the
same manner as Bates and Watts: the number of rows equals the length of
the result of `func`, the first p columns are the Jacobian, and the next
p(p+1)/2 columns are the lower triangle of the second derivative (which
is the Hessian for a scalar-valued `func`).

`p`
the length of `x` (the dimension of the parameter space).

`f0`
the function value at the point where the matrix `D` was calculated.

The `genD` arguments `func`, `x`, `d`, `method`, and `method.args` are
also returned in the list.
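Putting the pieces together (a sketch assuming the `numDeriv` package is installed): for a scalar-valued `func` of two parameters, `D` has one row and p + p(p+1)/2 = 5 columns:

```r
# Inspect the list returned by genD for a scalar-valued example function.
# Assumes the numDeriv package is installed.
library(numDeriv)

f <- function(x) x[1]^2 * x[2]
res <- genD(f, c(2, 3))

res$f0       # f(2, 3) = 12
res$p        # 2
dim(res$D)   # 1 row; 2 Jacobian columns + 3 lower-triangle columns = 5
```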

Linfield, G.R. and Penny, J.E.T. (1989) *Microcomputers in Numerical Analysis*. Halsted Press.

Bates, D.M. and Watts, D. (1980) "Relative Curvature Measures of Nonlinearity." *Journal of the Royal Statistical Society*, Series B, 42:1-25.

Bates, D.M. and Watts, D. (1988) *Non-linear Regression Analysis and Its Applications*. Wiley.

