# Compute gradient from residuals and Jacobian.

### Description

For a nonlinear model originally expressed as an expression of the form `lhs ~ formula_for_rhs`, assume we have a `resfn` and `jacfn` that compute the residuals and the Jacobian at a set of parameters. This routine computes the gradient, that is, `t(Jacobian) %*% residuals`.

### Usage

```
resgr(prm, resfn, jacfn, ...)
```

### Arguments

`prm` |
A parameter vector. For our example, we could use `start = c(b1 = 1, b2 = 2.345, b3 = 0.123)`. However, the names are NOT used, only the positions in the vector. |

`resfn` |
A function to compute the residuals of our model at a parameter vector. |

`jacfn` |
A function to compute the Jacobian of the residuals at a parameter vector. |

`...` |
Any data needed for computation of the residual vector from the expression `rhsexpression - lhsvar`. Note that this is the negative of the usual residual, but the sum of squares is the same. |

### Details

`resgr` calls `resfn` to compute the residuals and `jacfn` to compute the Jacobian at the parameters `prm`, using external data supplied in the dot arguments. It then computes the gradient as `t(Jacobian) %*% residuals`.
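The computation can be sketched as follows. This is an illustrative reimplementation, not the package source, and the name `resgr_sketch` is invented here:

```r
## Illustrative sketch of the gradient computation (not the package source).
## resfn must return the residual vector and jacfn the Jacobian matrix at prm.
resgr_sketch <- function(prm, resfn, jacfn, ...) {
  res <- resfn(prm, ...)         # residual vector at prm
  J <- jacfn(prm, ...)           # Jacobian of the residuals at prm
  as.numeric(crossprod(J, res))  # t(J) %*% res
}
```

`crossprod(J, res)` computes `t(J) %*% res` without explicitly forming the transpose.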

Note that it appears awkward to use this function in calls to optimization routines. The author would like to learn why.

### Value

The numeric vector with the gradient of the sum of squares at the parameters.

### Author(s)

John C Nash <nashjc@uottawa.ca>

### References

Nash, J. C. (1979, 1990) _Compact Numerical Methods for Computers: Linear Algebra and Function Minimisation._ Adam Hilger / Institute of Physics Publications.

### See Also

Function `nls()`, function `optim()`, and package `optimx`.

### Examples

```
cat("So far no examples included for resgr\n")
```
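Pending a packaged example, a hedged sketch of how `resfn` and `jacfn` might be written; the model and data below are invented for illustration:

```r
## Invented one-term exponential model, written as y ~ b1 * exp(b2 * x).
x <- c(1, 2, 3, 4, 5)
y <- c(2.7, 7.4, 20.1, 54.6, 148.4)
## Residuals follow the rhs - lhs convention described under '...'.
resfn <- function(prm, x, y) prm[1] * exp(prm[2] * x) - y
## Jacobian: one row per observation, one column per parameter.
jacfn <- function(prm, x, y) cbind(exp(prm[2] * x),
                                   prm[1] * x * exp(prm[2] * x))
prm <- c(b1 = 1, b2 = 1)
## resgr(prm, resfn, jacfn, x = x, y = y) should agree with the
## direct formula t(J) %*% res:
g <- as.numeric(crossprod(jacfn(prm, x, y), resfn(prm, x, y)))
```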
