modified_newton — R Documentation
View source: R/modified_newton.R
Implements a modified Newton-Raphson algorithm for non-linear optimization, featuring dynamic ridge adjustment of the Hessian and a backtracking line search.
modified_newton(
start,
objective,
gradient = NULL,
hessian = NULL,
lower = -Inf,
upper = Inf,
control = list(),
...
)
start | Numeric vector. Starting values for the optimization parameters.
objective | Function. The objective function to minimize.
gradient | Function (optional). Gradient of the objective function; computed numerically if not supplied.
hessian | Function (optional). Hessian matrix of the objective function; computed numerically if not supplied.
lower | Numeric vector. Lower bounds for box constraints.
upper | Numeric vector. Upper bounds for box constraints.
control | List. Control parameters, including convergence flags.
... | Additional arguments passed to the objective, gradient, and Hessian functions.
modified_newton is a line search optimization algorithm that utilizes
second-order curvature information (the Hessian matrix) to find the minimum
of an objective function.
Modified Newton vs. Trust-Region:
Unlike the dogleg and double_dogleg functions, which use a
trust-region approach to constrain the step size, this function uses a
line search approach. It first determines the Newton direction
(the solution of H \Delta x = -g) and then performs a backtracking line
search to find a step length \alpha that satisfies the sufficient decrease
(Armijo) condition.
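The backtracking step can be sketched as follows. This is an illustrative implementation, not the package's internal code; the names c1, shrink, and max_iter are assumptions, and a steepest-descent direction is used in the demo in place of the Newton direction.

```r
# Hypothetical backtracking line search satisfying the Armijo condition:
# accept alpha once f(x + alpha d) <= f(x) + c1 * alpha * g'd.
backtrack <- function(f, x, g, d, c1 = 1e-4, shrink = 0.5, max_iter = 50) {
  alpha <- 1
  f0 <- f(x)
  slope <- sum(g * d)   # directional derivative g'd (negative for descent)
  for (i in seq_len(max_iter)) {
    if (f(x + alpha * d) <= f0 + c1 * alpha * slope) break  # Armijo test
    alpha <- alpha * shrink                                 # halve the step
  }
  alpha
}

quad <- function(x) (x[1] - 2)^2 + (x[2] + 1)^2
x <- c(0, 0)
g <- c(2 * (x[1] - 2), 2 * (x[2] + 1))  # analytic gradient: c(-4, 2)
d <- -g                                 # steepest-descent direction for the demo
alpha <- backtrack(quad, x, g, d)
```

Starting from alpha = 1, the step is shrunk until the sufficient-decrease test passes, which guarantees the accepted step actually reduces the objective.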
Dynamic Ridge Adjustment:
If the Hessian matrix H is not positive definite (making it unsuitable for
Cholesky decomposition), the algorithm applies a dynamic ridge adjustment.
A diagonal matrix \tau I is added to the Hessian, where \tau is
increased until the matrix becomes positive definite. This ensures the
search direction always remains a descent direction.
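A minimal sketch of this ridge adjustment, assuming a multiplicative growth schedule for \tau (the starting value 1e-8 and growth factor 10 are assumptions, not necessarily the package's actual schedule):

```r
# Increase tau until H + tau * I admits a Cholesky factorization,
# i.e. until the shifted Hessian is positive definite.
ridge_chol <- function(H, tau0 = 1e-8, grow = 10, max_tries = 50) {
  tau <- 0
  for (i in seq_len(max_tries)) {
    R <- tryCatch(chol(H + tau * diag(nrow(H))), error = function(e) NULL)
    if (!is.null(R)) return(list(R = R, tau = tau))
    tau <- if (tau == 0) tau0 else tau * grow  # grow the ridge and retry
  }
  stop("could not make Hessian positive definite")
}

H <- matrix(c(1, 2, 2, 1), 2, 2)  # indefinite: eigenvalues 3 and -1
g <- c(1, 1)
fix <- ridge_chol(H)
# Newton direction from the factored system R'R d = -g
d <- backsolve(fix$R, forwardsolve(t(fix$R), -g))
```

Because the shifted matrix is positive definite, the resulting direction satisfies g'd < 0, i.e. it is guaranteed to be a descent direction.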
Differentiation Methods: The function allows for independent selection of differentiation methods for the gradient and Hessian:
forward: Standard forward-difference numerical differentiation.
central: Central-difference (more accurate but slower).
complex: Complex-step differentiation (highly accurate for gradients).
richardson: Richardson extrapolation via the numDeriv package.
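The first three schemes can be illustrated as below (these are textbook implementations for intuition, not the package's internals; the richardson method delegates to numDeriv and is omitted):

```r
# Forward difference: O(h) error, one extra function evaluation per coordinate.
grad_forward <- function(f, x, h = 1e-7) {
  vapply(seq_along(x), function(i) {
    e <- replace(numeric(length(x)), i, h)
    (f(x + e) - f(x)) / h
  }, numeric(1))
}
# Central difference: O(h^2) error, two evaluations per coordinate.
grad_central <- function(f, x, h = 1e-5) {
  vapply(seq_along(x), function(i) {
    e <- replace(numeric(length(x)), i, h)
    (f(x + e) - f(x - e)) / (2 * h)
  }, numeric(1))
}
# Complex step: no subtractive cancellation, accurate to machine precision,
# but requires f to accept complex input.
grad_complex <- function(f, x, h = 1e-20) {
  vapply(seq_along(x), function(i) {
    e <- replace(numeric(length(x)), i, h * 1i)
    Im(f(x + e)) / h
  }, numeric(1))
}

quad <- function(x) (x[1] - 2)^2 + (x[2] + 1)^2
grad_complex(quad, c(0, 0))  # → c(-4, 2)
```

The complex-step trick works because Im(f(x + ih)) / h equals f'(x) up to O(h^2) with no cancellation error, so h can be taken extremely small.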
A list containing optimization results and iteration metadata.
quad <- function(x) (x[1] - 2)^2 + (x[2] + 1)^2
res <- modified_newton(start = c(0, 0), objective = quad)
print(res$par)