bfgs (R Documentation)
Implements the damped BFGS Quasi-Newton algorithm with a Strong Wolfe line search for non-linear optimization, specifically tailored for SEM.
bfgs(
start,
objective,
gradient = NULL,
hessian = NULL,
lower = -Inf,
upper = Inf,
control = list(),
...
)
start
    Numeric vector. Starting values for the optimization parameters.

objective
    Function. The objective function to minimize.

gradient
    Function (optional). Gradient of the objective function.

hessian
    Function (optional). Hessian matrix of the objective function.

lower
    Numeric vector. Lower bounds for box constraints.

upper
    Numeric vector. Upper bounds for box constraints.

control
    List. Control parameters, including convergence flags.

...
    Additional arguments passed to the objective, gradient, and Hessian functions.
bfgs is a Quasi-Newton method that maintains an approximation of the
inverse Hessian matrix. It is widely considered the most robust and
efficient member of the Broyden family of optimization methods.
BFGS vs. DFP:
While both bfgs and dfp update the inverse Hessian using
rank-two formulas, BFGS is generally more tolerant of inaccuracies in the
line search. This implementation uses the Sherman-Morrison formula to
update the inverse Hessian directly, avoiding the need for matrix inversion
at each step.
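The rank-two inverse-Hessian update described above can be sketched as follows. Given a step s = x_new - x_old and the gradient change y, the standard BFGS formula is H' = (I - rho s y')H(I - rho y s') + rho s s' with rho = 1/(y's). The helper name bfgs_inv_update is illustrative, not the package's internal function:

```r
# Minimal sketch of the rank-two inverse-Hessian update (illustration only;
# bfgs_inv_update is a hypothetical name, not part of this package).
bfgs_inv_update <- function(H, s, y) {
  rho <- 1 / sum(y * s)                      # curvature scalar 1 / (y' s)
  I <- diag(length(s))
  (I - rho * s %*% t(y)) %*% H %*% (I - rho * y %*% t(s)) +
    rho * s %*% t(s)
}

s <- c(1, 0.5); y <- c(2, 1.5)               # y' s > 0, so the update is safe
Hn <- bfgs_inv_update(diag(2), s, y)
```

The updated matrix satisfies the secant condition Hn %*% y == s and stays symmetric, which is why no explicit matrix inversion is needed per iteration.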
Strong Wolfe Line Search:
To maintain the positive definiteness of the Hessian approximation and
ensure global convergence, this algorithm employs a Strong Wolfe line search.
This search identifies a step length α that satisfies both the sufficient
decrease (Armijo) condition and the curvature condition.
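As a concrete illustration, the two conditions can be checked for a candidate step length. The function name strong_wolfe_ok is hypothetical; c1 and c2 take the conventional defaults 1e-4 and 0.9:

```r
# Check the Strong Wolfe conditions for step length alpha along direction p
# (a sketch; strong_wolfe_ok is an illustrative name, not the package API).
strong_wolfe_ok <- function(f, grad, x, p, alpha, c1 = 1e-4, c2 = 0.9) {
  g0 <- sum(grad(x) * p)                 # directional derivative at x
  armijo    <- f(x + alpha * p) <= f(x) + c1 * alpha * g0
  curvature <- abs(sum(grad(x + alpha * p) * p)) <= c2 * abs(g0)
  armijo && curvature
}

f <- function(x) sum(x^2)
g <- function(x) 2 * x
x <- c(1, 1); p <- -g(x)               # steepest-descent direction
```

Here alpha = 0.25 satisfies both conditions, while alpha = 1 overshoots the minimizer and fails the Armijo test.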
Damping for Non-Convexity:
In Structural Equation Modeling (SEM), objective functions often exhibit
non-convex regions. When use_damped = TRUE, Powell's damping
strategy is applied to the update vectors to preserve the positive
definiteness of the Hessian approximation even when the curvature condition
is not naturally met.
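Powell's strategy can be sketched as follows, assuming the common threshold theta = 0.2 and a direct Hessian approximation B (the helper name powell_damp is illustrative). When s'y falls below theta * s'Bs, y is replaced by a convex combination of y and Bs, so the damped curvature s'y~ equals exactly theta * s'Bs:

```r
# Sketch of Powell's damping for the secant vector y (illustration only).
powell_damp <- function(B, s, y, theta = 0.2) {
  sBs <- as.numeric(t(s) %*% B %*% s)
  sy  <- sum(s * y)
  if (sy >= theta * sBs) return(y)           # curvature adequate; keep y
  phi <- (1 - theta) * sBs / (sBs - sy)      # Powell's mixing weight
  phi * y + (1 - phi) * as.numeric(B %*% s)  # damped secant vector
}

B <- diag(c(2, 2))
s <- c(1, 0); y <- c(-1, 0)                  # s'y < 0: update would break PD
ytil <- powell_damp(B, s, y)
```

Substituting the damped vector into the BFGS update keeps the approximation positive definite even where the objective is locally non-convex.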
A list containing optimization results and iteration metadata.
Nocedal, J., & Wright, S. J. (2006). Numerical Optimization. Springer.
Fletcher, R. (1987). Practical Methods of Optimization. Wiley.
## Minimize a simple quadratic with minimum at (2, -1)
quad <- function(x) (x[1] - 2)^2 + (x[2] + 1)^2
res <- bfgs(start = c(0, 0), objective = quad)
print(res$par)  # should be close to c(2, -1)