Introduction to optimflex

knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>"
)
library(optimflex)

Introduction

The optimflex package provides a highly flexible suite of non-linear optimization algorithms designed for robustness and numerical precision. It is particularly suited for complex models—such as those found in Structural Equation Modeling (SEM)—where convergence stability and verification are paramount.

Rigorous Convergence Control

A defining feature of optimflex is its rigorous convergence control. Instead of relying on a single, hard-coded stopping rule, optimflex allows users to select and combine up to eight distinct convergence criteria.

When multiple criteria are enabled (by setting their respective use_* flags to TRUE in the control list), the package applies a strict "AND" rule: all chosen conditions must be satisfied simultaneously before the algorithm declares success. This multi-faceted approach ensures that the solution is stable from various numerical perspectives.
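The AND rule can be illustrated with a generic sketch (this is not optimflex's internal implementation; the function and names below are hypothetical):

```r
# Generic sketch of a strict "AND" convergence rule: only criteria whose
# use_* flag is enabled participate, and all of them must pass.
check_convergence <- function(checks, enabled) {
  all(unlist(checks[enabled]))
}

# Suppose three criteria were evaluated this iteration:
checks  <- list(grad = TRUE, rel_f = TRUE, abs_x = FALSE)

check_convergence(checks, enabled = c("grad", "rel_f"))  # TRUE: all enabled pass
check_convergence(checks, enabled = c("grad", "abs_x"))  # FALSE: abs_x fails
```

Note that a criterion that is evaluated but not enabled (abs_x in the first call) has no effect on the outcome.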

Convergence Criteria and Formulas

Each criterion is managed via a logical flag (use_*) and a corresponding tolerance parameter (tol_*) within the control list.

1. Function Value Changes

These monitor the stability of the objective function $f$.
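A typical formulation (the names below follow the package's use_*/tol_* pattern but are illustrative; the exact definitions in optimflex may differ) tracks both absolute and relative change between successive iterates $x_{k-1}$ and $x_k$:

$$|f(x_k) - f(x_{k-1})| \le \mathrm{tol\_abs\_f} \quad \text{(absolute change)}$$

$$\frac{|f(x_k) - f(x_{k-1})|}{|f(x_{k-1})| + \varepsilon} \le \mathrm{tol\_rel\_f} \quad \text{(relative change)}$$

where $\varepsilon$ is a small constant guarding against division by zero near $f = 0$.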

2. Parameter Space Changes

These ensure that the parameter vector $x$ has stabilized.
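In the same spirit as the function-value checks, plausible absolute and relative forms (illustrative notation; the relative form corresponds to the use_rel_x flag used in the example below) are:

$$\|x_k - x_{k-1}\| \le \mathrm{tol\_abs\_x}, \qquad \frac{\|x_k - x_{k-1}\|}{\|x_{k-1}\| + \varepsilon} \le \mathrm{tol\_rel\_x}$$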

3. Gradient Norm

The standard measure of stationarity (first-order optimality).
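This corresponds to the use_grad/tol_grad pair used in the example below; a sufficiently small gradient norm indicates a stationary point:

$$\|\nabla f(x_k)\| \le \mathrm{tol\_grad}$$

(Implementations vary in whether the Euclidean or the infinity norm is used.)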

4. Hessian Verification
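This is a second-order check: at a genuine local minimum, the Hessian of the objective should be positive definite, distinguishing a minimum from a saddle point. One common way to express this (an illustrative formulation, not necessarily optimflex's exact test) is via the smallest eigenvalue:

$$\lambda_{\min}\!\big(\nabla^2 f(x_k)\big) > 0$$

In practice, a small positive tolerance is often used in place of zero to account for numerical error.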

5. Model-based Predicted Decrease

These check if the quadratic model of the objective function suggests significant further improvement is possible.
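Using the standard local quadratic model at $x_k$ (the tolerance name below is illustrative), convergence requires that the model predicts only a negligible decrease for the proposed step $p_k$:

$$m_k(p) = f(x_k) + \nabla f(x_k)^\top p + \tfrac{1}{2}\, p^\top \nabla^2 f(x_k)\, p$$

$$f(x_k) - m_k(p_k) \le \mathrm{tol\_pred}$$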


Basic Usage

The following example demonstrates how to minimize a simple quadratic function using the BFGS algorithm with customized convergence criteria.

# Define a simple objective function
quad_func <- function(x) {
  (x[1] - 5)^2 + (x[2] + 3)^2
}

# Run optimization
res <- bfgs(
  start = c(0, 0),
  objective = quad_func,
  control = list(
    use_grad = TRUE,
    tol_grad = 1e-6,
    use_rel_x = TRUE
  )
)

# Inspect results
res$par
res$converged

Algorithm Comparison: The Rosenbrock Function

optimflex shines when dealing with difficult landscapes like the Rosenbrock "banana" function. You can easily compare how different algorithms (e.g., Quasi-Newton vs. Trust-Region) navigate the narrow valley.

rosenbrock <- function(x) {
  100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
}

start_val <- c(-1.2, 1.0)

# Run the Double Dogleg (trust-region) algorithm
res_dd  <- double_dogleg(start_val, rosenbrock, control = list(initial_delta = 2.0))

cat("Double Dogleg Iterations:", res_dd$iter, "\n")
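For a self-contained point of reference on the quasi-Newton side, base R's stats::optim (not optimflex) can run BFGS on the same problem; this is the classic example from optim's own documentation:

```r
# Quasi-Newton baseline using base R's optim(), for comparison with the
# trust-region run above. The gradient is approximated numerically.
rosenbrock <- function(x) {
  100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
}

res_bfgs <- optim(c(-1.2, 1.0), rosenbrock, method = "BFGS")

res_bfgs$par          # should be close to the true minimum c(1, 1)
res_bfgs$convergence  # 0 indicates successful convergence
```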



optimflex documentation built on April 11, 2026, 5:06 p.m.