Optimization Test Functions

Description

Simple and often used test functions, defined in higher dimensions and especially suited for performance tests. Analytical gradients, where they exist, are provided with the gr prefix. The dimension is determined by the length of the input vector.

Usage

fnRosenbrock(x)
grRosenbrock(x)
fnNesterov(x)
grNesterov(x)
fnRastrigin(x)
grRastrigin(x)
fnHald(x)
grHald(x)
fnShor(x)
grShor(x)

Arguments

x

numeric vector; its length determines the dimension n of the problem.

Details

Rosenbrock – Rosenbrock's famous valley function from 1960. It can also be regarded as a least-squares problem:

∑_{i=1}^{n-1} [(1 - x_i)^2 + 100 (x_{i+1} - x_i^2)^2]

No. of Vars.: n >= 2
Bounds: -5.12 <= xi <= 5.12
Local minima: at f(-1, 1, ..., 1) for n >= 4
Minimum: 0.0
Solution: xi = 1, i = 1:n
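As a cross-check of the formula and its minimum, the valley function and its analytical gradient can be sketched in Python (a standalone reimplementation with illustrative names, not the package's R code):

```python
import numpy as np

def fn_rosenbrock(x):
    # sum_{i=1}^{n-1} (1 - x_i)^2 + 100 (x_{i+1} - x_i^2)^2
    x = np.asarray(x, dtype=float)
    return np.sum((1 - x[:-1])**2 + 100 * (x[1:] - x[:-1]**2)**2)

def gr_rosenbrock(x):
    # analytical gradient: each x_i appears in at most two summands
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    g[:-1] = -2 * (1 - x[:-1]) - 400 * x[:-1] * (x[1:] - x[:-1]**2)
    g[1:] += 200 * (x[1:] - x[:-1]**2)
    return g

print(fn_rosenbrock(np.ones(5)))   # 0.0 at the solution x_i = 1
print(gr_rosenbrock(np.ones(5)))   # zero vector at the minimum
```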

Nesterov – Nesterov's smooth adaptation of Rosenbrock, based on the idea of Chebyshev polynomials. This function is even more difficult to optimize than Rosenbrock's:

(x_1 - 1)^2 / 4 + ∑_{i=1}^{n-1} (1 + x_{i+1} - 2 x_i^2)^2

No. of Vars.: n >= 2
Bounds: -5.12 <= xi <= 5.12
Local minima: ?
Minimum: 0.0
Solution: xi = 1, i = 1:n
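A short Python sketch of Nesterov's Chebyshev–Rosenbrock function, assuming the summands are squared (the smooth variant; names are illustrative, not the package's):

```python
import numpy as np

def fn_nesterov(x):
    # (x_1 - 1)^2 / 4 + sum_{i=1}^{n-1} (1 + x_{i+1} - 2 x_i^2)^2
    x = np.asarray(x, dtype=float)
    return (x[0] - 1)**2 / 4 + np.sum((1 + x[1:] - 2 * x[:-1]**2)**2)

print(fn_nesterov(np.ones(6)))    # 0.0 at x_i = 1
print(fn_nesterov(np.zeros(2)))   # 1.25: (0-1)^2/4 + (1+0-0)^2
```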

Rastrigin – Rastrigin's function is a famous, non-convex test case from 1989 for global optimization. It is a typical multimodal function with many local minima:

10 n + ∑_{i=1}^{n} (x_i^2 - 10 \cos(2 π x_i))

No. of Vars.: n >= 2
Bounds: -5.12 <= xi <= 5.12
Local minima: many
Minimum: 0.0
Solution: xi = 0, i = 1:n
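The formula is easy to verify numerically; at integer points the cosine terms drop out and the value reduces to ∑ x_i^2, which is why local minima cluster near the integer lattice (a standalone Python sketch, not the package's R code):

```python
import numpy as np

def fn_rastrigin(x):
    # 10 n + sum_i (x_i^2 - 10 cos(2 pi x_i))
    x = np.asarray(x, dtype=float)
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

print(fn_rastrigin(np.zeros(10)))   # 0.0 at the global minimum x_i = 0
print(fn_rastrigin(np.ones(10)))    # ≈ 10.0: at integer points f ≈ ∑ x_i^2
```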

Hald – Hald's function is a typical example of a non-smooth test function, from Hald and Madsen in 1981.

\max_{1 ≤ i ≤ 21} \left| \frac{x_1 + x_2 t_i}{1 + x_3 t_i + x_4 t_i^2 + x_5 t_i^3} - \exp(t_i) \right|

where t_i = -1 + (i - 1)/10 for 1 ≤ i ≤ 21.

No. of Vars.: n = 5
Bounds: -1 <= xi <= 1
Local minima: ?
Minimum: 0.0001223713
Solution: (0.99987763, 0.25358844, -0.74660757, 0.24520150, -0.03749029)
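The minimax character can be checked directly: at t = 0 the residual is |x_1 - 1|, which for the solution above already equals the stated minimum. A standalone Python sketch (illustrative names, not the package's R code):

```python
import numpy as np

def fn_hald(x):
    # max_i | (x1 + x2 t_i)/(1 + x3 t_i + x4 t_i^2 + x5 t_i^3) - exp(t_i) |
    # with t_i = -1 + (i - 1)/10, i = 1..21, i.e. 21 points on [-1, 1]
    t = -1 + np.arange(21) / 10.0
    num = x[0] + x[1] * t
    den = 1 + x[2] * t + x[3] * t**2 + x[4] * t**3
    return np.max(np.abs(num / den - np.exp(t)))

sol = np.array([0.99987763, 0.25358844, -0.74660757, 0.24520150, -0.03749029])
print(fn_hald(sol))   # ≈ 0.0001223713, the stated minimum
```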

Shor – Shor's function is another typical example of a non-smooth test function, used as a benchmark for Shor's R-algorithm. It is defined as the pointwise maximum of a family of quadratic functions.

Value

Returns the value of the test function, or of its gradient, at the given point. If an analytical gradient is not available, a function computing the gradient numerically is provided instead.
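The numerical gradient used for the non-smooth functions can be thought of as plain central differences; a Python sketch of the idea (illustrative only, not the package's internal implementation):

```python
import numpy as np

def ns_grad(f, x, h=1e-6):
    # central-difference approximation of the gradient of f at x
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

# check on a function with a known gradient: f(x) = sum x_i^2, grad = 2 x
print(ns_grad(lambda x: np.sum(x**2), np.array([1.0, 2.0, 3.0])))
# ≈ [2. 4. 6.]
```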

References

Search the Internet.

Examples

x <- runif(5)
fnHald(x); grHald(x)

# Compare analytical and numerical gradient
shor_gr <- function(x) adagio:::ns.grad(fnShor, x)    # internal gradient
grShor(x); shor_gr(x)