StoSOO: StoSOO and SOO algorithms

View source: R/StoSOO.R

Description

Global optimization of a blackbox function, given a finite budget of noisy evaluations, via the Stochastic Simultaneous Optimistic Optimization (StoSOO) algorithm. The deterministic SOO method is available for noiseless observations. Knowledge of the function's smoothness is not required.

Usage

StoSOO(
  par,
  fn,
  ...,
  lower = rep(0, length(par)),
  upper = rep(1, length(par)),
  nb_iter,
  control = list(verbose = 0, type = "sto", max = FALSE, light = TRUE)
)

Arguments

par

vector whose length defines the dimensionality of the optimization problem. Providing actual values of par is not necessary (NAs are fine); it is included primarily for compatibility with optim.

fn

scalar function to be minimized, whose first argument is the vector of parameters to be optimized over.

...

optional additional arguments to fn.

lower, upper

vectors of bounds on the variables.

nb_iter

number of function evaluations allocated to optimization.

control

list of control parameters:

  • verbose: verbosity level, either 0 (default), 1 or greater than 1,

  • type: either 'det' for optimizing a deterministic function or 'sto' for a stochastic one,

  • k_max: maximum number of evaluations per leaf (default: from analysis),

  • h_max: maximum depth of the tree (default: from analysis),

  • delta: confidence (default: 1/sqrt(nb_iter) - from analysis),

  • light: set to FALSE to return the search tree,

  • max: if TRUE, performs maximization.
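For instance, a maximization call combining these control options can be sketched as follows (the quadratic toy objective is illustrative and not part of the package):

```r
## Illustrative sketch, assuming the OOR package is attached
library(OOR)

## toy concave objective on [0, 1], peak at x = 0.7
fmax <- function(x) 1 - (x - 0.7)^2

Sol <- StoSOO(par = NA, fn = fmax, nb_iter = 500,
              control = list(type = "det", max = TRUE))
Sol$par   # expected to be close to 0.7
Sol$value # expected to be close to 1
```

With max = TRUE the returned value is the maximum found; type = "det" selects the deterministic SOO variant, appropriate here since fmax is noiseless.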

Details

The optional tree element returned is a list whose first element is the root node and whose last element contains the deepest nodes. At each level, x provides the center(s), one per row, with corresponding bounds given by x_min and x_max. Then:

  • leaf indicates if x is a leaf (1 if TRUE);

  • new indicates if x has been sampled last;

  • sums gives the sum of values at x;

  • bs gives the upper bounds at x;

  • ks is the number of evaluations at x;

  • values stores the values evaluated as they come (mostly useful in the deterministic case).
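As a sketch (assuming the OOR package is attached), the tree can be inspected after a run with light = FALSE; the field names printed correspond to the list above:

```r
## Illustrative sketch, assuming the OOR package is attached
library(OOR)

f <- function(x) sum((x - 0.5)^2)  # toy objective, minimum at 0.5

Sol <- StoSOO(par = NA, fn = f, nb_iter = 200,
              control = list(type = "det", light = FALSE))

length(Sol$tree)           # number of levels built during the search
root <- Sol$tree[[1]]      # first element: the root node
str(root, max.level = 1)   # x, x_min, x_max, leaf, new, sums, bs, ks, values
```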

Value

list with components:

  • par best set of parameters (for a stochastic function, it corresponds to the minimum reached over the deepest unexpanded node),

  • value value of fn at par,

  • tree search tree built during the execution; only returned when control$light == FALSE.

Author(s)

M. Binois (translation in R code), M. Valko, A. Carpentier, R. Munos (Matlab code)

References

R. Munos (2011), Optimistic optimization of deterministic functions without the knowledge of its smoothness, NIPS, 783-791.

M. Valko, A. Carpentier and R. Munos (2013), Stochastic Simultaneous Optimistic Optimization, ICML, 19-27. https://inria.hal.science/hal-00789606. Matlab code: https://team.inria.fr/sequel/software/StoSOO/.

P. Preux, R. Munos, M. Valko (2014), Bandits attack function optimization, IEEE Congress on Evolutionary Computation (CEC), 2245-2252.

Examples

#------------------------------------------------------------
# Example 1 : Deterministic optimization with SOO
#------------------------------------------------------------
## Define objective
fun1 <- function(x) return(-guirland(x))

## Optimization
Sol1 <- StoSOO(par = NA, fn = fun1, nb_iter = 1000, control = list(type = "det", verbose = 1))

## Display objective function and solution found
curve(fun1, n = 1001)
abline(v = Sol1$par, col = 'red')

#------------------------------------------------------------
# Example 2 : Stochastic optimization with StoSOO
#------------------------------------------------------------
set.seed(42)

## Same objective function with uniform noise
fun2 <- function(x){return(fun1(x) + runif(1, min = -0.1, max = 0.1))}

## Optimization
Sol2 <- StoSOO(par = NA, fn = fun2, nb_iter = 1000, control = list(type = "sto", verbose = 1))

## Display solution
abline(v = Sol2$par, col = 'blue')

#------------------------------------------------------------
# Example 3 : Stochastic optimization with StoSOO, 2-dimensional function
#------------------------------------------------------------

set.seed(42)

## 2-dimensional noisy objective function, defined on [0, pi/4]^2
fun3 <- function(x){return(-sin1(x[1]) * sin1(1 - x[2]) + runif(1, min = -0.05, max = 0.05))}

## Optimizing
Sol3 <- StoSOO(par = rep(NA, 2), fn = fun3, upper = rep(pi/4, 2), nb_iter = 1000)

## Display solution
xgrid <- seq(0, pi/4, length.out = 101)
Xgrid <- expand.grid(xgrid, xgrid)
ref <- apply(Xgrid, 1, function(x){(-sin1(x[1]) * sin1(1 - x[2]))})
filled.contour(xgrid, xgrid, matrix(ref, 101), color.palette = terrain.colors,
               plot.axes = {axis(1); axis(2); points(Xgrid[which.min(ref),, drop = FALSE], pch = 21);
                            points(Sol3$par[1], Sol3$par[2], pch = 13)})


#------------------------------------------------------------
# Example 4 : Deterministic optimization with SOO, 2-dimensional function with plots
#------------------------------------------------------------
set.seed(42)

## 2-dimensional noiseless objective function, defined on [0, pi/4]^2
fun4 <- function(x){return(-sin1(x[1]) * sin1(1 - x[2]))}

## Optimizing
Sol4 <- StoSOO(par = rep(NA, 2), fn = fun4, upper = rep(pi/4, 2), nb_iter = 1000,
  control = list(type = 'det', light = FALSE))

## Display solution
xgrid <- seq(0, pi/4, length.out = 101)
Xgrid <- expand.grid(xgrid, xgrid)
ref <- apply(Xgrid, 1, function(x){(-sin1(x[1]) * sin1(1 - x[2]))})
filled.contour(xgrid, xgrid, matrix(ref, 101), color.palette = terrain.colors,
               plot.axes = {axis(1); axis(2); plotStoSOO(Sol4, add = TRUE, upper = rep(pi/4, 2));
                            points(Xgrid[which.min(ref),, drop = FALSE], pch = 21);
                            points(Sol4$par[1], Sol4$par[2], pch = 13, col = 2)})

OOR documentation built on Aug. 23, 2023, 1:08 a.m.
