Description
Random search optimization method with a systematic component that searches for the global optimum. The loss function may be non-linear, non-differentiable and multimodal. Undefined responses are allowed as well.
Usage

optim_sa(fun, start, maximization = FALSE, trace = FALSE,
         lower, upper, control = list())

Arguments
fun: Loss function to be optimized. It must return a scalar value. The variables must be assigned as a vector. See 'Details'.

start: Vector of initial values for the function variables. Must be of the same length as the variables vector of the loss function. The response of the initial variable combination must be defined (NA or NaN responses are not allowed).

maximization: Logical. If TRUE, the loss function is maximized rather than minimized. Default is FALSE.

trace: Logical. If TRUE, interim results are stored. Necessary for the plot function. Default is FALSE.

lower: Vector of lower boundaries for the function variables. Must be of the same length as the variables vector of the function.

upper: Vector of upper boundaries for the function variables. Must be of the same length as the variables vector of the function.

control: List with optional further arguments to modify the optimization specifically to the loss function. The parameters referenced elsewhere on this page include t0 (initial temperature), nlimit (maximum number of inner-loop iterations), r (temperature reduction factor of the outer loop), rf (random factor determining the width of the search grid), dyn_rf (logical; dynamic adjustment of rf, see 'Details'), t_min (temperature at which the outer loop terminates) and k (constant for the acceptance probability).

Details
Simulated Annealing is an optimization algorithm for solving complex functions that may have several optima. The method is composed of a random and a systematic component. Basically, it randomly modifies the variable combination nlimit times and compares the response values. Depending on the temperature and the constant k, there is also a likelihood of accepting variable combinations with a worse response. There is thus a time-decreasing likelihood of leaving local optima, which makes the Simulated Annealing optimization method advantageous for multimodal functions. Undefined response values (NA) are allowed as well, which can be useful for loss functions with variable restrictions. The high number of parameters allows a very flexible parameterization. optim_sa is able to solve mathematical formulas as well as complex rule sets.
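The acceptance rule itself is not spelled out on this page. As a rough illustration only, the following is a minimal sketch of a Metropolis-style acceptance step, assuming the common exp(-delta / (k * t)) form; accept_candidate and its arguments are hypothetical names, not optim_sa internals:

# Illustrative Metropolis-style acceptance step (minimization case);
# the exact rule used internally by optim_sa may differ.
accept_candidate <- function(f_old, f_new, t, k = 1) {
  if (is.na(f_new)) return(FALSE)   # undefined responses are never accepted
  delta <- f_new - f_old            # change in the loss value
  if (delta <= 0) return(TRUE)      # improvements are always accepted
  runif(1) < exp(-delta / (k * t))  # worse moves: temperature-dependent probability
}

As the temperature t decreases over the iterations, exp(-delta / (k * t)) shrinks towards zero, which yields the time-decreasing likelihood of leaving local optima described above.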
The performance therefore highly depends on the settings. It is indispensable to parameterize the algorithm carefully. The control list is pre-parameterized for loss functions of medium complexity. For relatively simple functions (e.g. three-dimensional multimodal functions), the settings should be changed to improve performance; for complex functions, they should be changed to improve accuracy. The most important parameters are nlimit, r and t0.
The dynamic rf adjustment depends on the number of loss function calls that fall outside the variable boundaries as well as on the temperature of the current iteration. The obligatory decrease of rf ensures a relatively wide search grid at the beginning of the optimization process that shrinks over time. It thus automatically adjusts the trade-off between the range of the search grid and accuracy. See Pronzato et al. (1984) for more details. It is sometimes useful to disable the dynamic rf adjustment when a well-performing rf is already known. As dyn_rf usually improves both performance and accuracy, the default is TRUE.
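As a concrete illustration of the NA mechanism mentioned above, a restriction can be encoded directly in the loss function by returning NA for forbidden variable combinations. This is a sketch: the constraint x[1] + x[2] <= 1 is an arbitrary example, and ro_restricted is not part of the package.

# Rosenbrock function restricted to the half-plane x1 + x2 <= 1;
# combinations that violate the restriction return NA and are never accepted.
ro_restricted <- function(x) {
  if (x[1] + x[2] > 1) return(NA)
  100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
}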
Value

The output is a nmsa_optim list object with the following entries:

par: Function variables after optimization.
function_value: Loss function response after optimization.
trace: Matrix with interim results. NULL if trace was not activated.
fun: The loss function.
start: The initial function variables.
lower: The lower boundaries of the function variables.
upper: The upper boundaries of the function variables.
control: Control arguments, see 'Details'.
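The entries can be accessed as ordinary list elements, for instance (a usage sketch based on the ro_sa object created in the Examples section below):

ro_sa$par             # variable combination at the optimum found
ro_sa$function_value  # loss function value at that optimum
head(ro_sa$trace)     # interim results (only stored if trace = TRUE)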
Author(s)

Kai Husmann
References

Corana, A., Marchesi, M., Martini, C. and Ridella, S. (1987). Minimizing Multimodal Functions of Continuous Variables with the 'Simulated Annealing' Algorithm. ACM Transactions on Mathematical Software, 13(3):262-280.

Kirkpatrick, S., Gelatt, C. D. and Vecchi, M. P. (1983). Optimization by Simulated Annealing. Science, 220(4598):671-680.

Pronzato, L., Walter, E., Venot, A. and Lebruchec, J.-F. (1984). A general-purpose global optimizer: Implementation and applications. Mathematics and Computers in Simulation, 26(5):412-422.
See Also

optim_nm, optim, plot.optim_nmsa
Examples

# The optim_sa function is provided by the 'optimization' package.
library(optimization)

##### Rosenbrock function
# minimum at f(1,1) = 0
ro <- function(x){
  100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
}
# Random start values. Example arguments for the relatively simple Rosenbrock function.
ro_sa <- optim_sa(fun = ro,
                  start = c(runif(2, min = -1, max = 1)),
                  lower = c(-5, -5),
                  upper = c(5, 5),
                  trace = TRUE,
                  control = list(t0 = 100,
                                 nlimit = 550,
                                 t_min = 0.1,
                                 dyn_rf = FALSE,
                                 rf = 1,
                                 r = 0.7))
# Visual inspection.
plot(ro_sa)
plot(ro_sa, type = "contour")
##### Holder table function
# 4 minima at
# f(8.055, 9.665)   = -19.2085
# f(-8.055, 9.665)  = -19.2085
# f(8.055, -9.665)  = -19.2085
# f(-8.055, -9.665) = -19.2085
ho <- function(x){
  x1 <- x[1]
  x2 <- x[2]
  fact1 <- sin(x1) * cos(x2)
  fact2 <- exp(abs(1 - sqrt(x1^2 + x2^2) / pi))
  -abs(fact1 * fact2)
}
# Random start values. Example arguments for the relatively complex Holder table function.
optim_sa(fun = ho,
         start = c(1, 1),
         lower = c(-10, -10),
         upper = c(10, 10),
         trace = TRUE,
         control = list(dyn_rf = FALSE,
                        rf = 1.6,
                        t0 = 10,
                        nlimit = 200,
                        r = 0.6,
                        t_min = 0.1))