mlr_tuners_gensa: Hyperparameter Tuning with Generalized Simulated Annealing


Description

Subclass for generalized simulated annealing tuning, calling GenSA::GenSA() from package GenSA.

Dictionary

This Tuner can be instantiated via the dictionary mlr_tuners or with the associated sugar function tnr():

TunerGenSA$new()
mlr_tuners$get("gensa")
tnr("gensa")

Parallelization

In order to support general termination criteria and parallelization, we evaluate points in batches of size batch_size. Larger batches allow for more parallelization, while smaller batches imply a more fine-grained checking of the termination criteria. A batch consists of batch_size times resampling$iters jobs. For example, with a batch size of 10 points and 5-fold cross-validation, up to 50 cores can be utilized.

Parallelization is supported via package future (see mlr3::benchmark()'s section on parallelization for more details).
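
For example, a multisession plan evaluates the jobs of each batch on multiple local cores (a minimal sketch; the worker count is illustrative):

# run the jobs of each batch in up to 4 parallel R sessions
future::plan("multisession", workers = 4)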

Logging

All Tuners use a logger (as implemented in lgr) from package bbotk. Use lgr::get_logger("bbotk") to access and control the logger.
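
For example, to silence all messages below the warning level (a minimal sketch):

# raise the threshold of the bbotk logger so only warnings and errors are printed
lgr::get_logger("bbotk")$set_threshold("warn")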

Parameters

smooth (logical(1))

temperature (numeric(1))

acceptance.param (numeric(1))

verbose (logical(1))

trace.mat (logical(1))

For the meaning of the control parameters, see GenSA::GenSA(). Note that we have removed all control parameters that refer to the termination of the algorithm; the same behavior can instead be obtained with our Terminators.
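
The remaining control parameters can be set when constructing the tuner, e.g. via the sugar function tnr() (a minimal sketch; the values are illustrative, see GenSA::GenSA() for the defaults):

# construct the tuner with custom GenSA control parameters (values are illustrative)
tuner = tnr("gensa", smooth = FALSE, temperature = 5230)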

Progress Bars

$optimize() supports progress bars via the package progressr combined with a Terminator. Simply wrap the function call in progressr::with_progress() to enable them. We recommend using the package progress as the backend; enable it with progressr::handlers("progress").
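
For example, assuming a tuner and a tuning instance have already been constructed (a minimal sketch):

progressr::handlers("progress")                    # use package progress as backend
progressr::with_progress(tuner$optimize(instance)) # progress bar during tuning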

Super classes

mlr3tuning::Tuner -> mlr3tuning::TunerFromOptimizer -> TunerGenSA

Methods

Method new()

Creates a new instance of this R6 class.

Usage
TunerGenSA$new()

Method clone()

The objects of this class are cloneable with this method.

Usage
TunerGenSA$clone(deep = FALSE)
Arguments
deep

Whether to make a deep clone.

Source

Tsallis C, Stariolo DA (1996). “Generalized simulated annealing.” Physica A: Statistical Mechanics and its Applications, 233(1-2), 395–406. doi: 10.1016/s0378-4371(96)00271-3.

Xiang Y, Gubian S, Suomela B, Hoeng J (2013). “Generalized Simulated Annealing for Global Optimization: The GenSA Package.” The R Journal, 5(1), 13. doi: 10.32614/rj-2013-002.

See Also

Package mlr3hyperband for hyperband tuning.

Other Tuner: mlr_tuners_cmaes, mlr_tuners_design_points, mlr_tuners_grid_search, mlr_tuners_irace, mlr_tuners_nloptr, mlr_tuners_random_search

Examples

library(mlr3tuning)  # also attaches mlr3 and paradox

# retrieve task
task = tsk("pima")

# load learner and set search space
learner = lrn("classif.rpart", cp = to_tune(1e-04, 1e-1, logscale = TRUE))

# hyperparameter tuning on the Pima Indians Diabetes data set
instance = tune(
  method = "gensa",
  task = task,
  learner = learner,
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  term_evals = 10
)

# best performing hyperparameter configuration
instance$result

# all evaluated hyperparameter configurations
as.data.table(instance$archive)

# fit final model on complete data set
learner$param_set$values = instance$result_learner_param_vals
learner$train(task)
