mlr_tuners_cmaes: Hyperparameter Tuning with Covariance Matrix Adaptation Evolution Strategy


Description

Subclass that implements CMA-ES by calling adagio::pureCMAES() from package adagio.

Dictionary

This Tuner can be instantiated via the dictionary mlr_tuners or with the associated sugar function tnr():

TunerCmaes$new()
mlr_tuners$get("cmaes")
tnr("cmaes")

Parameters

sigma

numeric(1)

start_values

character(1)
Create random start values, or use the center of the search space? In the latter case, the start values are the center of the parameters before a trafo is applied.

For the meaning of the control parameters, see adagio::pureCMAES(). Note that all control parameters relating to the termination of the algorithm have been removed, since the same behavior can be obtained with our Terminators. Setting the remaining control parameters is sketched below.
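As a sketch, control parameters can be set at construction time via the tnr() sugar function or afterwards through the tuner's param_set. The specific values here are illustrative, and the "center" level for start_values is assumed from the description above:

# a minimal sketch; values are illustrative, not defaults
tuner = tnr("cmaes", sigma = 0.5, start_values = "center")

# equivalently, set values on an existing instance
tuner = tnr("cmaes")
tuner$param_set$values$sigma = 0.5
tuner$param_set$values$start_values = "center"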

Progress Bars

$optimize() supports progress bars via the package progressr, combined with a Terminator. Simply wrap the call in progressr::with_progress() to enable them. We recommend using the package progress as the backend; enable it with progressr::handlers("progress").
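For example, a minimal sketch, assuming a fully constructed tuning instance named instance and the tuner from above:

library(progressr)
handlers("progress")          # use the progress package as backend
with_progress({
  tuner$optimize(instance)    # progress bar advances with the Terminator
})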

Logging

All Tuners use a logger (as implemented in lgr) from package bbotk. Use lgr::get_logger("bbotk") to access and control the logger.
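For example, to reduce the verbosity during tuning (a sketch using lgr's threshold API):

# show only warnings and errors from bbotk
lgr::get_logger("bbotk")$set_threshold("warn")

# restore the default level afterwards
lgr::get_logger("bbotk")$set_threshold("info")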

Super classes

mlr3tuning::Tuner -> mlr3tuning::TunerFromOptimizer -> TunerCmaes

Methods

Public methods


Method new()

Creates a new instance of this R6 class.

Usage
TunerCmaes$new()

Method clone()

The objects of this class are cloneable with this method.

Usage
TunerCmaes$clone(deep = FALSE)
Arguments
deep

Whether to make a deep clone.

Source

Hansen N (2016). “The CMA Evolution Strategy: A Tutorial.” arXiv:1604.00772.

See Also

Package mlr3hyperband for hyperband tuning.

Other Tuner: mlr_tuners_design_points, mlr_tuners_gensa, mlr_tuners_grid_search, mlr_tuners_irace, mlr_tuners_nloptr, mlr_tuners_random_search

Examples

library(mlr3tuning)
library(data.table)

# retrieve task
task = tsk("pima")

# load learner and set search space
# CMA-ES operates on numeric parameters, so the integer hyperparameters
# minsplit and minbucket are tuned as doubles and converted via a trafo
learner = lrn("classif.rpart",
  cp = to_tune(1e-04, 1e-1, logscale = TRUE),
  minsplit = to_tune(p_dbl(2, 128, trafo = as.integer)),
  minbucket = to_tune(p_dbl(1, 64, trafo = as.integer))
)

# hyperparameter tuning on the pima indians diabetes data set
instance = tune(
  method = "cmaes",
  task = task,
  learner = learner,
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  term_evals = 10)

# best performing hyperparameter configuration
instance$result

# all evaluated hyperparameter configurations
as.data.table(instance$archive)

# fit final model on complete data set
learner$param_set$values = instance$result_learner_param_vals
learner$train(task)
