mlr_tuners_cmaes: Hyperparameter Tuning with Covariance Matrix Adaptation Evolution Strategy

mlr_tuners_cmaes    R Documentation

Hyperparameter Tuning with Covariance Matrix Adaptation Evolution Strategy

Description

Subclass for Covariance Matrix Adaptation Evolution Strategy (CMA-ES). Calls adagio::pureCMAES() from package adagio.

Dictionary

This Tuner can be instantiated with the associated sugar function tnr():

tnr("cmaes")

Control Parameters

start_values

character(1)
Create random start values ("random") or start values based on the center of the search space ("center")? In the latter case, the start values are the centers of the parameters before a trafo is applied.

For the meaning of the control parameters, see adagio::pureCMAES(). Note that we have removed all control parameters which refer to the termination of the algorithm; the same behavior can be obtained with our Terminators instead.
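
As an illustration, a minimal sketch of instantiating the tuner with this control parameter (assuming "center" is an accepted level alongside the default "random"):

library(mlr3tuning)

# start the search at the center of the (untransformed) search space
tuner = tnr("cmaes", start_values = "center")
tuner$param_set$values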

Progress Bars

$optimize() supports progress bars via the package progressr combined with a Terminator. Simply wrap the function in progressr::with_progress() to enable them. We recommend using the progress package as backend; enable it with progressr::handlers("progress").
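
A minimal sketch of enabling progress bars for a tuning run (progressr and progress must be installed; the tuning call mirrors the Examples section below):

library(progressr)
handlers("progress")

with_progress(
  instance <- tune(
    tuner = tnr("cmaes"),
    task = tsk("penguins"),
    learner = lrn("classif.rpart", cp = to_tune(1e-04, 1e-1, logscale = TRUE)),
    resampling = rsmp("holdout"),
    measure = msr("classif.ce"),
    term_evals = 10)
)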

Logging

All Tuners use a logger (as implemented in lgr) from package bbotk. Use lgr::get_logger("bbotk") to access and control the logger.
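
For example, to lower or raise the verbosity of the tuning log:

# show only warnings and errors
lgr::get_logger("bbotk")$set_threshold("warn")

# log every evaluation in detail
lgr::get_logger("bbotk")$set_threshold("debug")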

Optimizer

This Tuner is based on bbotk::OptimizerCmaes which can be applied on any black box optimization problem. See also the documentation of bbotk.
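
As an illustration, a sketch of applying the underlying optimizer to a plain black box function via bbotk (the instance class follows the bbotk API current at this release; the quadratic objective is made up for the example):

library(bbotk)
library(paradox)

# minimize a simple quadratic function as a black box
objective = ObjectiveRFun$new(
  fun = function(xs) list(y = xs$x1^2 + xs$x2^2),
  domain = ps(x1 = p_dbl(-5, 5), x2 = p_dbl(-5, 5)),
  codomain = ps(y = p_dbl(tags = "minimize"))
)

instance = OptimInstanceSingleCrit$new(
  objective = objective,
  terminator = trm("evals", n_evals = 20)
)

# opt("cmaes") constructs bbotk::OptimizerCmaes
opt("cmaes")$optimize(instance)
instance$result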

Resources

There are several sections about hyperparameter optimization in the mlr3book.

The gallery features a collection of case studies and demos about optimization.

  • Use the Hyperband optimizer with different budget parameters.

Super classes

mlr3tuning::Tuner -> mlr3tuning::TunerFromOptimizer -> TunerCmaes

Methods

Public methods


Method new()

Creates a new instance of this R6 class.

Usage
TunerCmaes$new()

Method clone()

The objects of this class are cloneable with this method.

Usage
TunerCmaes$clone(deep = FALSE)
Arguments
deep

Whether to make a deep clone.

Source

Hansen N (2016). “The CMA Evolution Strategy: A Tutorial.” arXiv:1604.00772.

See Also

Other Tuner: mlr_tuners_design_points, mlr_tuners_gensa, mlr_tuners_grid_search, mlr_tuners_irace, mlr_tuners_nloptr, mlr_tuners_random_search, mlr_tuners

Examples

# Hyperparameter Optimization

# load learner and set search space
learner = lrn("classif.rpart",
  cp = to_tune(1e-04, 1e-1, logscale = TRUE),
  minsplit = to_tune(p_dbl(2, 128, trafo = as.integer)),
  minbucket = to_tune(p_dbl(1, 64, trafo = as.integer))
)

# run hyperparameter tuning on the Palmer Penguins data set
instance = tune(
  tuner = tnr("cmaes"),
  task = tsk("penguins"),
  learner = learner,
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  term_evals = 10)

# best performing hyperparameter configuration
instance$result

# all evaluated hyperparameter configurations
as.data.table(instance$archive)

# fit final model on complete data set
learner$param_set$values = instance$result_learner_param_vals
learner$train(tsk("penguins"))
