mlr_tuners_cmaes
Subclass for Covariance Matrix Adaptation Evolution Strategy (CMA-ES). Calls adagio::pureCMAES() from package adagio.
This Tuner can be instantiated with the associated sugar function tnr():

tnr("cmaes")
start_values
character(1)
Create random start values or use the center of the search space? In the latter case, the center of the parameters before the trafo is applied is used.
For the meaning of the control parameters, see adagio::pureCMAES(). Note that we have removed all control parameters which refer to the termination of the algorithm; the same behavior can be obtained with our Terminators instead.
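For example, start values and a remaining control parameter can be set at construction. A minimal sketch, assuming sigma (the initial step size of adagio::pureCMAES()) is exposed by the tuner:

# start from the center of the search space with a smaller initial step size
tuner = tnr("cmaes", start_values = "center", sigma = 0.3)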
$optimize() supports progress bars via the package progressr, combined with a Terminator. Simply wrap the function in progressr::with_progress() to enable them. We recommend using the package progress as backend; enable it with progressr::handlers("progress").
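A minimal sketch, assuming instance is a previously constructed tuning instance with a Terminator attached:

# select the progress package as backend for progress bars
progressr::handlers("progress")
# wrap the optimization call to display a progress bar
progressr::with_progress(
  tnr("cmaes")$optimize(instance)
)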
All Tuners use a logger (as implemented in lgr) from package bbotk. Use lgr::get_logger("bbotk") to access and control the logger.
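For example, the logger threshold can be raised to suppress informational messages during tuning; a sketch using the standard lgr API:

# only print warnings and errors from the optimization backend
lgr::get_logger("bbotk")$set_threshold("warn")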
This Tuner is based on bbotk::OptimizerCmaes, which can be applied to any black box optimization problem. See also the documentation of bbotk.
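As a sketch of how the underlying optimizer can be applied to a plain black box problem with bbotk (the quadratic objective and its bounds here are illustrative assumptions):

library(bbotk)
library(paradox)

# minimize a simple quadratic function over a box-constrained domain
objective = ObjectiveRFun$new(
  fun = function(xs) list(y = sum(unlist(xs)^2)),
  domain = ps(x1 = p_dbl(-5, 5), x2 = p_dbl(-5, 5))
)
instance = OptimInstanceSingleCrit$new(
  objective = objective,
  terminator = trm("evals", n_evals = 20)
)

# run CMA-ES on the instance and return the best configuration found
opt("cmaes")$optimize(instance)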
There are several sections about hyperparameter optimization in the mlr3book.
- Learn more about tuners.
The gallery features a collection of case studies and demos about optimization.
- Use the Hyperband optimizer with different budget parameters.
mlr3tuning::Tuner -> mlr3tuning::TunerFromOptimizer -> TunerCmaes
new()
Creates a new instance of this R6 class.
TunerCmaes$new()
clone()
The objects of this class are cloneable with this method.
TunerCmaes$clone(deep = FALSE)
deep
Whether to make a deep clone.
Hansen N (2016). "The CMA Evolution Strategy: A Tutorial." arXiv:1604.00772.
Other Tuner: mlr_tuners_design_points, mlr_tuners_gensa, mlr_tuners_grid_search, mlr_tuners_irace, mlr_tuners_nloptr, mlr_tuners_random_search, mlr_tuners
# Hyperparameter Optimization
library(mlr3tuning)

# load learner and set search space
learner = lrn("classif.rpart",
  cp = to_tune(1e-04, 1e-1, logscale = TRUE),
  minsplit = to_tune(p_dbl(2, 128, trafo = as.integer)),
  minbucket = to_tune(p_dbl(1, 64, trafo = as.integer))
)
# run hyperparameter tuning on the Palmer Penguins data set
instance = tune(
  tuner = tnr("cmaes"),
  task = tsk("penguins"),
  learner = learner,
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  term_evals = 10
)
# best performing hyperparameter configuration
instance$result
# all evaluated hyperparameter configurations
as.data.table(instance$archive)
# fit final model on complete data set
learner$param_set$values = instance$result_learner_param_vals
learner$train(tsk("penguins"))