mlr_tuners_irace
Subclass for iterated racing. Calls irace::irace() from package irace.
This Tuner can be instantiated with the associated sugar function tnr():

tnr("irace")
n_instances
(integer(1))
Number of resampling instances.
For the meaning of all other parameters, see irace::defaultScenario(). Note that we have removed all control parameters that refer to the termination of the algorithm. Use bbotk::TerminatorEvals instead; other terminators do not work with TunerIrace.
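For example, a minimal construction sketch (the n_instances value and the evaluation budget are arbitrary choices for illustration):

# construct the tuner with a custom number of resampling instances
tuner = tnr("irace", n_instances = 10)

# irace must be terminated by an evaluation budget;
# pass this terminator when constructing the tuning instance
terminator = trm("evals", n_evals = 200)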
The ArchiveBatchTuning holds the following additional columns:

"race" (integer(1))
Race iteration.

"step" (integer(1))
Step number of the race.

"instance" (integer(1))
Identifies resampling instances across races and steps.

"configuration" (integer(1))
Identifies configurations across races and steps.
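Assuming a tuned instance named instance (as in the Examples section), these columns can be inspected in the archive; the column selection below is illustrative:

# columns added by irace, alongside the evaluated configurations
archive = as.data.table(instance$archive)
archive[, list(race, step, instance, configuration)]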
The tuning result (instance$result) is the best-performing elite of the final race. The reported performance is the average performance estimated on all used instances.
$optimize() supports progress bars via the package progressr combined with a bbotk::Terminator. Simply wrap the function in progressr::with_progress() to enable them. We recommend using the package progress as backend; enable it with progressr::handlers("progress").
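For example (a sketch; tuner and instance are assumed to be constructed as in the Examples section):

# enable the progress backend and wrap the optimization call
progressr::handlers("progress")
progressr::with_progress(tuner$optimize(instance))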
All Tuners use a logger (as implemented in lgr) from package bbotk. Use lgr::get_logger("bbotk") to access and control the logger.
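For example, to silence informational messages during tuning:

# raise the threshold so only warnings and errors are logged
lgr::get_logger("bbotk")$set_threshold("warn")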
This Tuner is based on bbotk::OptimizerBatchIrace, which can be applied to any black box optimization problem. See also the documentation of bbotk.
There are several sections about hyperparameter optimization in the mlr3book. The gallery features a collection of case studies and demos about optimization, for example on using the Hyperband optimizer with different budget parameters.
mlr3tuning::Tuner -> mlr3tuning::TunerBatch -> mlr3tuning::TunerBatchFromOptimizerBatch -> TunerBatchIrace
Method new()
Creates a new instance of this R6 class.

Usage:
TunerBatchIrace$new()
Method optimize()
Performs the tuning on a TuningInstanceBatchSingleCrit until termination. The single evaluations and the final results will be written into the ArchiveBatchTuning that resides in the TuningInstanceBatchSingleCrit. The final result is returned.

Usage:
TunerBatchIrace$optimize(inst)

Arguments:
inst (TuningInstanceBatchSingleCrit).

Returns:
data.table::data.table.
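A sketch of this low-level interface (task, learner, and budget mirror the Examples section; the tune() sugar function below wraps these steps):

library(mlr3tuning)

# build the tuning instance, including the required evaluation budget
instance = ti(
  task = tsk("pima"),
  learner = lrn("classif.rpart", cp = to_tune(1e-04, 1e-1, logscale = TRUE)),
  resampling = rsmp("holdout"),
  measures = msr("classif.ce"),
  terminator = trm("evals", n_evals = 42)
)

# construct the tuner and run the optimization
tuner = TunerBatchIrace$new()
tuner$optimize(instance)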
Method clone()
The objects of this class are cloneable with this method.

Usage:
TunerBatchIrace$clone(deep = FALSE)

Arguments:
deep
Whether to make a deep clone.
Lopez-Ibanez M, Dubois-Lacoste J, Caceres LP, Birattari M, Stuetzle T (2016). “The irace package: Iterated racing for automatic algorithm configuration.” Operations Research Perspectives, 3, 43–58. doi:10.1016/j.orp.2016.09.002.
Other Tuner: Tuner, mlr_tuners, mlr_tuners_cmaes, mlr_tuners_design_points, mlr_tuners_gensa, mlr_tuners_grid_search, mlr_tuners_internal, mlr_tuners_nloptr, mlr_tuners_random_search
library(mlr3tuning)

# retrieve task
task = tsk("pima")
# load learner and set search space
learner = lrn("classif.rpart", cp = to_tune(1e-04, 1e-1, logscale = TRUE))
# hyperparameter tuning on the Pima Indians diabetes data set
instance = tune(
tuner = tnr("irace"),
task = task,
learner = learner,
resampling = rsmp("holdout"),
measure = msr("classif.ce"),
term_evals = 42
)
# best performing hyperparameter configuration
instance$result
# all evaluated hyperparameter configurations
as.data.table(instance$archive)
# fit final model on complete data set
learner$param_set$values = instance$result_learner_param_vals
learner$train(task)