TunerIrace class that implements iterated racing. Calls irace::irace() from package irace.
This Tuner can be instantiated via the dictionary
mlr_tuners or with the associated sugar function tnr():
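A minimal sketch of the instantiation routes just mentioned (it assumes mlr3tuning is attached and that "irace" is the dictionary key):

```r
library(mlr3tuning)

# three equivalent ways to obtain the tuner
tuner1 = TunerIrace$new()          # direct R6 construction
tuner2 = mlr_tuners$get("irace")   # via the dictionary
tuner3 = tnr("irace")              # via the sugar function
```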
n_instances (integer(1)): Number of resampling instances.
For the meaning of all other parameters, see irace::defaultScenario(). Note
that we have removed all control parameters which refer to the termination of
the algorithm. Use TerminatorEvals instead; other terminators do not work
with TunerIrace.
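A short sketch of supplying the evaluation budget through TerminatorEvals, as required above (the budget of 50 evaluations is illustrative):

```r
library(mlr3tuning)

# irace only supports a fixed evaluation budget,
# expressed as a TerminatorEvals
terminator = trm("evals", n_evals = 50)
```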
The ArchiveTuning holds the following additional columns:
step (integer(1)): Step number of race.
instance (integer(1)): Identifies resampling instances across races and steps.
configuration (integer(1)): Identifies configurations across races and steps.
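The additional columns can be inspected alongside the usual archive fields; a sketch, assuming a finished tuning instance named instance:

```r
# convert the archive to a data.table and look at the
# irace-specific bookkeeping columns
archive = as.data.table(instance$archive)
archive[, c("step", "instance", "configuration")]
```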
The tuning result (instance$result) is the best-performing elite of
the final race. The reported performance is the average performance estimated
on all used instances.
$optimize() supports progress bars via the package progressr
combined with a Terminator. Simply wrap the function in
progressr::with_progress() to enable them. We recommend using package
progress as the backend; enable it with progressr::handlers("progress").
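Following the paragraph above, enabling progress bars might look like this (tuner and instance are assumed to exist already):

```r
library(progressr)

# use package progress as the reporting backend
handlers("progress")

# wrap the optimization call to display a progress bar
with_progress(tuner$optimize(instance))
```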
Creates a new instance of this R6 class.
Performs the tuning on a TuningInstanceSingleCrit until termination. The single evaluations and the final results will be written into the ArchiveTuning that resides in the TuningInstanceSingleCrit. The final result is returned.
The objects of this class are cloneable with this method.
TunerIrace$clone(deep = FALSE)
Whether to make a deep clone.
López-Ibáñez M, Dubois-Lacoste J, Cáceres LP, Birattari M, Stützle T (2016). “The irace package: Iterated racing for automatic algorithm configuration.” Operations Research Perspectives, 3, 43–58. doi:10.1016/j.orp.2016.09.002.
# retrieve task
task = tsk("pima")

# load learner and set search space
learner = lrn("classif.rpart", cp = to_tune(1e-04, 1e-1, logscale = TRUE))

# hyperparameter tuning on the pima indians diabetes data set
instance = tune(
  method = "irace",
  task = task,
  learner = learner,
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  term_evals = 42
)

# best performing hyperparameter configuration
instance$result

# all evaluated hyperparameter configurations
as.data.table(instance$archive)

# fit final model on complete data set
learner$param_set$values = instance$result_learner_param_vals
learner$train(task)