Subclass for grid search tuning.
The grid is constructed as a Cartesian product over discretized values per parameter, see paradox::generate_design_grid(). If the learner supports hotstarting, the grid is sorted by the hotstart parameter (see also mlr3::HotstartStack). If not, the points of the grid are evaluated in a random order.
This Tuner can be instantiated via the dictionary mlr_tuners or with the associated sugar function tnr():

tnr("grid_search")
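For illustration, a minimal sketch (assuming the mlr3tuning and paradox packages; `tnr()`, `ps()`, and `generate_design_grid()` are their documented functions, while the search space below is a made-up example) showing how the tuner is constructed and what the underlying Cartesian grid looks like:

```r
library(mlr3tuning)
library(paradox)

# sugar function, equivalent to mlr_tuners$get("grid_search")
tuner = tnr("grid_search", resolution = 3)

# the grid is the Cartesian product built by paradox::generate_design_grid()
# over 'resolution' discretized values per parameter
search_space = ps(
  cp       = p_dbl(1e-4, 1e-1, logscale = TRUE),
  minsplit = p_int(2, 64)
)
grid = generate_design_grid(search_space, resolution = 3)
print(grid)  # 3 values per parameter -> 3 x 3 = 9 candidate points
```

Raising the resolution grows the grid multiplicatively, which is why a Terminator is still useful even for exhaustive grid search.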
resolution (integer(1))
Resolution of the grid, see paradox::generate_design_grid().

param_resolutions (named integer())
Resolution per parameter, named by parameter ID, see paradox::generate_design_grid().

batch_size (integer(1))
Maximum number of points to try in a batch.
$optimize() supports progress bars via the package progressr combined with a Terminator. Simply wrap the function in progressr::with_progress() to enable them. We recommend using the package progress as backend; enable it with progressr::handlers("progress").
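A short sketch of enabling progress bars, assuming the `ti()` and `tnr()` sugar functions from mlr3tuning (the task, learner, and tuning range are illustrative, not prescribed by this class):

```r
library(mlr3tuning)
library(progressr)

handlers("progress")  # use package 'progress' as the reporting backend

# illustrative tuning instance: rpart's cp on the pima task
instance = ti(
  task = tsk("pima"),
  learner = lrn("classif.rpart", cp = to_tune(1e-4, 1e-1, logscale = TRUE)),
  resampling = rsmp("holdout"),
  measures = msr("classif.ce"),
  terminator = trm("evals", n_evals = 3)
)

# wrapping $optimize() in with_progress() enables the bar
with_progress(
  tnr("grid_search", resolution = 3)$optimize(instance)
)
```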
In order to support general termination criteria and parallelization, we evaluate points in a batch-fashion of size batch_size. Larger batches allow more parallelization, while smaller batches imply a more fine-grained checking of termination criteria. A batch consists of batch_size times resampling$iters jobs. E.g., if you set a batch size of 10 points and do a 5-fold cross-validation, you can utilize up to 50 cores.
Parallelization is supported via package future (see
section on parallelization for more details).
Creates a new instance of this R6 class.
The objects of this class are cloneable with this method.
TunerGridSearch$clone(deep = FALSE)
Whether to make a deep clone.
Package mlr3hyperband for hyperband tuning.
library(mlr3tuning)

# retrieve task
task = tsk("pima")

# load learner and set search space
learner = lrn("classif.rpart", cp = to_tune(1e-04, 1e-1, logscale = TRUE))

# hyperparameter tuning on the pima indians diabetes data set
instance = tune(
  method = "grid_search",
  task = task,
  learner = learner,
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  term_evals = 10
)

# best performing hyperparameter configuration
instance$result

# all evaluated hyperparameter configurations
as.data.table(instance$archive)

# fit final model on complete data set
learner$param_set$values = instance$result_learner_param_vals
learner$train(task)