- `store_benchmark_result = TRUE` if `store_models = TRUE` when creating a tuning instance.
- `tune_nested()` did not work.
- Add `$phash()` method to `AutoTuner`.
- Include `Tuner` in hash of `AutoTuner`.
- The `method` parameter of `tune()`, `tune_nested()` and `auto_tuner()` is renamed to `tuner`. Only `Tuner` objects are accepted now. Arguments to the tuner cannot be passed with `...` anymore.
- The `tuner` parameter of `AutoTuner` is moved to the first position to achieve consistency with the other functions.
- Add `allow_hotstarting`, `keep_hotstart_stack` and `keep_models` flags to `AutoTuner` and `auto_tuner()`.
- `AutoTuner` accepts instantiated resamplings now. The `AutoTuner` checks if all row ids of the inner resampling are present in the outer resampling train set when nested resampling is performed.
- `Tuner` did not create a `ContextOptimization`.
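A minimal sketch of the renamed `tuner` argument (the task, learner, and grid resolution below are illustrative choices, not taken from the release notes):

```r
library(mlr3tuning)  # assumes mlr3 and paradox are attached as dependencies

# Before: tune(method = "grid_search", ..., resolution = 5)
# After:  configure the Tuner object itself; extra arguments
#         can no longer be passed through `...`
instance = tune(
  tuner = tnr("grid_search", resolution = 5),
  task = tsk("penguins"),
  learner = lrn("classif.rpart", cp = to_tune(1e-4, 1e-1)),
  resampling = rsmp("cv", folds = 3),
  measures = msr("classif.ce")
)
instance$result
```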
- `ti()` function did not accept callbacks.
- `$importance()`, `$selected_features()`, `$oob_error()` and `$loglik()` are forwarded from the final model to the `AutoTuner` now.
- `AutoTuner` stores the instance and benchmark result if `store_models = TRUE`.
- `AutoTuner` stores the instance if `store_benchmark_result = TRUE`.
- Add callbacks to `mlr_callbacks`.
- Add `callback_tuning()` function.
- `AutoTuner` did not accept `TuningSpace` objects as search spaces.
- Add `ti()` function to create a `TuningInstanceSingleCrit` or `TuningInstanceMultiCrit`.
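A short sketch of the `ti()` helper (task, learner, and terminator budget are hypothetical examples):

```r
library(mlr3tuning)

# ti() builds a TuningInstanceSingleCrit here because a single
# measure is given; passing several measures would yield a
# TuningInstanceMultiCrit instead.
instance = ti(
  task = tsk("penguins"),
  learner = lrn("classif.rpart", cp = to_tune(1e-4, 1e-1)),
  resampling = rsmp("holdout"),
  measures = msr("classif.ce"),
  terminator = trm("evals", n_evals = 10)
)
```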
- Add option to `extract_inner_tuning_results()` to return the tuning instances.
- Add `evaluate_default` option to evaluate learners with hyperparameters set to their default values.
- `smooth` is `FALSE` for `TunerGenSA`.
- `Tuner` objects have the field `$id` now.
- Allow to pass `Tuner` objects as `method` in `tune()` and `auto_tuner()`.
- Link `Tuner` to the help page of `bbotk::Optimizer`.
- `Tuner` objects have the optional field `$label` now.
- `as.data.table()` functions for objects of class `Dictionary` have been extended with additional columns.
- Add `as.data.table.DictionaryTuner` function.
- Add `$help()` method which opens the manual page of a `Tuner`.
- Add `as_search_space()` function to create search spaces from `Learner` and `ParamSet` objects.
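A sketch of `as_search_space()` on a learner (the learner and ranges are illustrative assumptions):

```r
library(mlr3tuning)

# Tag hyperparameters with to_tune(), then derive the search space
learner = lrn("classif.rpart",
  cp = to_tune(1e-4, 1e-1),
  minsplit = to_tune(2, 64)
)
search_space = as_search_space(learner)
```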
- Allow to pass `TuningSpace` objects as `search_space` in `TuningInstanceSingleCrit` and `TuningInstanceMultiCrit`.
- The `mlr3::HotstartStack` can now be removed after tuning with the `keep_hotstart_stack` flag.
- The `Archive` stores errors and warnings of the learners.
- `auto_tuner()` and `tune_nested()`.
- Fix `$assign_result()` method in `TuningInstanceSingleCrit` when search space is empty.
- `TuningInstanceSingleCrit`.
- Fix `TuningInstanceMultiCrit$assign_result()`.
- Add `store_models` flag to `auto_tuner()`.
- Add `"noisy"` property to `ObjectiveTuning`.
- Add `AutoTuner$base_learner()` method to extract the base learner from nested learner objects.
- `tune()` supports multi-criteria tuning.
- Add `TunerIrace` from the `irace` package.
- Add `extract_inner_tuning_archives()` helper function to extract inner tuning archives.
- Remove `ArchiveTuning$extended_archive()` method. The `mlr3::ResampleResult`s are joined automatically by `as.data.table.TuningArchive()` and `extract_inner_tuning_archives()`.
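A sketch of `$base_learner()` on a trained `AutoTuner` (the learner, budget, and task are hypothetical examples):

```r
library(mlr3tuning)

at = auto_tuner(
  tuner = tnr("random_search"),
  learner = lrn("classif.rpart", cp = to_tune(1e-4, 1e-1)),
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  term_evals = 5
)
at$train(tsk("penguins"))

# Returns the wrapped classif.rpart learner, not the AutoTuner shell
at$base_learner()
```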
- Add `tune()`, `auto_tuner()` and `tune_nested()` sugar functions.
- `TuningInstanceSingleCrit`, `TuningInstanceMultiCrit` and `AutoTuner` can be initialized with `store_benchmark_result = FALSE` and `store_models = TRUE` to allow measures to access the models.
- Fix bug where `TuningInstance*$assign_result()` errored with required parameters.
- Add `$learner()`, `$learners()`, `$learner_param_vals()`, `$predictions()` and `$resample_result()` to retrieve objects from the benchmark result in the archive.
- Add `extract_inner_tuning_results()` helper function to extract inner tuning results.
- `ArchiveTuning$data` is a public field now.
- Add `TunerCmaes` from the `adagio` package.
- `predict_type` in `AutoTuner`.
- Allow to set `TuneToken` in `Learner$param_set` and create a search space from it.
- `TuningInstanceSingleCrit` and `TuningInstanceMultiCrit` changed.
- `store_benchmark_result`, `store_models` and `check_values` in `AutoTuner`. `store_tuning_instance` must be set as a parameter during initialization.
- `check_values` flag in `TuningInstanceSingleCrit` and `TuningInstanceMultiCrit`.
- Remove dependency on `bibtex`.
- `saveRDS()`, `serialize()` etc.
- `Archive` is `ArchiveTuning` now, which stores the benchmark result in `$benchmark_result`. This change removed the resample results from the archive, but they can still be accessed via the benchmark result.
- `as.data.table(rr)$learner[[1]]$tuning_result` must be used now.
- `TuningInstance` is now `TuningInstanceSingleCrit`. `TuningInstanceMultiCrit` is still available for multi-criteria tuning.
- Use `trm()` and `trms()` instead of `term()` and `terms()`.
- `store_resample_result` flag in `TuningInstanceSingleCrit` and `TuningInstanceMultiCrit`.
- `TunerNLoptr` adds non-linear optimization from the `nloptr` package.
- Logging is handled by the `bbotk` logger now.
- `check_values` flag in `TuningInstanceSingleCrit` and `TuningInstanceMultiCrit`.
- Use the `bbotk` package for basic tuning objects. `Terminator` classes now live in `bbotk`. As a consequence, `ObjectiveTuning` inherits from `bbotk::Objective`, `TuningInstance` from `bbotk::OptimInstance`, and `Tuner` from `bbotk::Optimizer`.
- `TuningInstance$param_set` becomes `TuningInstance$search_space` to avoid confusion, as the `param_set` usually contains the parameters that change the behavior of an object.
- Tuning is triggered with `$optimize()` instead of `$tune()`.
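A sketch of the `$optimize()` call, written against the current API (the task, learner, and budget are hypothetical examples):

```r
library(mlr3tuning)

instance = TuningInstanceSingleCrit$new(
  task = tsk("penguins"),
  learner = lrn("classif.rpart", cp = to_tune(1e-4, 1e-1)),
  resampling = rsmp("holdout"),
  measure = msr("classif.ce"),
  terminator = trm("evals", n_evals = 10)
)

tuner = tnr("random_search")
tuner$optimize(instance)  # formerly tuner$tune(instance)
instance$result           # best configuration found
```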
- Fix `AutoTuner` where a `$clone()` was missing. Tuning results are unaffected; only stored models contained wrong hyperparameter values (#223).