mlr_optimizers_async_successive_halving
OptimizerAsyncSuccessiveHalving
OptimizerAsyncSuccessiveHalving class that implements the Asynchronous Successive Halving Algorithm (ASHA).
This class implements the asynchronous version of OptimizerBatchSuccessiveHalving.
This bbotk::Optimizer can be instantiated via the dictionary bbotk::mlr_optimizers or with the associated sugar function bbotk::opt():

mlr_optimizers$get("async_successive_halving")
opt("async_successive_halving")
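For instance, non-default control parameters can be passed directly to the sugar function (a minimal sketch; the value eta = 3 is illustrative):

library(mlr3hyperband)  # registers the optimizer in bbotk's dictionary
optimizer = opt("async_successive_halving", eta = 3)
optimizer$param_set$values$eta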
eta
numeric(1)
With every stage, the budget is increased by a factor of eta and only the best 1 / eta of the configurations are promoted to the next stage.
Non-integer values are supported, but eta is not allowed to be less than or equal to 1.
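To illustrate, the following sketch tabulates the budget and the surviving fraction of configurations per stage (eta = 2 and the minimum budget r_min = 1 are assumed, illustrative values):

eta = 2
r_min = 1
stages = 0:4
data.frame(
  stage = stages,
  budget = r_min * eta^stages,  # budget grows by a factor of eta per stage
  surviving = (1 / eta)^stages  # fraction of configurations still in the race
)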
sampler
paradox::Sampler
Object defining how the samples of the parameter space should be drawn.
The default is uniform sampling.
The bbotk::Archive holds the following additional columns that are specific to ASHA:
stage
(integer(1))
Stage index. Starts counting at 0.
asha_id
(character(1))
Unique identifier for each configuration across stages.
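These columns can be inspected after optimization, e.g. (a sketch assuming instance holds a terminated optimization instance):

library(data.table)
archive = as.data.table(instance$archive)
archive[, .(asha_id, stage)]                        # configuration id and its stage
archive[, .(max_stage = max(stage)), by = asha_id]  # highest stage each configuration reached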
Hyperband supports a custom paradox::Sampler object for the initial configurations in each bracket. A custom sampler may look like this (the full example is given in the examples section):
# - beta distribution with alpha = 2 and beta = 5
# - categorical distribution with custom probabilities
sampler = SamplerJointIndep$new(list(
  Sampler1DRfun$new(params[[2]], function(n) rbeta(n, 2, 5)),
  Sampler1DCateg$new(params[[3]], prob = c(0.2, 0.3, 0.5))
))
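For the snippet above to run, params must be a list of one-dimensional subspaces of the search space. A possible setup (the three-parameter search space is a hypothetical example):

library(paradox)
search_space = ps(
  budget = p_int(lower = 1, upper = 16, tags = "budget"),
  alpha  = p_dbl(lower = 0, upper = 1),
  kernel = p_fct(levels = c("linear", "polynomial", "radial"))
)
# one-dimensional subspaces: params[[2]] is alpha, params[[3]] is kernel
params = search_space$subspaces()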
bbotk::Optimizer -> bbotk::OptimizerAsync -> OptimizerAsyncSuccessiveHalving
new()
Creates a new instance of this R6 class.
OptimizerAsyncSuccessiveHalving$new()
optimize()
Performs the optimization on an OptimInstanceAsyncSingleCrit or OptimInstanceAsyncMultiCrit until termination. The single evaluations will be written into the ArchiveAsync. The result will be written into the instance object.
OptimizerAsyncSuccessiveHalving$optimize(inst)
inst
(OptimInstanceAsyncSingleCrit | OptimInstanceAsyncMultiCrit).
Returns: data.table::data.table()
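A minimal end-to-end sketch (the toy objective, its domain, and the worker setup are illustrative assumptions; rush additionally requires a running Redis server):

library(bbotk)
library(mlr3hyperband)
library(rush)

rush_plan(n_workers = 2)  # assumption: local workers backed by a running Redis instance

# toy objective whose quality improves with the budget parameter b
objective = ObjectiveRFun$new(
  fun = function(xs) list(y = -xs$x^2 + log(xs$b)),
  domain = ps(
    x = p_dbl(lower = -5, upper = 5),
    b = p_int(lower = 1, upper = 16, tags = "budget")
  ),
  codomain = ps(y = p_dbl(tags = "maximize"))
)

instance = OptimInstanceAsyncSingleCrit$new(
  objective = objective,
  terminator = trm("evals", n_evals = 50)
)

optimizer = opt("async_successive_halving", eta = 2)
optimizer$optimize(instance)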
clone()
The objects of this class are cloneable with this method.
OptimizerAsyncSuccessiveHalving$clone(deep = FALSE)
deep
Whether to make a deep clone.
Li L, Jamieson K, Rostamizadeh A, Gonina E, Ben-tzur J, Hardt M, Recht B, Talwalkar A (2020). “A System for Massively Parallel Hyperparameter Tuning.” In Dhillon I, Papailiopoulos D, Sze V (eds.), Proceedings of Machine Learning and Systems, volume 2, 230–246. https://proceedings.mlsys.org/paper_files/paper/2020/hash/a06f20b349c6cf09a6b171c71b88bbfc-Abstract.html.