Hyperband: Hyperband

View source: R/HyperResNet_HyperXception.R

Hyperband (R Documentation)

Hyperband

Description

A variation of the Hyperband algorithm for hyperparameter tuning.

Usage

Hyperband(
  hypermodel,
  optimizer = NULL,
  loss = NULL,
  metrics = NULL,
  hyperparameters = NULL,
  objective,
  max_epochs,
  factor = 3,
  hyperband_iterations = 1,
  seed = NULL,
  tune_new_entries = TRUE,
  allow_new_entries = TRUE,
  distribution_strategy = NULL,
  directory = NULL,
  project_name = NULL,
  ...
)
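As a minimal sketch, a Hyperband tuner might be constructed as follows (assumes the keras and kerastuneR packages are installed; the layer sizes, learning-rate choices, and project names below are illustrative, not prescribed by this page):

```r
library(keras)
library(kerastuneR)

# Model-building function: samples hyperparameters from the `hp` object
build_model <- function(hp) {
  model <- keras_model_sequential() %>%
    layer_dense(
      units = hp$Int('units', min_value = 32L, max_value = 512L, step = 32L),
      activation = 'relu', input_shape = 784L) %>%
    layer_dense(units = 10L, activation = 'softmax')
  model %>% compile(
    optimizer = optimizer_adam(
      hp$Choice('learning_rate', values = c(1e-2, 1e-3, 1e-4))),
    loss = 'sparse_categorical_crossentropy',
    metrics = 'accuracy')
  model
}

tuner <- Hyperband(
  hypermodel = build_model,
  objective = 'val_accuracy',
  max_epochs = 10,
  factor = 3,
  directory = 'my_dir',
  project_name = 'helloworld')
```

Logs and checkpoints for this run would land in my_dir/helloworld, per the directory/project_name arguments described below.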

Arguments

hypermodel

A model-building function. It takes an argument 'hp' from which you can sample hyperparameters.

optimizer

An optimizer is one of the arguments required for compiling a Keras model.

loss

A loss function (also called an objective function or optimization score function) is one of the parameters required to compile a model.

metrics

A metric is a function used to judge the performance of your model.

hyperparameters

HyperParameters class instance. Can be used to override (or register in advance) hyperparameters in the search space.

objective

String. The name of the model metric to optimize, e.g. "val_precision". Whether to minimize or maximize is automatically inferred for built-in metrics.

max_epochs

Int. The maximum number of epochs to train one model. It is recommended to set this to a value slightly higher than the expected epochs to convergence for your largest model, and to use early stopping during training (for example, via 'tf.keras.callbacks.EarlyStopping').

factor

Int. Reduction factor for the number of epochs and number of models for each bracket.

hyperband_iterations

Int >= 1. The number of times to iterate over the full Hyperband algorithm. One iteration will run approximately 'max_epochs * (math.log(max_epochs, factor) ** 2)' cumulative epochs across all trials. It is recommended to set this to as high a value as is within your resource budget.
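The quoted budget formula can be checked directly in R; 'log(x, base = factor)' plays the role of Python's 'math.log(x, factor)'. The helper name and the example numbers below are illustrative only:

```r
# Approximate cumulative training epochs across all trials, per the formula
# max_epochs * log_factor(max_epochs)^2, scaled by the number of iterations.
approx_budget <- function(max_epochs, factor = 3, hyperband_iterations = 1) {
  hyperband_iterations * max_epochs * log(max_epochs, base = factor)^2
}

# With max_epochs = 27 and factor = 3: log_3(27) = 3, so 27 * 3^2 = 243 epochs
approx_budget(max_epochs = 27, factor = 3)
```

Doubling hyperband_iterations doubles this budget, which is why the description recommends setting it as high as your resources allow.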

seed

Int. Random seed.

tune_new_entries

Whether hyperparameter entries that are requested by the hypermodel but were not specified in 'hyperparameters' should be added to the search space. If not, the default values for these parameters are used.

allow_new_entries

Whether the hypermodel is allowed to request hyperparameter entries not listed in 'hyperparameters'.

distribution_strategy

Scale up from running single-threaded locally to running on dozens or hundreds of workers in parallel. Distributed Keras Tuner uses a chief-worker model: the chief runs a service to which the workers report results and query for the hyperparameters to try next. The chief should be run on a single-threaded CPU instance (or alternatively as a separate process on one of the workers).

Keras Tuner also supports data parallelism via tf.distribute, and data parallelism and distributed tuning can be combined. For example, if you have 10 workers with 4 GPUs each, you can run 10 parallel trials, with each trial training on 4 GPUs, by using tf.distribute.MirroredStrategy. You can also run each trial on TPUs via tf.distribute.experimental.TPUStrategy. Currently tf.distribute.MultiWorkerMirroredStrategy is not supported, but support for this is on the roadmap.
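As a sketch, a data-parallel strategy can be passed at construction time (assumes TensorFlow is available through the tensorflow R package; 'build_model' stands in for a model-building function defined elsewhere):

```r
library(tensorflow)
library(kerastuneR)

# Train each trial across all local GPUs via data parallelism
tuner <- Hyperband(
  hypermodel = build_model,   # hypothetical model-building function
  objective = 'val_accuracy',
  max_epochs = 10,
  distribution_strategy = tf$distribute$MirroredStrategy(),
  directory = 'results',
  project_name = 'distributed_tuning')
```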

directory

The directory where training logs and checkpoints are stored.

project_name

The name of the project subfolder: detailed logs, checkpoints, etc. are written to directory/project_name, e.g. my_dir/helloworld.

...

Additional keyword arguments relevant to all 'Tuner' subclasses; see the documentation for 'Tuner'.

Details

Hyperband is a bandit-based approach to hyperparameter optimization: it trains many configurations for a small number of epochs and repeatedly carries the best-performing ones forward into longer training runs (successive halving). See the Reference section for the original paper.

Value

A 'Hyperband' hyperparameter tuner object.
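As a hedged sketch of using the returned tuner, a search might be run and its results retrieved as follows (assumes training data 'x_train'/'y_train' exist; 'fit_tuner' and 'get_best_models' are kerastuneR helpers):

```r
library(keras)
library(kerastuneR)

# Run the search: arguments after the tuner are forwarded to keras fit()
tuner %>% fit_tuner(
  x = x_train, y = y_train,
  epochs = 10,
  validation_split = 0.2,
  callbacks = list(
    callback_early_stopping(monitor = 'val_loss', patience = 3)))

# Retrieve the best model found during the search
best_model <- get_best_models(tuner, num_models = 1)[[1]]
```

Early stopping pairs naturally with Hyperband, since max_epochs is meant to be set slightly above the expected time to convergence.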

Reference

Li, Lisha, Kevin Jamieson, Giulia DeSalvo, Afshin Rostamizadeh, and Ameet Talwalkar. ["Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization." Journal of Machine Learning Research 18 (2018): 1-52](http://jmlr.org/papers/v18/16-558.html).


kerastuneR documentation built on March 25, 2022, 9:07 a.m.