luz_callback_lr_scheduler: Learning rate scheduler callback

luz_callback_lr_scheduler        R Documentation

Learning rate scheduler callback

Description

Initializes and runs torch::lr_scheduler()s.

Usage

luz_callback_lr_scheduler(
  lr_scheduler,
  ...,
  call_on = "on_epoch_end",
  opt_name = NULL
)

Arguments

lr_scheduler

A torch::lr_scheduler() that will be initialized with the optimizer and the ... parameters.

...

Additional arguments passed to lr_scheduler together with the optimizers.

call_on

The callback breakpoint at which scheduler$step() is called. Default is 'on_epoch_end'. See luz_callback() for more information. For schedulers that are meant to step once per batch, use 'on_batch_end' (a sketch follows this argument list).

opt_name

Name of the optimizer affected by this callback. It should match a name given in set_optimizers(). If your module has a single optimizer, opt_name is not used.
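
For per-batch scheduling, a minimal sketch using torch::lr_one_cycle() with call_on = "on_batch_end"; the values for epochs and steps_per_epoch are assumptions for illustration (in practice steps_per_epoch would usually be length(train_dl) for your training dataloader):

if (torch::torch_is_installed()) {
# The scheduler steps after every training batch rather than every epoch.
cb_one_cycle <- luz_callback_lr_scheduler(
  torch::lr_one_cycle,
  max_lr = 0.01,          # peak learning rate of the one-cycle policy
  epochs = 10,            # assumed number of training epochs
  steps_per_epoch = 100,  # assumed number of batches per epoch
  call_on = "on_batch_end"
)
}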

Value

A luz_callback() generator.

See Also

Other luz_callbacks: luz_callback_auto_resume(), luz_callback_csv_logger(), luz_callback_early_stopping(), luz_callback_interrupt(), luz_callback_keep_best_model(), luz_callback_metrics(), luz_callback_mixed_precision(), luz_callback_mixup(), luz_callback_model_checkpoint(), luz_callback_profile(), luz_callback_progress(), luz_callback_resume_from_checkpoint(), luz_callback_train_valid(), luz_callback()

Examples

if (torch::torch_is_installed()) {
# Multiply the learning rate by gamma (0.1 by default in torch::lr_step)
# every 30 epochs, given the default call_on = "on_epoch_end".
cb <- luz_callback_lr_scheduler(torch::lr_step, step_size = 30)
}
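
A fuller, self-contained sketch of passing the callback to fit(); the single-layer model, synthetic data, and names such as dl and fitted are illustrative only, not part of this function's API:

if (torch::torch_is_installed()) {
library(torch)
library(luz)

# Synthetic regression data wrapped in a dataloader.
x <- torch_randn(100, 10)
y <- torch_randn(100, 1)
dl <- dataloader(tensor_dataset(x, y), batch_size = 25)

# A single linear layer prepared for training with luz.
model <- setup(nn_linear, loss = nn_mse_loss(), optimizer = optim_sgd)
model <- set_hparams(model, in_features = 10, out_features = 1)
model <- set_opt_hparams(model, lr = 0.1)

# Halve the learning rate after every epoch via the scheduler callback.
fitted <- fit(
  model, dl, epochs = 3,
  callbacks = list(
    luz_callback_lr_scheduler(torch::lr_step, step_size = 1, gamma = 0.5)
  )
)
}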
