mlr_callback_set.lr_scheduler: Learning Rate Scheduling Callback


Learning Rate Scheduling Callback

Description

Changes the learning rate based on the schedule specified by a torch::lr_scheduler.

As of this writing, the following schedulers are available (a construction sketch follows the list):

  • torch::lr_cosine_annealing()

  • torch::lr_lambda()

  • torch::lr_multiplicative()

  • torch::lr_one_cycle()

  • torch::lr_reduce_on_plateau()

  • torch::lr_step()

  • Custom schedulers defined with torch::lr_scheduler().
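
The callback is constructed from a scheduler generator plus the arguments that generator expects. Below is a minimal sketch using the documented constructor with torch::lr_step(); the step_size and gamma values are purely illustrative.

library(mlr3torch)
library(torch)

# Decay the learning rate by a factor of 0.5 every 10 epochs.
cb <- CallbackSetLRScheduler$new(
  .scheduler    = torch::lr_step, # the generator itself, not a scheduler instance
  step_on_epoch = TRUE,           # step the scheduler once per epoch
  step_size     = 10,             # passed on to lr_step() via ...
  gamma         = 0.5
)

After construction, cb$scheduler_fn holds the generator; the scheduler field is only filled in once on_begin() builds the scheduler from the optimizer at the start of training.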

Super class

mlr3torch::CallbackSet -> CallbackSetLRScheduler

Public fields

scheduler_fn

(lr_scheduler_generator)
The torch function that creates a learning rate scheduler.

scheduler

(LRScheduler)
The learning rate scheduler wrapped by this callback.

Methods

Public methods

  • CallbackSetLRScheduler$new()

  • CallbackSetLRScheduler$on_begin()

  • CallbackSetLRScheduler$clone()


Method new()

Creates a new instance of this R6 class.

Usage
CallbackSetLRScheduler$new(.scheduler, step_on_epoch, ...)
Arguments
.scheduler

(lr_scheduler_generator)
The torch scheduler generator (e.g. torch::lr_step).

step_on_epoch

(logical(1))
Whether the scheduler steps after every epoch (otherwise after every batch); see the sketch following this argument list.

...

(any)
The scheduler-specific arguments, passed on to .scheduler.
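
For schedulers that adjust the learning rate after every batch rather than every epoch, step_on_epoch would be set to FALSE. A hedged sketch using torch::lr_one_cycle(); the max_lr and total_steps values are illustrative.

library(mlr3torch)
library(torch)

# One-cycle scheduling updates the learning rate on every optimizer step (batch).
cb_one_cycle <- CallbackSetLRScheduler$new(
  .scheduler    = torch::lr_one_cycle,
  step_on_epoch = FALSE,  # step the scheduler after every batch
  max_lr        = 0.1,    # passed on to lr_one_cycle() via ...
  total_steps   = 1000
)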


Method on_begin()

Creates the scheduler using the optimizer from the context.

Usage
CallbackSetLRScheduler$on_begin()

Method clone()

The objects of this class are cloneable with this method.

Usage
CallbackSetLRScheduler$clone(deep = FALSE)
Arguments
deep

Whether to make a deep clone.

