View source: R/learning_rate_schedules.R
learning_rate_schedule_piecewise_constant_decay | R Documentation
A LearningRateSchedule that uses a piecewise constant decay schedule
learning_rate_schedule_piecewise_constant_decay(
boundaries,
values,
...,
name = NULL
)
boundaries: A list of numbers with strictly increasing entries, all having the same type as the optimizer step.

values: A list of numbers that specifies the values for the intervals defined by boundaries. It should have one more element than boundaries, and all elements should have the same type.

...: For backwards and forwards compatibility.

name: A string. Optional name of the operation. Defaults to 'PiecewiseConstant'.
The function returns a 1-arg callable to compute the piecewise constant when passed the current optimizer step. This can be useful for changing the learning rate value across different invocations of optimizer functions.
Example: use a learning rate that's 1.0 for the first 100001 steps, 0.5 for the next 10000 steps, and 0.1 for any additional steps.
library(tensorflow)  # provides tf$Variable
library(keras)       # provides learning_rate_schedule_piecewise_constant_decay

step <- tf$Variable(0, trainable = FALSE)
boundaries <- as.integer(c(100000, 110000))
values <- c(1.0, 0.5, 0.1)
learning_rate_fn <- learning_rate_schedule_piecewise_constant_decay(
  boundaries, values)

# Later, whenever we perform an optimization step, we pass in the step.
learning_rate <- learning_rate_fn(step)
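To see the piecewise behaviour directly, the returned schedule can also be called with plain integer steps. The sketch below assumes eager execution, so the returned tensors print their values immediately:

learning_rate_fn(as.integer(50000))   # at or before the first boundary -> 1.0
learning_rate_fn(as.integer(105000))  # between the two boundaries      -> 0.5
learning_rate_fn(as.integer(200000))  # past the last boundary          -> 0.1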
You can pass this schedule directly into a Keras optimizer as the learning_rate.
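A minimal sketch of that usage, assuming the standard Keras R interface (optimizer_sgd(), keras_model_sequential(), layer_dense(), and compile(), as exported by the keras package):

library(keras)

learning_rate_fn <- learning_rate_schedule_piecewise_constant_decay(
  boundaries = as.integer(c(100000, 110000)),
  values = c(1.0, 0.5, 0.1))

# The schedule object goes where a fixed numeric learning rate would;
# the optimizer evaluates it at its current step during training.
opt <- optimizer_sgd(learning_rate = learning_rate_fn)

model <- keras_model_sequential() %>%
  layer_dense(units = 1)

model %>% compile(
  optimizer = opt,
  loss = "mse"
)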