plot_clr_history: Plot cyclic learning rate callback information


View source: R/clr_callback_plots.R

Description

Plot cyclic learning rate callback information

Usage

plot_clr_history(callback_clr, granularity = "epoch",
  trans_y_axis = "identity")

Arguments

callback_clr

An object of class 'CyclicLR'.

granularity

Either "epoch" or "iteration". We advise to use epoch as we find it easier to work with. The plot will look very similar (except for the x-axis scaling) for both options as long as you choosed step_size in new_callback_cyclical_learning_rate() to be more iterations than one epoch has.

trans_y_axis

Value passed to ggplot2::scale_y_continuous() as its trans argument. Only supported for backend = "ggplot2".

backend

Either "base" for base R or "ggplot2".

Examples

library(keras)

# Load and split the Boston Housing data
dataset <- dataset_boston_housing()
c(c(train_data, train_targets), c(test_data, test_targets)) %<-% dataset

# Standardise predictors with the training-set mean and standard deviation
mean <- apply(train_data, 2, mean)
std <- apply(train_data, 2, sd)
train_data <- scale(train_data, center = mean, scale = std)
test_data <- scale(test_data, center = mean, scale = std)

# Small fully connected regression model
model <- keras_model_sequential() %>%
  layer_dense(
    units = 64, activation = "relu",
    input_shape = dim(train_data)[[2]]
  ) %>%
  layer_dense(units = 64, activation = "relu") %>%
  layer_dense(units = 1)
model %>% compile(
  optimizer = optimizer_rmsprop(lr = 0.001),
  loss = "mse",
  metrics = c("mae")
)

# Cyclic learning rate callback: an exponentially decaying range between
# base_lr and max_lr with a half-cycle (step_size) of 32 iterations
callback_clr <- new_callback_cyclical_learning_rate(
  step_size = 32,
  base_lr = 0.001,
  max_lr = 0.006,
  gamma = 0.99,
  mode = "exp_range"
)

# Train with the callback attached, then inspect and plot its history
model %>% fit(
  train_data, train_targets,
  validation_data = list(test_data, test_targets),
  epochs = 10, verbose = 1,
  callbacks = list(callback_clr)
)
callback_clr$history
plot_clr_history(callback_clr, backend = "base")
