create_optimizer: Create an optimizer training op

View source: R/optimization.R

create_optimizer  R Documentation

Create an optimizer training op

Description

create_optimizer does not return the optimizer object itself; it returns the training operation produced by a tf.group() call.

Usage

create_optimizer(loss, init_lr, num_train_steps, num_warmup_steps, use_tpu)

Arguments

loss

Float Tensor; the loss for this step (calculated elsewhere; in principle it is a function of the trainable parameter values).

init_lr

Numeric; initial learning rate.

num_train_steps

Integer; number of steps to train for.

num_warmup_steps

Integer; number of steps to use for "warm-up".

use_tpu

Logical; whether to use TPU.

Details

See also:

https://www.tensorflow.org/api_docs/python/tf/group

https://stackoverflow.com/questions/41780655/what-is-the-difference-between-tf-group-and-tf-control-dependencies

This function calls tf.gradients() in the course of building the training op. https://www.tensorflow.org/api_docs/python/tf/gradients
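To illustrate what a tf.group() result is, here is a minimal sketch using the tensorflow R package, assuming the TensorFlow 1.x graph-mode API used in the example below (the variable names are hypothetical, not part of create_optimizer):

```r
library(tensorflow)

# tf$group() returns a single op that, when run, runs all of its argument
# ops; it has no output value of its own.
counter <- tf$Variable(0L, name = "counter")
step <- tf$Variable(0L, name = "step")

inc_counter <- tf$assign_add(counter, 1L)
inc_step <- tf$assign_add(step, 1L)

# Running `both` runs both assignments together, analogous to the training
# op returned by create_optimizer(), which groups the parameter update with
# the global-step increment.
both <- tf$group(inc_counter, inc_step)
```

Because the grouped op has no value, it is run purely for its side effects (updating the variables), which is why create_optimizer returns an op to execute rather than an object to inspect.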

Value

A training op: the result of a TensorFlow tf.group() of operations.

Examples

## Not run: 
with(tensorflow::tf$variable_scope("examples",
  reuse = tensorflow::tf$AUTO_REUSE
), {
  totrain <- tensorflow::tf$get_variable(
    "totrain",
    tensorflow::shape(10L, 20L)
  )
  loss <- 2 * totrain

  train_op <- create_optimizer(
    loss = loss,
    init_lr = 0.01,
    num_train_steps = 20L,
    num_warmup_steps = 10L,
    use_tpu = FALSE
  )
})

## End(Not run)

jonathanbratt/RBERT documentation built on Jan. 26, 2023, 4:15 p.m.