create_optimizer

Description:

create_optimizer builds a training op for the given loss. Note that it does not return the optimizer object itself; it returns the operation resulting from a tf.group() call.
Usage:

create_optimizer(loss, init_lr, num_train_steps, num_warmup_steps, use_tpu)
Arguments:

loss: Float Tensor; the loss for this step (calculated elsewhere; in principle it is a function of the trainable parameter values).

init_lr: Numeric; initial learning rate.

num_train_steps: Integer; number of steps to train for.

num_warmup_steps: Integer; number of steps to use for "warm-up" (see the schedule sketch after this list).

use_tpu: Logical; whether to use TPU.
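To make init_lr, num_train_steps, and num_warmup_steps concrete, here is a minimal sketch of a BERT-style learning-rate schedule (linear warm-up followed by linear decay). The helper lr_at_step() is hypothetical and for illustration only; this page does not document the exact schedule create_optimizer uses.

# Hypothetical helper (not part of this package): how init_lr,
# num_train_steps, and num_warmup_steps typically interact in a
# BERT-style schedule.
lr_at_step <- function(step, init_lr, num_train_steps, num_warmup_steps) {
  if (step < num_warmup_steps) {
    init_lr * step / num_warmup_steps        # linear warm-up toward init_lr
  } else {
    init_lr * (1 - step / num_train_steps)   # linear decay toward zero
  }
}

lr_at_step(5L, init_lr = 0.01, num_train_steps = 20L, num_warmup_steps = 10L)
# 0.005: halfway through warm-up, half the initial learning rate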
See also:

https://www.tensorflow.org/api_docs/python/tf/group

tf.gradients() is called in the course of this function: https://www.tensorflow.org/api_docs/python/tf/gradients
Value:

A training op: the result of a TensorFlow group() of operations.
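As an illustration of what such a grouped training op looks like, here is a hedged sketch of the general TensorFlow 1.x pattern (gradients from tf.gradients(), applied by an optimizer, bundled with tf.group()). The stand-in GradientDescentOptimizer, the toy variable, and the variable names are assumptions for illustration, not this function's actual implementation.

library(tensorflow)  # assumes TensorFlow 1.x graph mode, as in the example below

w <- tf$get_variable("w_sketch", tensorflow::shape(3L))     # toy trainable variable
loss <- tf$reduce_sum(tf$square(w))                         # toy scalar loss

tvars <- tf$trainable_variables()
grads <- tf$gradients(loss, tvars)                          # the tf.gradients() call noted above
opt <- tf$train$GradientDescentOptimizer(0.01)              # stand-in optimizer, for illustration
apply_op <- opt$apply_gradients(Map(list, grads, tvars))    # pair each gradient with its variable
global_step <- tf$train$get_or_create_global_step()
train_op <- tf$group(apply_op, global_step$assign_add(1L))  # the grouped "training op"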
Examples:

## Not run:
with(tensorflow::tf$variable_scope("examples",
  reuse = tensorflow::tf$AUTO_REUSE
), {
  totrain <- tensorflow::tf$get_variable(
    "totrain",
    tensorflow::shape(10L, 20L)
  )
  loss <- 2 * totrain
  train_op <- create_optimizer(
    loss = loss,
    init_lr = 0.01,
    num_train_steps = 20L,
    num_warmup_steps = 10L,
    use_tpu = FALSE
  )
})
## End(Not run)
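To actually train, the returned train_op would be run repeatedly; a minimal sketch, assuming the TensorFlow 1.x session-based workflow used in the example above:

sess <- tensorflow::tf$Session()
sess$run(tensorflow::tf$global_variables_initializer())
for (step in 1:20) sess$run(train_op)   # one optimizer step per run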