AdamWeightDecayOptimizer: Constructor for objects of class AdamWeightDecayOptimizer

View source: R/optimization.R


Constructor for objects of class AdamWeightDecayOptimizer

Description

A basic Adam optimizer that includes "correct" L2 weight decay: the decay is applied directly to the weights rather than being folded into the Adam moment estimates (the decoupled scheme popularized as AdamW).
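For reference, a sketch of the update rule under this scheme, assuming the implementation mirrors BERT's original optimization.py (which, notably, omits Adam's bias-correction terms). Here g_t is the gradient, eta the learning_rate, and lambda the weight_decay_rate:

\[
\begin{aligned}
m_t &= \beta_1\, m_{t-1} + (1-\beta_1)\, g_t \\
v_t &= \beta_2\, v_{t-1} + (1-\beta_2)\, g_t^2 \\
\theta_t &= \theta_{t-1} - \eta \left( \frac{m_t}{\sqrt{v_t} + \epsilon} + \lambda\, \theta_{t-1} \right)
\end{aligned}
\]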

Usage

AdamWeightDecayOptimizer(
  learning_rate,
  weight_decay_rate = 0,
  beta_1 = 0.9,
  beta_2 = 0.999,
  epsilon = 1e-06,
  exclude_from_weight_decay = NULL,
  name = "AdamWeightDecayOptimizer"
)

Arguments

learning_rate

Numeric scalar or single-element Tensor; the learning rate.

weight_decay_rate

Numeric; weight decay rate.

beta_1

Numeric; exponential decay rate for the first-moment (mean) estimates in Adam.

beta_2

Numeric; exponential decay rate for the second-moment (uncentered variance) estimates in Adam.

epsilon

Numeric; a small constant added to the denominator of the update to avoid division by very small numbers, which caps the update size.

exclude_from_weight_decay

Character vector; names of parameters to exclude from weight decay (see the sketch after this list).

name

Character; the name of the constructed object.
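As an illustration of exclude_from_weight_decay, a minimal sketch constructing an optimizer that skips decay for LayerNorm and bias parameters. The name patterns here follow the convention in BERT's training code and are assumptions, not RBERT defaults:

optimizer <- AdamWeightDecayOptimizer(
  learning_rate = 2e-5,
  weight_decay_rate = 0.01,
  # Assumed name patterns; adjust to match the variable names in your graph.
  exclude_from_weight_decay = c("LayerNorm", "layer_norm", "bias")
)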

Details

Inherits from class tf.train.Optimizer; see https://devdocs.io/tensorflow~python/tf/train/optimizer.

Value

An object of class "AdamWeightDecayOptimizer", which is a (hacky) modification of the tf.train.Optimizer class.

Examples

## Not run: 
with(tensorflow::tf$variable_scope("examples",
  reuse = tensorflow::tf$AUTO_REUSE
), {
  optimizer <- AdamWeightDecayOptimizer(learning_rate = 0.01)
})

## End(Not run)
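A slightly fuller sketch of a TF1-style training step on a toy quadratic loss. It assumes the returned object exposes an apply_gradients() method analogous to the Python BERT optimizer it ports (check R/optimization.R to confirm):

## Not run: 
with(tensorflow::tf$variable_scope("examples",
  reuse = tensorflow::tf$AUTO_REUSE
), {
  x <- tensorflow::tf$get_variable(
    "x",
    shape = tensorflow::shape(),
    initializer = tensorflow::tf$zeros_initializer()
  )
  loss <- (x - 1)^2  # toy quadratic loss, minimized at x = 1
  optimizer <- AdamWeightDecayOptimizer(
    learning_rate = 0.01,
    weight_decay_rate = 0.01
  )
  grads <- tensorflow::tf$gradients(loss, list(x))
  # apply_gradients() is assumed to take a list of (gradient, variable)
  # pairs, mirroring the Python signature.
  train_op <- optimizer$apply_gradients(list(list(grads[[1]], x)))
})

## End(Not run)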
