AdamWeightDecayOptimizer | R Documentation
A basic Adam optimizer that includes "correct" (decoupled) L2 weight decay.
Usage

AdamWeightDecayOptimizer(
  learning_rate,
  weight_decay_rate = 0,
  beta_1 = 0.9,
  beta_2 = 0.999,
  epsilon = 1e-06,
  exclude_from_weight_decay = NULL,
  name = "AdamWeightDecayOptimizer"
)
Arguments

learning_rate
    Numeric Tensor (single element); the learning rate.

weight_decay_rate
    Numeric; the weight decay rate.

beta_1
    Numeric; the exponential decay rate for the first-moment estimate in Adam.

beta_2
    Numeric; the exponential decay rate for the second-moment estimate in Adam.

epsilon
    Numeric; a small constant added to the denominator of the update to avoid division by very small numbers, which also caps the update size.

exclude_from_weight_decay
    Character vector; names of parameters to exclude from weight decay.

name
    Character; the name of the constructed object.
Details

Inherits from class tf.train.Optimizer (https://devdocs.io/tensorflow~python/tf/train/optimizer).

Value

An object of class "AdamWeightDecayOptimizer", which is a (hacky) modification of the tf.train.Optimizer class.
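To make "correct" weight decay concrete, the per-parameter update can be sketched as follows. This is illustrative R-style pseudocode using the argument names above, not the package's literal code; in particular, like the BERT reference optimizer this implementation appears to be based on, it omits Adam's bias-correction terms.

```
# Adam moment estimates (m, v initialized to zero tensors)
m <- beta_1 * m + (1 - beta_1) * grad
v <- beta_2 * v + (1 - beta_2) * grad^2

# Decoupled weight decay: the decay term is added to the update
# directly, rather than folded into the gradient as in plain L2
# regularization. Parameters matching exclude_from_weight_decay
# skip the weight_decay_rate * param term.
update <- m / (sqrt(v) + epsilon) + weight_decay_rate * param
param  <- param - learning_rate * update
```

Adding the decay term after the adaptive scaling is what distinguishes this from naive L2 regularization under Adam, where the decay would otherwise be rescaled by the second-moment estimate.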
Examples

## Not run: 
with(tensorflow::tf$variable_scope("examples",
  reuse = tensorflow::tf$AUTO_REUSE
), {
  optimizer <- AdamWeightDecayOptimizer(learning_rate = 0.01)
})
## End(Not run)
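As a further sketch, the construction below enables weight decay and excludes some parameters from it. The weight_decay_rate value and the exclusion names are illustrative assumptions (LayerNorm and bias terms are conventionally excluded in BERT-style training); they are not guaranteed to match the variable names in your graph.

```
## Not run: 
library(tensorflow)
with(tf$variable_scope("examples", reuse = tf$AUTO_REUSE), {
  optimizer <- AdamWeightDecayOptimizer(
    learning_rate = 0.01,
    weight_decay_rate = 0.01,  # illustrative decay strength
    # Parameters whose names contain these strings skip weight decay.
    exclude_from_weight_decay = c("LayerNorm", "layer_norm", "bias")
  )
})
## End(Not run)
```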