selectKerasOptimizer: Select keras optimizer

View source: R/kerasOptimizer.R


Description

Select one of the following optimizers: "SGD", "RMSPROP", "ADAGRAD", "ADADELTA", "ADAM", "ADAMAX", "NADAM".

Usage

selectKerasOptimizer(
  optimizer,
  learning_rate = 0.01,
  momentum = 0,
  decay = 0,
  nesterov = FALSE,
  clipnorm = NULL,
  clipvalue = NULL,
  rho = 0.9,
  epsilon = NULL,
  beta_1 = 0.9,
  beta_2 = 0.999,
  amsgrad = FALSE,
  ...
)
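
For orientation, the following is a minimal sketch of the kind of integer-to-constructor dispatch this function performs, assuming the standard keras R optimizer constructors (optimizer_sgd() and friends); the actual implementation in R/kerasOptimizer.R may differ in detail:

library(keras)

## Sketch only: map the integer code to a keras optimizer constructor.
## switch() with a numeric first argument evaluates and returns its n-th
## alternative, so only the selected constructor is called. Arguments
## passed via ... must be valid for the chosen constructor.
selectOptimizerSketch <- function(optimizer, learning_rate = 0.01, ...) {
  switch(optimizer,
    optimizer_sgd(learning_rate = learning_rate, ...),      # 1 = SGD
    optimizer_rmsprop(learning_rate = learning_rate, ...),  # 2 = RMSPROP
    optimizer_adagrad(learning_rate = learning_rate, ...),  # 3 = ADAGRAD
    optimizer_adadelta(learning_rate = learning_rate, ...), # 4 = ADADELTA
    optimizer_adam(learning_rate = learning_rate, ...),     # 5 = ADAM
    optimizer_adamax(learning_rate = learning_rate, ...),   # 6 = ADAMAX
    optimizer_nadam(learning_rate = learning_rate, ...)     # 7 = NADAM
  )
}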

Arguments

optimizer

integer specifying the optimization algorithm. Can be one of the following: 1=SGD, 2=RMSPROP, 3=ADAGRAD, 4=ADADELTA, 5=ADAM, 6=ADAMAX, or 7=NADAM.

## SGD:

learning_rate

float >= 0. Learning rate.

momentum

float >= 0. Parameter that accelerates SGD in the relevant direction and dampens oscillations.

decay

float >= 0. Learning rate decay over each update.

nesterov

boolean. Whether to apply Nesterov momentum.

clipnorm

Gradients will be clipped when their L2 norm exceeds this value.

clipvalue

Gradients will be clipped when their absolute value exceeds this value.

## RMSPROP:

rho

float >= 0. Decay factor (discounting factor for the running average of squared gradients).

epsilon

float >= 0. Fuzz factor. If 'NULL', defaults to 'k_epsilon()'.

## ADAM:

beta_1

The exponential decay rate for the 1st moment estimates. float, 0 < beta_1 < 1. Generally close to 1.

beta_2

The exponential decay rate for the 2nd moment estimates. float, 0 < beta_2 < 1. Generally close to 1. See the moment-update sketch after this argument list.

amsgrad

Whether to apply the AMSGrad variant of this algorithm from the paper "On the Convergence of Adam and Beyond".

...

Unused, present only for backwards compatibility.
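
As referenced under beta_2 above, here is a short illustrative sketch of the moment updates that beta_1 and beta_2 control in Adam (the standard algorithm of Kingma and Ba, 2015; shown for intuition only, not taken from the SPOTMisc source):

## One Adam moment-update step for a gradient g (illustrative arithmetic).
adam_moments <- function(m, v, g, beta_1 = 0.9, beta_2 = 0.999) {
  m <- beta_1 * m + (1 - beta_1) * g    # 1st moment: running mean of gradients
  v <- beta_2 * v + (1 - beta_2) * g^2  # 2nd moment: running mean of squared gradients
  list(m = m, v = v)
}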

Value

An optimizer object for use with compile.keras.engine.training.Model().
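
Examples

A hedged usage sketch: select ADAM (code 5) and pass the result to keras's compile(). The model below is an illustrative placeholder; selectKerasOptimizer() is the only SPOTMisc call.

library(SPOTMisc)
library(keras)

## Illustrative model; any keras model works here.
model <- keras_model_sequential() %>%
  layer_dense(units = 16, activation = "relu", input_shape = c(10)) %>%
  layer_dense(units = 1, activation = "sigmoid")

## 5 = ADAM; beta_1/beta_2 are the Adam moment-decay rates.
opt <- selectKerasOptimizer(
  optimizer = 5,
  learning_rate = 0.001,
  beta_1 = 0.9,
  beta_2 = 0.999
)

model %>% compile(
  optimizer = opt,
  loss = "binary_crossentropy",
  metrics = "accuracy"
)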

