Description
View source: R/attention-layers.R
An R create_layer() wrapper for the Keras LayerNormalization() layer.
Usage
layer_normalization(
  object,
  axis = -1L,
  epsilon = 0.001,
  center = TRUE,
  scale = TRUE,
  beta_initializer = "zeros",
  gamma_initializer = "ones",
  beta_regularizer = NULL,
  gamma_regularizer = NULL,
  beta_constraint = NULL,
  gamma_constraint = NULL,
  trainable = TRUE,
  name = NULL
)
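A minimal usage sketch, assuming the package exporting layer_normalization() is attached together with keras; the input shape and argument values are illustrative only, not taken from the documentation above.

library(keras)
# library(<package exporting layer_normalization>)  # assumed to be attached

# Illustrative input: sequences of 10 steps with 64 features per step
inputs <- layer_input(shape = c(10, 64))

# Normalize over the last (feature) axis, matching the defaults shown in Usage
outputs <- inputs %>%
  layer_normalization(axis = -1L, epsilon = 0.001)

model <- keras_model(inputs, outputs)
summary(model)

Because the wrapper is built on create_layer(), it composes with other keras layers via the pipe in the usual way, with object as the first argument.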