Weight Normalization layer
Usage

layer_weight_normalization(object, layer, data_init = TRUE, ...)
Arguments

object      Model or layer object.
layer       A layer instance to be wrapped.
data_init   If 'TRUE', use data-dependent variable initialization.
...         Additional parameters to pass.
Details

This wrapper reparameterizes a layer by decoupling the weight's magnitude and direction, which speeds up convergence by improving the conditioning of the optimization problem. The wrapper works for both Keras and TensorFlow layers. Reference: Tim Salimans and Diederik P. Kingma (2016), "Weight Normalization: A Simple Reparameterization to Accelerate Training of Deep Neural Networks", https://arxiv.org/abs/1602.07868.
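The decoupling described above can be written as w = g * v / ||v||, where v carries the direction and the scalar g carries the magnitude. Below is a minimal numeric sketch of that identity (in Python/NumPy, purely for illustration; it is not the package's implementation, and the variable names are hypothetical):

```python
import numpy as np

# Weight normalization reparameterizes a weight vector as
#   w = g * v / ||v||
# so that the direction comes from v and the magnitude from g.
rng = np.random.default_rng(0)
v = rng.normal(size=5)          # unconstrained direction parameter
g = 3.0                         # scalar magnitude parameter

w = g * v / np.linalg.norm(v)   # reparameterized weight vector

# The norm of w equals g exactly, so magnitude is decoupled from direction.
print(np.allclose(np.linalg.norm(w), g))
```

Gradient descent on (g, v) instead of w directly is what improves the conditioning of the optimization problem.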
Value

A tensor.
Examples

## Not run:
model <- keras_model_sequential() %>%
  layer_weight_normalization(
    layer_conv_2d(filters = 2, kernel_size = 2, activation = 'relu'),
    input_shape = c(32L, 32L, 3L))
model
## End(Not run)