layer_weight_normalization: Weight Normalization layer


View source: R/layers.R

Description

Weight Normalization layer

Usage

layer_weight_normalization(object, layer, data_init = TRUE, ...)

Arguments

object

Model or layer object

layer

a layer instance.

data_init

If 'TRUE', use data-dependent variable initialization.

...

additional parameters to pass (e.g. 'input_shape').

Details

This wrapper reparameterizes a layer by decoupling the weight's magnitude and direction. This speeds up convergence by improving the conditioning of the optimization problem. The WeightNormalization wrapper works for both Keras and TensorFlow layers. Reference: Tim Salimans and Diederik P. Kingma (2016), "Weight Normalization: A Simple Reparameterization to Accelerate Training of Deep Neural Networks", https://arxiv.org/abs/1602.07868.
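
A minimal sketch of this idea in practice (an illustrative example, not part of the package documentation): the wrapper is applied to a layer instance before it is added to a model, here a dense layer on toy data. It assumes the keras and tfaddons packages are installed with a working TensorFlow backend; the layer sizes and data below are arbitrary.

library(keras)
library(tfaddons)

# Wrap a dense layer so its kernel is expressed as magnitude * direction,
# which is what weight normalization reparameterizes.
model <- keras_model_sequential() %>%
  layer_weight_normalization(
    layer_dense(units = 16, activation = 'relu'),
    input_shape = c(8L)) %>%
  layer_dense(units = 1)

model %>% compile(optimizer = 'adam', loss = 'mse')

# Toy data: 100 samples with 8 features (illustrative values only).
x <- matrix(rnorm(100 * 8), ncol = 8)
y <- rnorm(100)
model %>% fit(x, y, epochs = 2, verbose = 0)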

Value

A tensor

Examples

## Not run: 

model <- keras_model_sequential() %>%
  layer_weight_normalization(
    layer_conv_2d(filters = 2, kernel_size = 2, activation = 'relu'),
    input_shape = c(32L, 32L, 3L))
model



## End(Not run)
