layer_glu: Gated Linear Unit

Description Usage References

View source: R/glu.R

Description

Computes the hidden layer h_l as: h_l(X) = (X ∗ W + b) ⊗ σ(X ∗ V + c), where σ is the logistic sigmoid, ⊗ denotes element-wise multiplication, and W, V, b, c are learned parameters. The second term acts as a gate that controls how much of the linear term (X ∗ W + b) passes through.
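
The gating formula above can be checked numerically in base R. This is a minimal sketch of the computation only (not the convolutional layer itself); the identity weight matrices and zero biases are chosen purely for illustration:

```r
# Element-wise GLU gating: h(X) = (X %*% W + b) * sigmoid(X %*% V + c)
sigmoid <- function(z) 1 / (1 + exp(-z))

X <- matrix(c(1, 2, 3, 4), nrow = 2)  # toy 2 x 2 input
W <- diag(2); V <- diag(2)            # identity weights, for illustration only
b <- 0; c <- 0

h <- (X %*% W + b) * sigmoid(X %*% V + c)
# With identity weights, each entry is x * sigmoid(x):
# values near 0 are suppressed, large positive values pass almost unchanged.
print(h)
```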

Usage

layer_glu(
  object,
  filters = 32,
  kernel_size = 3,
  kernel_initializer = "glorot_normal",
  kernel_regularizer = NULL,
  bias_initializer = "zeros",
  bias_regularizer = NULL,
  name = NULL,
  trainable = TRUE
)
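
A hedged usage sketch, assuming R2deepR is installed alongside the keras R package and that `layer_glu` composes with the `%>%` pipe like other keras layers (the input shape below is hypothetical):

```r
library(keras)
library(R2deepR)

# Hypothetical 1-D sequence input: 128 timesteps, 16 channels
input  <- layer_input(shape = c(128, 16))
output <- input %>%
  layer_glu(filters = 32, kernel_size = 3)

model <- keras_model(input, output)
```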

References

Dauphin, Y. N., Fan, A., Auli, M., and Grangier, D. (2017). Language Modeling with Gated Convolutional Networks. https://arxiv.org/pdf/1612.08083.pdf


ifrit98/R2deepR documentation built on June 19, 2020, 7:45 a.m.