Gaussian Error Linear Unit
Usage

layer_activation_gelu(object, approximate = TRUE, ...)
Arguments

object        Model or layer object
approximate   (bool) Whether to apply the approximation
...           Additional parameters to pass
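A hedged usage sketch, not part of the documentation above: it assumes the 'keras' R package plus a package that exports layer_activation_gelu() (for example, tfaddons) are installed with a working TensorFlow backend; the layer sizes and input shape are arbitrary illustrations.

library(keras)
library(tfaddons)  # assumed provider of layer_activation_gelu()

model <- keras_model_sequential() %>%
  layer_dense(units = 64, input_shape = c(784)) %>%  # input_shape excludes the samples axis
  layer_activation_gelu(approximate = TRUE) %>%      # GELU with the approximation enabled
  layer_dense(units = 10, activation = "softmax")

summary(model)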
Details

A smoother version of ReLU, generally used in BERT and other BERT-based architectures. Original paper: https://arxiv.org/abs/1606.08415
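For intuition, the exact GELU is x * pnorm(x) (x times the standard normal CDF), and approximate = TRUE is expected to switch to the tanh approximation given in the paper. A small base-R sketch comparing the two; this is an illustration, not the layer's actual implementation:

gelu_exact <- function(x) x * pnorm(x)  # x * Phi(x)
gelu_tanh  <- function(x) {             # tanh approximation from the paper
  0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x^3)))
}

x <- seq(-3, 3, by = 1)
round(cbind(x = x, exact = gelu_exact(x), approx = gelu_tanh(x)), 4)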
Value

A tensor
Note

Input shape: Arbitrary. Use the keyword argument 'input_shape' (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.

Output shape: Same shape as the input.