layer_activation_gelu: Gaussian Error Linear Unit

View source: R/layers.R

layer_activation_gelu  R Documentation

Gaussian Error Linear Unit

Description

Gaussian Error Linear Unit

Usage

layer_activation_gelu(object, approximate = TRUE, ...)

Arguments

object

Model or layer object

approximate

(bool) Whether to use the approximate (tanh-based) formulation instead of the exact one

...

Additional parameters passed on to the underlying layer

Details

A smoother version of ReLU, generally used in BERT and BERT-architecture-based models. Original paper: https://arxiv.org/abs/1606.08415
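
The approximate argument switches between the exact form, x * pnorm(x), and the tanh approximation from the paper. A minimal base-R sketch of the two formulations, for illustration only (the layer itself computes this on tensors):

# Exact GELU: x times the standard normal CDF
gelu_exact <- function(x) x * pnorm(x)

# Tanh-based approximation used when approximate = TRUE
gelu_approx <- function(x) 0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x^3)))

x <- seq(-3, 3, by = 0.5)
round(gelu_exact(x) - gelu_approx(x), 4)  # the two agree closely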

Value

A tensor

Note

Input shape: Arbitrary. Use the keyword argument 'input_shape' (tuple of integers, does not include the samples axis) when using this layer as the first layer in a model.

Output shape: Same shape as the input.
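
Examples

A minimal sketch, assuming the keras and tfaddons packages are installed and a working TensorFlow backend is available; layer and argument names follow the Usage section above.

library(keras)
library(tfaddons)

# Small sequential model with a GELU activation after a dense layer
model <- keras_model_sequential() %>%
  layer_dense(units = 32, input_shape = c(16)) %>%
  layer_activation_gelu(approximate = TRUE) %>%
  layer_dense(units = 1)

summary(model)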

