gelu: Gaussian Error Linear Unit. This is a smoother version of the ReLU.

Description Usage Examples

View source: R/activation.R

Description

Gaussian Error Linear Unit (GELU), a smoother version of the ReLU. Original paper: https://arxiv.org/abs/1606.08415

Args: x, a float Tensor to which the activation is applied.

Returns: x with the GELU activation applied.

Usage

gelu(x)
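
Examples

A minimal numeric sketch of the GELU, assuming x is a plain numeric vector rather than the float Tensor the package function operates on; the helper names gelu_exact and gelu_tanh are illustrative and not part of layerR.

# Exact form: x * Phi(x), where Phi is the standard normal CDF.
gelu_exact <- function(x) {
  x * pnorm(x)
}

# Tanh approximation from the original paper
# (https://arxiv.org/abs/1606.08415).
gelu_tanh <- function(x) {
  0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x^3)))
}

gelu_exact(c(-1, 0, 1))   # approximately -0.159, 0.000, 0.841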
