gelu: Gaussian Error Linear Unit. This is a smoother version of the ReLU.

Description Usage

View source: R/activation.R

Description

Gaussian Error Linear Unit (GELU). This is a smoother version of the ReLU. Original paper: https://arxiv.org/abs/1606.08415

Arguments: x — a float Tensor on which to apply the activation.

Value: x with the GELU activation applied.

Usage

gelu(x)
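For reference, the exact GELU can be sketched in plain R. Note this is a hypothetical numeric illustration, not the package's TensorFlow-backed implementation: GELU(x) = x · Φ(x), where Φ is the standard normal CDF (`pnorm` in base R).

```r
# Numeric sketch of the exact GELU (assumption: illustrative only,
# not the R2deepR implementation, which operates on TensorFlow tensors).
gelu_numeric <- function(x) {
  # Phi(x) = standard normal CDF; x * Phi(x) is the exact GELU
  x * pnorm(x)
}

gelu_numeric(0)   # 0: the activation passes zero through unchanged
gelu_numeric(3)   # close to 3: near-identity for large positive inputs
```

Unlike ReLU, which is exactly zero for all negative inputs, GELU lets small negative values through with a smoothly decaying weight, which is the "smoother" behavior the description refers to.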

ifrit98/R2deepR documentation built on June 19, 2020, 7:45 a.m.