layer_activation_gelu: keras lambda layer Gaussian Error Linear Unit. This is a smoother version of the ReLU.


View source: R/activation.R

Description

keras lambda layer implementing the Gaussian Error Linear Unit (GELU), a smoother version of the ReLU. Original paper: https://arxiv.org/abs/1606.08415. The wrapped activation takes a float Tensor x and returns x with the GELU activation applied.
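
For reference, the cited paper gives a tanh-based approximation of GELU. A minimal sketch in plain R, assuming the package follows this common approximation (the exact implementation lives in R/activation.R and may differ):

gelu <- function(x) {
  # tanh approximation from Hendrycks & Gimpel (2016):
  # GELU(x) ~ 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
  0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x^3)))
}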

Usage

layer_activation_gelu(object, name = "gelu")
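
A minimal usage sketch, assuming R2deepR is attached alongside keras and that the layer composes into a model via the pipe (the surrounding layers and their sizes are illustrative, not from the package):

library(keras)
library(R2deepR)

model <- keras_model_sequential() %>%
  layer_dense(units = 64, input_shape = 32) %>%
  layer_activation_gelu(name = "gelu") %>%
  layer_dense(units = 10, activation = "softmax")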
