gelu: Gaussian Error Linear Unit

View source: R/modeling.R


Gaussian Error Linear Unit

Description

A smoother version of the ReLU activation. Original paper: https://arxiv.org/abs/1606.08415
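
For reference, a minimal base-R sketch of the tanh approximation given in the paper (this is an illustration only, not the package's implementation, which operates on TensorFlow tensors; the helper name gelu_approx is hypothetical):

# GELU tanh approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
gelu_approx <- function(x) {
  0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x^3)))
}
gelu_approx(c(-1, 0, 1))  # approx. -0.159, 0, 0.841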

Usage

gelu(x)

Arguments

x

A float Tensor to apply the activation to.

Value

'x' with the GELU activation applied.

Examples

## Not run: 
# create a 10-element float variable tensor (TF1-style API)
tfx <- tensorflow::tf$get_variable("none", tensorflow::shape(10L))
gelu(tfx)

## End(Not run)
