nn_gelu: GELU module

nn_gelu {torch}    R Documentation

GELU module

Description

Applies the Gaussian Error Linear Units function:

GELU(x) = x * Φ(x)

Usage

nn_gelu(approximate = "none")

Arguments

approximate

the GELU approximation algorithm to use: 'none' (the exact formulation) or 'tanh' (the tanh approximation). Default: 'none'.

Details

where Φ(x) is the cumulative distribution function of the standard Gaussian distribution.

When approximate = 'tanh', GELU is estimated with:

GELU(x) ≈ 0.5 * x * (1 + tanh(√(2/π) * (x + 0.044715 * x^3)))
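Because Φ is the standard normal CDF, both formulations can be checked directly in base R without torch. A minimal sketch (`gelu_exact` and `gelu_tanh` are illustrative helper names, not part of the torch API):

```r
# Exact GELU: x * Phi(x), using pnorm() as the standard normal CDF.
gelu_exact <- function(x) x * pnorm(x)

# Tanh approximation corresponding to approximate = "tanh".
gelu_tanh <- function(x) {
  0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x^3)))
}

x <- seq(-4, 4, by = 0.01)
# The two curves agree closely over this range.
max(abs(gelu_exact(x) - gelu_tanh(x)))
```

The approximation trades a small numerical error for avoiding the erf-based CDF, which can be cheaper on some hardware.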

Shape

  • Input: (N, *), where * means any number of additional dimensions

  • Output: (N, *), same shape as the input

Examples

if (torch_is_installed()) {
m <- nn_gelu()
input <- torch_randn(2)
output <- m(input)
}

torch documentation built on May 29, 2024, 9:54 a.m.