nn_silu: Applies the Sigmoid Linear Unit (SiLU) function, element-wise

nn_silu    R Documentation

Applies the Sigmoid Linear Unit (SiLU) function, element-wise. The SiLU function is also known as the swish function.

Description

Applies the Sigmoid Linear Unit (SiLU) function, element-wise. The SiLU function is also known as the swish function.
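
Concretely, the function weights each input element by its logistic sigmoid:

silu(x) = x * sigma(x)

where sigma(x) = 1 / (1 + exp(-x)).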

Usage

nn_silu(inplace = FALSE)

Arguments

inplace

if TRUE, the operation is done in-place. Default: FALSE

Details

See Gaussian Error Linear Units (GELUs), where the SiLU (Sigmoid Linear Unit) was originally coined, and see Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning and Swish: a Self-Gated Activation Function, where the SiLU was experimented with later.
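
Examples

A minimal usage sketch (not part of the original help page): it builds the module, applies it to a random tensor, and checks the result against the definition silu(x) = x * sigmoid(x) using torch_sigmoid().

library(torch)

# create the SiLU module and apply it element-wise to a random tensor
m <- nn_silu()
x <- torch_randn(3)
y <- m(x)

# SiLU weights each element by its logistic sigmoid, so this should match y
y_manual <- x * torch_sigmoid(x)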

