Gaussian Error Linear Unit (GELU). This is a smoother version of the ReLU. Original paper: https://arxiv.org/abs/1606.08415

Args:
  x: float Tensor on which to apply the activation.

Returns:
  x with the GELU activation applied.
Usage:

  gelu(x)