| activation_silu | R Documentation | 
The Swish (or Silu) activation function is a smooth, non-monotonic function that is unbounded above and bounded below.
It is defined as: swish(x) = x * sigmoid(x).
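As a point of reference, the definition above can be reproduced in plain base R with no Keras dependency; the helper name silu_reference() below is hypothetical and used only to illustrate the formula.

```r
# Minimal base-R sketch of the definition: swish(x) = x * sigmoid(x),
# where sigmoid(x) = 1 / (1 + exp(-x)).
silu_reference <- function(x) x * (1 / (1 + exp(-x)))

silu_reference(c(-6, -1, 0, 1, 6))
# Large negative inputs are squashed toward 0 (bounded below),
# while large positive inputs approach x itself (unbounded above).
```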
activation_silu(x)
| x | Input tensor. | 
A tensor, the result of applying the activation to the input tensor x.
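A minimal usage sketch, assuming the keras3 package is installed with a working backend; the input values are arbitrary sample numbers.

```r
library(keras3)

# Convert a plain numeric vector to a backend tensor and apply the activation.
x <- op_convert_to_tensor(c(-6, -1, 0, 1, 6))
activation_silu(x)

# The function can also be passed to a layer's `activation` argument,
# e.g. layer_dense(units = 32, activation = activation_silu).
```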
Other activations: 
activation_celu() 
activation_elu() 
activation_exponential() 
activation_gelu() 
activation_glu() 
activation_hard_shrink() 
activation_hard_sigmoid() 
activation_hard_tanh() 
activation_leaky_relu() 
activation_linear() 
activation_log_sigmoid() 
activation_log_softmax() 
activation_mish() 
activation_relu() 
activation_relu6() 
activation_selu() 
activation_sigmoid() 
activation_soft_shrink() 
activation_softmax() 
activation_softplus() 
activation_softsign() 
activation_sparse_plus() 
activation_sparsemax() 
activation_squareplus() 
activation_tanh() 
activation_tanh_shrink() 
activation_threshold() 