activation_hard_shrink — R Documentation
It is defined as:
hard_shrink(x) = x if |x| > threshold,
hard_shrink(x) = 0 otherwise.
activation_hard_shrink(x, threshold = 0.5)
x: Input tensor.
threshold: Threshold value. Defaults to 0.5.
A tensor, the result from applying the activation to the input tensor x.
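As a numeric sketch of the definition above (illustrative only, not part of the R package), the hard-shrink rule can be written directly with NumPy:

```python
import numpy as np

def hard_shrink(x, threshold=0.5):
    # Pass values through where |x| > threshold; zero out the rest,
    # matching: hard_shrink(x) = x if |x| > threshold, else 0.
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) > threshold, x, 0.0)

# Values with magnitude at or below the threshold are set to zero.
print(hard_shrink([-1.0, -0.4, 0.5, 0.7]))
```

Note that the comparison is strict, so an input exactly equal to the threshold (here 0.5) is zeroed out.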
Other activations: 
activation_celu() 
activation_elu() 
activation_exponential() 
activation_gelu() 
activation_glu() 
activation_hard_sigmoid() 
activation_hard_tanh() 
activation_leaky_relu() 
activation_linear() 
activation_log_sigmoid() 
activation_log_softmax() 
activation_mish() 
activation_relu() 
activation_relu6() 
activation_selu() 
activation_sigmoid() 
activation_silu() 
activation_soft_shrink() 
activation_softmax() 
activation_softplus() 
activation_softsign() 
activation_sparse_plus() 
activation_sparsemax() 
activation_squareplus() 
activation_tanh() 
activation_tanh_shrink() 
activation_threshold() 