activation_log_sigmoid                R Documentation

Log-sigmoid activation function

Description:

     It is defined as: f(x) = log(1 / (1 + exp(-x))).
Usage:

     activation_log_sigmoid(x)
Arguments:

       x: Input tensor.
Value:

     A tensor, the result of applying the activation to the input
     tensor x.
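Elementwise, the returned value equals log(sigmoid(x)). As a hedged illustration of the math only (a Python sketch, not the Keras implementation), the scalar computation can be written in a numerically stable form, min(x, 0) - log1p(exp(-|x|)), which avoids overflow in exp() for large negative inputs:

```python
import math

def log_sigmoid(x):
    """Numerically stable log(1 / (1 + exp(-x))) for a scalar x.

    Illustrative sketch only; the actual backend computes this on tensors.
    """
    # min(x, 0) - log1p(exp(-|x|)) is algebraically equal to
    # log(1 / (1 + exp(-x))) but never calls exp() on a large argument.
    return min(x, 0.0) - math.log1p(math.exp(-abs(x)))

print(log_sigmoid(0.0))      # -log(2), about -0.6931
print(log_sigmoid(-1000.0))  # about -1000, with no overflow
```

The naive form -log(1 + exp(-x)) overflows for strongly negative x, which is why stable implementations use the rewritten expression above.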
See Also:

     Other activations:
activation_celu() 
activation_elu() 
activation_exponential() 
activation_gelu() 
activation_glu() 
activation_hard_shrink() 
activation_hard_sigmoid() 
activation_hard_tanh() 
activation_leaky_relu() 
activation_linear() 
activation_log_softmax() 
activation_mish() 
activation_relu() 
activation_relu6() 
activation_selu() 
activation_sigmoid() 
activation_silu() 
activation_soft_shrink() 
activation_softmax() 
activation_softplus() 
activation_softsign() 
activation_sparse_plus() 
activation_sparsemax() 
activation_squareplus() 
activation_tanh() 
activation_tanh_shrink() 
activation_threshold() 