| activation_log_softmax | R Documentation | 
Log-softmax activation function: the logarithm of the softmax of the input.
Each input vector is handled independently.
The axis argument sets which axis of the input the function
is applied along.
activation_log_softmax(x, axis = -1L)
| x | Input tensor. | 
| axis | Integer, axis along which the log-softmax is applied. | 
A tensor, the result from applying the activation to the input tensor x.
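A minimal usage sketch, assuming the function comes from the keras3 package with a working Keras backend installed; op_convert_to_tensor(), op_exp(), and op_sum() are keras3 ops used here only for illustration:

library(keras3)

# Apply log-softmax along the last axis of a 2 x 3 input.
x <- op_convert_to_tensor(rbind(c(1, 2, 3),
                                c(1, 1, 1)))
y <- activation_log_softmax(x, axis = -1L)
y

# Exponentiating the result recovers ordinary softmax probabilities,
# so each row of op_exp(y) sums to 1.
op_sum(op_exp(y), axis = -1L)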
Other activations: 
activation_celu() 
activation_elu() 
activation_exponential() 
activation_gelu() 
activation_glu() 
activation_hard_shrink() 
activation_hard_sigmoid() 
activation_hard_tanh() 
activation_leaky_relu() 
activation_linear() 
activation_log_sigmoid() 
activation_mish() 
activation_relu() 
activation_relu6() 
activation_selu() 
activation_sigmoid() 
activation_silu() 
activation_soft_shrink() 
activation_softmax() 
activation_softplus() 
activation_softsign() 
activation_sparse_plus() 
activation_sparsemax() 
activation_squareplus() 
activation_tanh() 
activation_tanh_shrink() 
activation_threshold() 