# nn_softmax: Softmax module In torch: Tensors and Neural Networks with 'GPU' Acceleration


## Softmax module

### Description

Applies the Softmax function to an n-dimensional input Tensor, rescaling it so that the elements of the n-dimensional output Tensor lie in the range [0, 1] and sum to 1. Softmax is defined in the Details section below.

### Usage

nn_softmax(dim)


### Arguments

 dim (int): A dimension along which Softmax will be computed (so every slice along dim will sum to 1).

### Details

\mbox{Softmax}(x_{i}) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}

When the input Tensor is a sparse tensor, the unspecified values are treated as -Inf.
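The definition above can be sketched in base R without torch. The `softmax` helper name below is ours, chosen for illustration; subtracting the maximum before exponentiating is the standard numerical-stability trick and does not change the result:

```r
# Base-R sketch of the softmax definition (illustrative, not torch code)
softmax <- function(x) {
  z <- exp(x - max(x))  # subtract max(x) to avoid overflow; result is unchanged
  z / sum(z)
}

p <- softmax(c(1, 2, 3))
sum(p)  # the rescaled values sum to 1
```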

### Value

A Tensor of the same dimension and shape as the input, with values in the range [0, 1].

### Shape

• Input: (*), where * means any number of additional dimensions

• Output: (*), same shape as the input

### Note

This module doesn't work directly with nn_nll_loss(), which expects log-probabilities as input (i.e. a Log computed after the Softmax). Use nn_log_softmax() instead: it's faster and has better numerical properties.
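A minimal sketch of the recommended pairing, assuming the torch package is installed; the tensor shapes and class count here are arbitrary examples:

```r
library(torch)  # assumes the torch package is installed

if (torch_is_installed()) {
  # Pair nn_log_softmax with nn_nll_loss for classification:
  # nn_nll_loss expects log-probabilities, which nn_log_softmax produces
  # in one numerically stable step.
  log_sm <- nn_log_softmax(dim = 2)
  loss_fn <- nn_nll_loss()

  logits <- torch_randn(4, 3)  # 4 samples, 3 classes (illustrative sizes)
  targets <- torch_tensor(c(1, 2, 3, 1), dtype = torch_long())

  loss <- loss_fn(log_sm(logits), targets)
}
```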

### Examples

if (torch_is_installed()) {
  m <- nn_softmax(dim = 1)
  input <- torch_randn(2, 3)
  # output has the same shape as input; every slice along dim 1 sums to 1
  output <- m(input)
}
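To check the sums-to-1 guarantee along `dim` numerically, a short sketch (assuming the torch package is installed; the 2 x 3 shape is an arbitrary example):

```r
library(torch)  # assumes the torch package is installed

if (torch_is_installed()) {
  m <- nn_softmax(dim = 2)
  input <- torch_randn(2, 3)
  output <- m(input)
  # every slice along dim 2 (here, each row) sums to 1
  output$sum(dim = 2)
}
```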


torch documentation built on Jan. 24, 2023, 1:05 a.m.