softmax: Softmax and logsoftmax functions and their inverse functions

View source: R/softmax.R

softmax {utilities}    R Documentation

Softmax and logsoftmax functions and their inverse functions

Description

softmax returns the value of the softmax function.

softmaxinv returns the value of the inverse-softmax function.

logsoftmax returns the value of the log-softmax function.

logsoftmaxinv returns the value of the inverse-log-softmax function.

Usage

softmax(eta, lambda = 1, gradient = FALSE, hessian = FALSE)

softmaxinv(p, lambda = 1, gradient = FALSE, hessian = FALSE)

logsoftmax(eta, lambda = 1, gradient = FALSE, hessian = FALSE)

logsoftmaxinv(l, lambda = 1, gradient = FALSE, hessian = FALSE)

Arguments

eta

A numeric vector input

lambda

Tuning parameter (a single positive value)

gradient

Logical; if TRUE, the gradient is computed and included as an attribute of the output

hessian

Logical; if TRUE, the Hessian is computed and included as an attribute of the output

p

A probability vector (i.e., a numeric vector of non-negative values that sum to one)

l

A log-probability vector (i.e., a numeric vector of non-positive values whose exponentials sum to one; equivalently, whose log-sum-exp is zero)

Details

The softmax function is a bijection that maps a real vector of length m-1 to a probability vector of length m in which every element is strictly positive. The softmax function is useful in a wide range of probability and statistical applications. The present functions implement the softmax function and its inverse, as well as the log-softmax function and its inverse, each with a tuning parameter lambda.
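
For concreteness, here is one standard parameterisation consistent with the description above, in which the final probability acts as the reference category (an assumption about the convention; R/softmax.R gives the exact parameterisation used by the package). For an input eta of length m-1 and tuning parameter lambda:

p_i = exp(lambda * eta_i) / (1 + sum_{k=1}^{m-1} exp(lambda * eta_k))    for i = 1, ..., m-1

p_m = 1 / (1 + sum_{k=1}^{m-1} exp(lambda * eta_k))

with inverse eta_i = log(p_i / p_m) / lambda.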

Value

The value of the softmax function or its inverse (or of their logarithmic counterparts). If gradient or hessian is TRUE, the corresponding derivative is included as an attribute of the returned value.
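
For example, a minimal sketch of how the derivative attributes can be inspected (the attribute names "gradient" and "hessian" are assumptions matching the argument names; call attributes() on the output to confirm the names actually used):

out <- softmax(5:7, gradient = TRUE, hessian = TRUE)
attributes(out)            # inspect which attributes are attached
attr(out, "gradient")      # gradient of softmax at eta = 5:7, if so named
attr(out, "hessian")       # Hessian array, if so named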

Examples

softmax(5:7)
softmaxinv(softmax(5:7))
logsoftmax(5:7)
logsoftmaxinv(logsoftmax(5:7))
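
A short extension of the examples above, exercising the tuning parameter and the documented round-trip behaviour (the inputs are illustrative only; the final check assumes logsoftmax is the elementwise logarithm of softmax, as the name suggests):

eta <- c(-1, 0, 2)
p <- softmax(eta, lambda = 2)
sum(p)                        # 1: p is a probability vector
softmaxinv(p, lambda = 2)     # recovers eta
all.equal(logsoftmax(eta), log(softmax(eta)))   # TRUE under the naming assumption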
