Softplus Transform


Description

The softplus (and inverse softplus) transform is useful for imposing a positivity constraint on parameters of a function that will be optimized (e.g. MLE of the scale parameter of a density function). The softplus was introduced as a replacement for the exponential transform, which may blow up for large arguments. The softplus is given by log(1 + exp(x)) and converges to x for large values of x. Some care has been taken in the implementation of the softplus function to handle numerical issues.
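The numerical care mentioned above typically amounts to avoiding overflow of exp(x) for large x. A minimal sketch of a numerically stable softplus and its inverse (the package's actual implementation may differ; the threshold of 30 and the function names are assumptions for illustration):

```r
# Hypothetical stable softplus: for large x, log(1 + exp(x)) = x + log1p(exp(-x)),
# which avoids computing exp(x) when it would overflow.
softplus_stable <- function(x) {
  ifelse(x > 30, x + log1p(exp(-x)), log1p(exp(x)))
}

# Hypothetical stable inverse: log(exp(y) - 1) = y + log(-expm1(-y)),
# which avoids overflow of exp(y) for large y.
softplusinv_stable <- function(y) {
  y + log(-expm1(-y))
}
```

For moderate arguments the naive formulas agree with these to machine precision; the branches only matter in the tails.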

Usage

softplus(x)
softplusinv(y)

Arguments

x

is the value of the unconstrained parameter to be optimized

y

is the value of the positively constrained parameter

Details

Let sigma be the scale parameter of a density for which maximum likelihood estimation will be performed. Then we can optimize over softplusinv(sigma) to ensure positivity of this parameter. Let sigma.unc be the optimized unconstrained parameter; then softplus(sigma.unc) is the value of the MLE.
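The recipe above can be sketched as follows for the scale of a normal density. This is an illustrative example, not the package's own code; the softplus and its inverse are written out inline so the block is self-contained:

```r
# Softplus and its inverse, written out inline for this sketch.
softplus    <- function(x) log1p(exp(x))
softplusinv <- function(y) log(expm1(y))

# Simulated data with true scale sigma = 2.
set.seed(1)
z <- rnorm(200, mean = 0, sd = 2)

# Negative log-likelihood as a function of the UNCONSTRAINED parameter:
# sigma = softplus(sigma.unc) is positive for any real sigma.unc.
negloglik <- function(sigma.unc) {
  sigma <- softplus(sigma.unc)
  -sum(dnorm(z, mean = 0, sd = sigma, log = TRUE))
}

# Optimize freely over the real line, then back-transform to get the MLE.
fit <- optim(softplusinv(1), negloglik, method = "BFGS")
sigma.hat <- softplus(fit$par)
```

Because the optimizer never sees sigma directly, no box constraints are needed and the estimate softplus(fit$par) is guaranteed positive.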

Value

The value of the softplus (or inverse softplus) transform.

Author(s)

Julie Carreau

References

Dugas, C., Bengio, Y., Belisle, F., Nadeau, C. and Garcia, R. (2001). A universal approximator of convex functions applied to option pricing. Advances in Neural Information Processing Systems, 13.
