View source: R/darchUnitFunctions.R
Description

The function calculates the activation of the units and returns a list in which the first entry is the softplus activation of the units and the second entry is the derivative of the transfer function. Softplus is a smooth approximation of the rectified linear (ReLU) activation function.
Usage

softplusUnit(input, ...)
Arguments

input
    Input for the activation function.
...
    Additional parameters, not used.
Value

A list with the softplus activation in the first entry and the derivative of the activation in the second entry.
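The returned structure can be illustrated with a minimal sketch. Note that `softplusUnitSketch` is a hypothetical stand-in written for this example, not the package's actual implementation:

```r
# Illustrative sketch of a softplus unit function in the darch style.
# Returns a list: [[1]] the softplus activation, [[2]] its derivative.
softplusUnitSketch <- function(input, ...) {
  activation <- log(1 + exp(input))    # softplus: f(x) = log(1 + e^x)
  derivative <- 1 / (1 + exp(-input))  # f'(x) is the logistic sigmoid
  list(activation, derivative)
}
```

For example, `softplusUnitSketch(0)` yields an activation of `log(2)` and a derivative of `0.5`.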
References

Dugas, Charles, Yoshua Bengio, François Bélisle, Claude Nadeau, and René Garcia (2001). "Incorporating Second-Order Functional Knowledge for Better Option Pricing". In: Advances in Neural Information Processing Systems, pp. 472-478.
See Also

Other darch unit functions: exponentialLinearUnit, linearUnit, maxoutUnit, rectifiedLinearUnit, sigmoidUnit, softmaxUnit, tanhUnit