View source: R/darchUnitFunctions.R
Description

The function calculates the activation of the units and returns a list in which the first entry is the exponential linear unit (ELU) activation and the second entry is the derivative of the transfer function.
Usage

exponentialLinearUnit(input, alpha = getParameter(".darch.elu.alpha", 1, ...),
  ...)
Arguments

input    Input for the activation function.
alpha    ELU hyperparameter.
...      Additional parameters.
Value

A list with the ELU activation in the first entry and the derivative of the activation in the second entry.
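The activation/derivative pair described above can be illustrated with a minimal sketch. This is not the package's implementation (the helper name eluSketch is hypothetical); it only shows the ELU formula f(x) = x for x > 0, alpha * (exp(x) - 1) otherwise, whose derivative for x <= 0 equals alpha * exp(x), i.e. the activation plus alpha:

```r
# Hypothetical sketch of an ELU unit function returning the same
# list shape as exponentialLinearUnit: activation first, derivative second.
eluSketch <- function(input, alpha = 1) {
  # f(x) = x when x > 0, alpha * (exp(x) - 1) otherwise
  activation <- ifelse(input > 0, input, alpha * (exp(input) - 1))
  # f'(x) = 1 when x > 0; for x <= 0, alpha * exp(x) = activation + alpha
  derivative <- ifelse(input > 0, 1, activation + alpha)
  list(activation, derivative)
}

out <- eluSketch(matrix(c(-2, -1, 0, 1, 2), nrow = 1))
out[[1]]  # activations
out[[2]]  # derivatives
```

Returning the derivative alongside the activation spares the backpropagation step a second pass over the input.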
References

Clevert, Djork-Arné, Thomas Unterthiner, and Sepp Hochreiter (2015). "Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)". CoRR abs/1511.07289. URL: http://arxiv.org/abs/1511.07289
See Also

Other darch unit functions: linearUnit, maxoutUnit, rectifiedLinearUnit, sigmoidUnit, softmaxUnit, softplusUnit, tanhUnit