exponentialLinearUnit: Exponential linear unit (ELU) function with unit derivatives.


View source: R/darchUnitFunctions.R

Description

The function calculates the activation of the units and returns a list whose first entry is the exponential linear (ELU) activation of the units and whose second entry is the derivative of the transfer function.
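
For reference, the ELU from Clevert et al. (2015) is f(x) = x for x > 0 and f(x) = alpha * (exp(x) - 1) otherwise; its derivative is 1 for x > 0 and alpha * exp(x) (equivalently f(x) + alpha) otherwise. The sketch below illustrates that computation in plain R; it is not the package's implementation, and the helper name eluSketch is made up for illustration.

eluSketch <- function(input, alpha = 1) {
  # ELU activation: identity for positive inputs, saturating exponential otherwise
  activations <- ifelse(input > 0, input, alpha * (exp(input) - 1))
  # Derivative: 1 for positive inputs, alpha * exp(input) = activation + alpha otherwise
  derivatives <- ifelse(input > 0, 1, activations + alpha)
  # Return the activations first and the derivatives second,
  # mirroring the documented return value of exponentialLinearUnit()
  list(activations, derivatives)
}

eluSketch(matrix(c(-2, -0.5, 0, 1.5), ncol = 1))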

Usage

exponentialLinearUnit(input, alpha = getParameter(".darch.elu.alpha", 1, ...),
  ...)

Arguments

input

Input for the activation function.

alpha

ELU hyperparameter alpha, which sets the saturation value for negative inputs (the activation approaches -alpha as the input decreases).

...

Additional parameters.

Value

A list with the ELU activation in the first entry and the derivative of the activation in the second entry.
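
For example, assuming the function can be applied directly to a numeric matrix of unit inputs (alpha is passed explicitly here to avoid the getParameter() default), the two entries could be inspected as follows:

activations <- exponentialLinearUnit(matrix(c(-1, 0, 2), ncol = 1), alpha = 1)
activations[[1]]  # ELU activations
activations[[2]]  # derivatives of the activations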

References

Clevert, Djork-Arné, Thomas Unterthiner, and Sepp Hochreiter (2015). "Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)". In: CoRR abs/1511.07289. URL: http://arxiv.org/abs/1511.07289

See Also

Other darch unit functions: linearUnit, maxoutUnit, rectifiedLinearUnit, sigmoidUnit, softmaxUnit, softplusUnit, tanhUnit

Examples

## Not run: 
data(iris)
model <- darch(Species ~ ., iris, darch.unitFunction = "exponentialLinearUnit",
               darch.elu.alpha = 2)

## End(Not run)
