k_elu: Exponential linear unit.


View source: R/backend.R

Description

Exponential linear unit: f(x) = x for x > 0, and f(x) = alpha * (exp(x) - 1) for x <= 0.

Usage

k_elu(x, alpha = 1)

Arguments

x

A tensor or variable to compute the activation function for.

alpha

A scalar giving the slope of the negative section.

Value

A tensor.
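
A brief usage sketch in R, assuming the keras package is loaded and a backend engine (e.g. TensorFlow) is installed; `k_constant` is used here only to build an example tensor:

```r
library(keras)

# Build a small example tensor.
x <- k_constant(c(-2, -0.5, 0, 1.5))

# ELU with the default alpha = 1: negative inputs are mapped to
# alpha * (exp(x) - 1), while non-negative inputs pass through unchanged.
k_elu(x, alpha = 1)
```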

Keras Backend

This function is part of a set of Keras backend functions that enable lower-level access to the core operations of the backend tensor engine (e.g., TensorFlow, CNTK, Theano).

You can see a list of all available backend functions here: https://keras.rstudio.com/articles/backend.html#backend-functions.


dfalbel/keras documentation built on Nov. 27, 2019, 8:16 p.m.