k_elu: Exponential linear unit.

View source: R/backend.R


Description

Exponential linear unit.

Usage

k_elu(x, alpha = 1)

Arguments

x

A tensor or variable to compute the activation function for.

alpha

A scalar, the slope of the negative section.

Value

A tensor.
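
Examples

A minimal sketch of calling k_elu through the backend. It assumes a working TensorFlow installation and uses the backend helpers k_constant and k_eval to build a small tensor and evaluate the result:

library(keras)

x <- k_constant(c(-2, -0.5, 0, 1.5))  # small example tensor
y <- k_elu(x, alpha = 1)              # alpha * (exp(x) - 1) for x < 0, x otherwise
k_eval(y)                             # evaluate the tensor to an R array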

Keras Backend

This function is part of a set of Keras backend functions that provide lower-level access to the core operations of the backend tensor engine (e.g., TensorFlow, CNTK, or Theano).

You can see a list of all available backend functions here: https://tensorflow.rstudio.com/reference/keras/index.html#backend.
