activation_rrelu: Rrelu


View source: R/activations.R

Description

Randomized leaky rectified linear unit (rrelu) activation function.

Usage

activation_rrelu(
  x,
  lower = 0.125,
  upper = 0.333333333333333,
  training = NULL,
  seed = NULL
)

Arguments

x

A 'Tensor'. Must be one of the following types: 'float16', 'float32', 'float64'.

lower

'float', lower bound for random alpha.

upper

'float', upper bound for random alpha.

training

'bool', indicating whether the 'call' is meant for training or inference.

seed

'int', this sets the operation-level seed.
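
The operation-level seed should make the randomly sampled alpha reproducible across calls. A minimal sketch, assuming the tensorflow and tfaddons packages are installed and a TensorFlow backend is available (exact reproducibility behavior may vary across TensorFlow versions):

library(tensorflow)
library(tfaddons)

x <- tf$constant(c(-2, -1, 0, 1, 2), dtype = tf$float32)

# In training mode the negative slope is random; fixing the
# operation-level seed should make repeated calls agree.
y1 <- activation_rrelu(x, training = TRUE, seed = 42L)
y2 <- activation_rrelu(x, training = TRUE, seed = 42L)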

Details

Computes the rrelu (randomized leaky ReLU) function: 'x if x > 0 else random(lower, upper) * x' when training is enabled, or 'x if x > 0 else x * (lower + upper) / 2' at inference time. See [Empirical Evaluation of Rectified Activations in Convolutional Network](https://arxiv.org/abs/1505.00853).
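
A minimal sketch of both modes, assuming the tensorflow and tfaddons packages are installed; the inference-time scale below follows directly from the default bounds:

library(tensorflow)
library(tfaddons)

x <- tf$constant(c(-1, 0, 1), dtype = tf$float32)

# Training: negative inputs are scaled by a random alpha drawn
# from [lower, upper] = [0.125, 1/3].
activation_rrelu(x, training = TRUE)

# Inference: negative inputs are scaled by the fixed mean
# (lower + upper) / 2 = (0.125 + 1/3) / 2, about 0.2292,
# so x = -1 maps to roughly -0.2292.
activation_rrelu(x, training = FALSE)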

Value

A 'Tensor'. Has the same type as 'x'.


