Description

Computes the rrelu (randomized leaky rectified linear unit) activation function.
Usage

activation_rrelu(
  x,
  lower = 0.125,
  upper = 0.333333333333333,
  training = NULL,
  seed = NULL
)
Arguments

x         A 'Tensor'. Must be one of the following types: 'float16', 'float32', 'float64'.

lower     'float', lower bound for random alpha.

upper     'float', upper bound for random alpha.

training  'bool', indicating whether the 'call' is meant for training or inference.

seed      'int', this sets the operation-level seed.
Details

Computes the rrelu function: 'x if x > 0 else random(lower, upper) * x' when training is enabled, or 'x if x > 0 else x * (lower + upper) / 2' at inference time. See [Empirical Evaluation of Rectified Activations in Convolutional Network](https://arxiv.org/abs/1505.00853).
Value

A 'Tensor'. Has the same type as 'x'.
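Examples

A minimal sketch of a call to this function, assuming it is exported by the 'tfaddons' package and that the 'tensorflow' R package is installed; neither package is named above, so treat both as assumptions.

library(tensorflow)
library(tfaddons)  # assumed source of activation_rrelu()

x <- tf$constant(c(-1, 0, 1), dtype = tf$float32)

# Inference: negative inputs are scaled by the fixed slope (lower + upper) / 2.
activation_rrelu(x, training = FALSE)

# Training: negative inputs are scaled by a random alpha drawn from
# [lower, upper]; the operation-level seed makes the draw reproducible.
activation_rrelu(x, training = TRUE, seed = 42L)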