nn_rrelu | R Documentation |
Applies the randomized leaky rectified linear unit function, element-wise, as described in the paper:
nn_rrelu(lower = 1/8, upper = 1/3, inplace = FALSE)
lower | lower bound of the uniform distribution. Default: 1/8
upper | upper bound of the uniform distribution. Default: 1/3
inplace | can optionally do the operation in-place. Default: FALSE
Empirical Evaluation of Rectified Activations in Convolutional Network.
The function is defined as:
\mbox{RReLU}(x) =
\left\{ \begin{array}{ll}
x & \mbox{if } x \geq 0 \\
ax & \mbox{otherwise}
\end{array}
\right.

where a is randomly sampled from the uniform distribution \mathcal{U}(\mbox{lower}, \mbox{upper}).
See: https://arxiv.org/pdf/1505.00853.pdf
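To make the formula concrete, here is a minimal NumPy sketch of the RReLU rule (an illustration only, not the torch implementation): during training each negative element is scaled by a slope a drawn from U(lower, upper); a common convention, which this sketch assumes, is to use the fixed mean slope (lower + upper) / 2 at evaluation time.

```python
import numpy as np

def rrelu(x, lower=1/8, upper=1/3, training=True, rng=None):
    """Randomized leaky ReLU (illustrative sketch).

    Training: negative inputs are scaled by a ~ Uniform(lower, upper),
    sampled independently per element. Evaluation: a fixed slope of
    (lower + upper) / 2 is assumed here.
    """
    rng = np.random.default_rng() if rng is None else rng
    if training:
        a = rng.uniform(lower, upper, size=np.shape(x))
    else:
        a = (lower + upper) / 2.0
    # x >= 0 passes through unchanged; otherwise scale by a
    return np.where(x >= 0, x, a * x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
y = rrelu(x)            # negative entries scaled by a random slope in [1/8, 1/3]
y_eval = rrelu(x, training=False)  # fixed mean slope for negatives
```

Note that, unlike leaky ReLU, the negative slope is a random variable during training, which acts as a mild regularizer.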
Input: (N, *), where * means any number of additional dimensions
Output: (N, *), same shape as the input
if (torch_is_installed()) {
  # RReLU module sampling the negative slope from U(0.1, 0.3)
  m <- nn_rrelu(0.1, 0.3)
  input <- torch_randn(2)
  m(input)
}