View source: R/layers-activations.R
Description

Rectified Linear Unit activation function.
Usage

layer_activation_relu(
  object,
  max_value = NULL,
  negative_slope = 0,
  threshold = 0,
  input_shape = NULL,
  batch_input_shape = NULL,
  batch_size = NULL,
  dtype = NULL,
  name = NULL,
  trainable = NULL,
  weights = NULL
)
Arguments

object: Model or layer object.

max_value: float, the maximum output value.

negative_slope: float >= 0. Negative slope coefficient.

threshold: float. Threshold value for thresholded activation.

input_shape: Input shape (list of integers, does not include the samples axis), required when using this layer as the first layer in a model.

batch_input_shape: Shapes, including the batch size. For instance, batch_input_shape = list(10, 32) indicates that the expected input will be batches of 10 32-dimensional vectors.

batch_size: Fixed batch size for the layer.

dtype: The data type expected by the input, as a string (float32, float64, int32, ...).

name: An optional name string for the layer. Should be unique in a model (do not reuse the same name twice). It will be autogenerated if not provided.

trainable: Whether the layer weights will be updated during training.

weights: Initial weights for the layer.
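The three tuning parameters define a piecewise function: outputs are clipped at max_value, pass through unchanged between threshold and max_value, and are scaled by negative_slope below threshold. A minimal base-R sketch of this element-wise behavior (the helper relu_activation below is hypothetical, for illustration only; the keras layer applies the same rule to tensors during the forward pass):

```r
# Sketch of the element-wise computation configured by this layer.
# relu_activation() is a hypothetical helper, NOT part of the keras API.
relu_activation <- function(x, max_value = Inf, negative_slope = 0, threshold = 0) {
  ifelse(x >= max_value, max_value,                 # clip at the ceiling
  ifelse(x >= threshold, x,                         # identity in the middle band
         negative_slope * (x - threshold)))         # leaky slope below threshold
}

# With the defaults this is the standard max(x, 0):
relu_activation(c(-2, 0, 3))        # -> 0 0 3

# A "leaky, clipped" configuration:
relu_activation(c(-2, 0.5, 2, 10),
                max_value = 3, negative_slope = 0.2, threshold = 1)
# -> -0.6 -0.1 2.0 3.0
```

For example, setting only max_value = 6 yields the commonly used "ReLU6" activation.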
See Also

Other activation layers: layer_activation_elu(), layer_activation_leaky_relu(), layer_activation_parametric_relu(), layer_activation_selu(), layer_activation_softmax(), layer_activation_thresholded_relu(), layer_activation()