View source: R/layers-recurrent-cells.R
layer_gru_cell | R Documentation |
Cell class for the GRU layer
layer_gru_cell(
units,
activation = "tanh",
recurrent_activation = "sigmoid",
use_bias = TRUE,
kernel_initializer = "glorot_uniform",
recurrent_initializer = "orthogonal",
bias_initializer = "zeros",
kernel_regularizer = NULL,
recurrent_regularizer = NULL,
bias_regularizer = NULL,
kernel_constraint = NULL,
recurrent_constraint = NULL,
bias_constraint = NULL,
dropout = 0,
recurrent_dropout = 0,
reset_after = TRUE,
...
)
units |
Positive integer, dimensionality of the output space. |
activation |
Activation function to use. Default: hyperbolic tangent (tanh). If NULL is passed, no activation is applied (i.e. "linear" activation: a(x) = x). |
recurrent_activation |
Activation function to use for the recurrent step. Default: sigmoid. If NULL is passed, no activation is applied. |
use_bias |
Boolean (default TRUE), whether the layer uses a bias vector. |
kernel_initializer |
Initializer for the kernel weights matrix, used for the linear transformation of the inputs. Default: glorot_uniform. |
recurrent_initializer |
Initializer for the recurrent_kernel weights matrix, used for the linear transformation of the recurrent state. Default: orthogonal. |
bias_initializer |
Initializer for the bias vector. Default: zeros. |
kernel_regularizer |
Regularizer function applied to the kernel weights matrix. Default: NULL. |
recurrent_regularizer |
Regularizer function applied to the recurrent_kernel weights matrix. Default: NULL. |
bias_regularizer |
Regularizer function applied to the bias vector. Default: NULL. |
kernel_constraint |
Constraint function applied to the kernel weights matrix. Default: NULL. |
recurrent_constraint |
Constraint function applied to the recurrent_kernel weights matrix. Default: NULL. |
bias_constraint |
Constraint function applied to the bias vector. Default: NULL. |
dropout |
Float between 0 and 1. Fraction of the units to drop for the linear transformation of the inputs. Default: 0. |
recurrent_dropout |
Float between 0 and 1. Fraction of the units to drop for the linear transformation of the recurrent state. Default: 0. |
reset_after |
GRU convention (whether to apply reset gate after or before matrix multiplication). FALSE = "before", TRUE = "after" (default and CuDNN compatible). |
... |
standard layer arguments. |
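To make the reset_after convention concrete, here is a minimal numpy sketch of a single GRU step. It is not the library implementation; the weight layout (kernels split into z | r | candidate blocks, names gru_cell_step, b_input, b_recurrent) is an illustrative assumption, but the gating equations and the before/after placement of the reset gate follow the standard GRU formulations.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell_step(x, h, W, U, b_input, b_recurrent, reset_after=True):
    """One GRU step on input x (batch, input_dim) and state h (batch, units).

    W: (input_dim, 3*units) input kernel, columns split as [z | r | candidate]
    U: (units, 3*units)     recurrent kernel, same column layout
    b_input, b_recurrent: (3*units,) bias vectors; the classic
    (reset_after=FALSE) formulation uses only a single bias.
    """
    units = h.shape[1]
    xz, xr, xh = np.split(x @ W + b_input, 3, axis=1)
    if reset_after:
        # Reset gate applied AFTER the recurrent matmul (CuDNN-compatible):
        # the recurrent contribution is computed once, then scaled by r.
        hz, hr, hh = np.split(h @ U + b_recurrent, 3, axis=1)
        z = sigmoid(xz + hz)
        r = sigmoid(xr + hr)
        h_candidate = np.tanh(xh + r * hh)
    else:
        hz, hr, _ = np.split(h @ U, 3, axis=1)
        z = sigmoid(xz + hz)
        r = sigmoid(xr + hr)
        # Reset gate applied BEFORE the recurrent matmul: the state is
        # masked by r first, then multiplied by the candidate kernel block.
        h_candidate = np.tanh(xh + (r * h) @ U[:, 2 * units:])
    # New state: interpolate between previous state and candidate.
    return z * h + (1.0 - z) * h_candidate
```

Both conventions learn equally well in practice; reset_after = TRUE exists so that weights are laid out compatibly with the fused CuDNN kernel, which is why it is the default.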
See the Keras RNN API guide for details about the usage of the RNN API.
This class processes one step within the whole time sequence input, whereas
tf.keras.layers.GRU
processes the whole sequence.
For example:
inputs <- k_random_uniform(c(32, 10, 8))
output <- inputs %>% layer_rnn(layer_gru_cell(4))
output$shape  # TensorShape([32, 4])

rnn <- layer_rnn(
  cell = layer_gru_cell(4),
  return_sequences = TRUE,
  return_state = TRUE
)
c(whole_sequence_output, final_state) %<-% rnn(inputs)
whole_sequence_output$shape  # TensorShape([32, 10, 4])
final_state$shape  # TensorShape([32, 4])
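The cell-versus-layer distinction above can be sketched in a few lines of numpy: an RNN layer is essentially a loop that applies the cell's one-step function at each timestep, collecting the per-step states (return_sequences) and the last state (return_state). The helper names (run_cell_over_sequence, toy_step) are illustrative assumptions, not keras API.

```python
import numpy as np

def run_cell_over_sequence(cell_step, inputs, h0):
    """Mimic what an RNN layer does with a cell: apply the one-step
    function at each timestep and collect the outputs.

    inputs: (batch, timesteps, features); h0: (batch, units).
    Returns (whole_sequence_output, final_state).
    """
    outputs = []
    h = h0
    for t in range(inputs.shape[1]):
        h = cell_step(inputs[:, t, :], h)
        outputs.append(h)
    return np.stack(outputs, axis=1), h

# Toy stand-in for a real GRU step, just to exercise the loop:
# it maps (batch, 8) inputs and a (batch, 1) state to a new (batch, 1) state.
def toy_step(x, h):
    return np.tanh(x.mean(axis=1, keepdims=True) + h)

seq = np.random.uniform(size=(32, 10, 8))
h0 = np.zeros((32, 1))
whole, final = run_cell_over_sequence(toy_step, seq, h0)
# whole has shape (32, 10, 1); final has shape (32, 1)
```

This mirrors the R example: wrapping layer_gru_cell(4) in layer_rnn() plays the role of run_cell_over_sequence, with return_sequences and return_state selecting which of the two results to expose.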
Other RNN cell layers:
layer_lstm_cell(),
layer_simple_rnn_cell(),
layer_stacked_rnn_cells()