View source: R/layers-recurrent-cells.R
layer_lstm_cell R Documentation
Cell class for the LSTM layer
layer_lstm_cell(
units,
activation = "tanh",
recurrent_activation = "sigmoid",
use_bias = TRUE,
kernel_initializer = "glorot_uniform",
recurrent_initializer = "orthogonal",
bias_initializer = "zeros",
unit_forget_bias = TRUE,
kernel_regularizer = NULL,
recurrent_regularizer = NULL,
bias_regularizer = NULL,
kernel_constraint = NULL,
recurrent_constraint = NULL,
bias_constraint = NULL,
dropout = 0,
recurrent_dropout = 0,
...
)
units
Positive integer, dimensionality of the output space.
activation
Activation function to use. Default: hyperbolic tangent (tanh). If NULL is passed, no activation is applied (i.e. "linear" activation: a(x) = x).
recurrent_activation
Activation function to use for the recurrent step. Default: sigmoid. If NULL is passed, no activation is applied (i.e. "linear" activation: a(x) = x).
use_bias
Boolean (default TRUE), whether the layer uses a bias vector.
kernel_initializer
Initializer for the kernel weights matrix, used for the linear transformation of the inputs. Default: "glorot_uniform".
recurrent_initializer
Initializer for the recurrent_kernel weights matrix, used for the linear transformation of the recurrent state. Default: "orthogonal".
bias_initializer
Initializer for the bias vector. Default: "zeros".
unit_forget_bias
Boolean (default TRUE). If TRUE, add 1 to the bias of the forget gate at initialization. Setting it to TRUE will also force bias_initializer = "zeros".
kernel_regularizer
Regularizer function applied to the kernel weights matrix. Default: NULL.
recurrent_regularizer
Regularizer function applied to the recurrent_kernel weights matrix. Default: NULL.
bias_regularizer
Regularizer function applied to the bias vector. Default: NULL.
kernel_constraint
Constraint function applied to the kernel weights matrix. Default: NULL.
recurrent_constraint
Constraint function applied to the recurrent_kernel weights matrix. Default: NULL.
bias_constraint
Constraint function applied to the bias vector. Default: NULL.
dropout
Float between 0 and 1. Fraction of the units to drop for the linear transformation of the inputs. Default: 0.
recurrent_dropout
Float between 0 and 1. Fraction of the units to drop for the linear transformation of the recurrent state. Default: 0.
...
Standard layer arguments.
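As an illustration of how several of these arguments combine in practice, here is a minimal sketch (the dropout rates and the L2 penalty are arbitrary values chosen for the example, not recommendations):

library(keras)

# Configure a cell with input/recurrent dropout and an L2 penalty on the
# input kernel, then let layer_rnn() drive it across the time dimension.
cell <- layer_lstm_cell(
  units = 16,
  dropout = 0.2,
  recurrent_dropout = 0.2,
  kernel_regularizer = regularizer_l2(0.01)
)
rnn <- layer_rnn(cell = cell)
output <- rnn(k_random_normal(c(32, 10, 8)))
dim(output) # (32, 16)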
See the Keras RNN API guide for details about the usage of the RNN API.
This class processes one step within the whole time-sequence input, whereas
tf$keras$layers$LSTM
processes the whole sequence.
For example:
inputs <- k_random_normal(c(32, 10, 8))
rnn <- layer_rnn(cell = layer_lstm_cell(units = 4))
output <- rnn(inputs)
dim(output) # (32, 4)

rnn <- layer_rnn(
  cell = layer_lstm_cell(units = 4),
  return_sequences = TRUE,
  return_state = TRUE
)
c(whole_seq_output, final_memory_state, final_carry_state) %<-% rnn(inputs)
dim(whole_seq_output)   # (32, 10, 4)
dim(final_memory_state) # (32, 4)
dim(final_carry_state)  # (32, 4)
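To see the one-step behaviour directly, the cell can also be called on a single timestep together with the current state pair. This is a hedged sketch: it assumes the cell instance accepts a states list when called, mirroring the call signature of the underlying tf.keras LSTMCell.

# Manually step the cell for one timestep (sketch; see note above).
cell <- layer_lstm_cell(units = 4)
x_t <- k_random_normal(c(32, 8))   # input for a single timestep
state <- list(k_zeros(c(32, 4)),   # initial hidden state h
              k_zeros(c(32, 4)))   # initial carry state c
step <- cell(x_t, states = state)
output_t <- step[[1]]              # h_t, shape (32, 4)
new_state <- step[[2]]             # list(h_t, c_t), fed to the next step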
Other RNN cell layers: layer_gru_cell(), layer_simple_rnn_cell(), layer_stacked_rnn_cells()
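Cells from this family can also be composed: layer_stacked_rnn_cells() behaves like a single cell whose sub-cells run one after another at each timestep. A brief sketch (the unit sizes are arbitrary):

# Stack two LSTM cells into one composite cell and unroll it over time.
cells <- layer_stacked_rnn_cells(list(
  layer_lstm_cell(units = 8),
  layer_lstm_cell(units = 4)
))
rnn <- layer_rnn(cell = cells)
output <- rnn(k_random_normal(c(32, 10, 8)))
dim(output) # (32, 4)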