nn_utils_rnn_pad_packed_sequence | R Documentation |

Pads a packed batch of variable length sequences. It is the inverse operation to
nn_utils_rnn_pack_padded_sequence().
nn_utils_rnn_pad_packed_sequence(
sequence,
batch_first = FALSE,
padding_value = 0,
total_length = NULL
)
sequence | (PackedSequence): batch to pad
batch_first | (bool, optional): if TRUE, the output will be in B x T x * format.
padding_value | (float, optional): values for padded elements.
total_length | (int, optional): if not NULL, the output will be padded to have length total_length. An error is thrown if total_length is less than the max sequence length in sequence.
The returned Tensor's data will be of size T x B x *, where T is the length of the longest sequence and B is the batch size. If batch_first is TRUE, the data will be transposed into B x T x * format.
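A minimal sketch of these shapes, assuming torch is installed (the toy tensor and lengths below are invented for illustration):

```r
library(torch)

if (torch_is_installed()) {
  # Two sequences of lengths 2 and 3, padded to B x T = 2 x 3.
  x <- torch_tensor(rbind(c(1, 2, 0), c(4, 5, 6)))
  packed <- nn_utils_rnn_pack_padded_sequence(
    x, c(2, 3), batch_first = TRUE, enforce_sorted = FALSE
  )
  # Default output is T x B: here 3 x 2.
  out <- nn_utils_rnn_pad_packed_sequence(packed)
  dim(out[[1]])
  # With batch_first = TRUE the output is B x T: here 2 x 3.
  out <- nn_utils_rnn_pad_packed_sequence(packed, batch_first = TRUE)
  dim(out[[1]])
}
```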
A tuple of two tensors: one containing the padded sequence, and one containing the lengths of each sequence in the batch. Batch elements will be re-ordered as they were ordered originally when the batch was passed to nn_utils_rnn_pack_padded_sequence() or nn_utils_rnn_pack_sequence().
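In R this tuple arrives as a two-element list. A minimal sketch of accessing its components, assuming torch is installed (the tensor values are arbitrary):

```r
library(torch)

if (torch_is_installed()) {
  # Two sequences of lengths 1 and 2, in batch_first layout.
  x <- torch_tensor(rbind(c(1, 0), c(2, 3)))
  packed <- nn_utils_rnn_pack_padded_sequence(
    x, c(1, 2), batch_first = TRUE, enforce_sorted = FALSE
  )
  res <- nn_utils_rnn_pad_packed_sequence(packed, batch_first = TRUE)
  padded  <- res[[1]]  # padded tensor, rows in the original batch order
  lengths <- res[[2]]  # per-sequence lengths: 1, 2
}
```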
total_length is useful to implement the pack sequence -> recurrent network -> unpack sequence pattern in an nn_module wrapped in torch.nn.DataParallel.
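A minimal sketch of the total_length argument, assuming torch is installed (the toy tensor, lengths, and target length are invented for illustration):

```r
library(torch)

if (torch_is_installed()) {
  # Two sequences of lengths 2 and 1, in batch_first layout.
  x <- torch_tensor(rbind(c(1, 2), c(3, 0)))
  packed <- nn_utils_rnn_pack_padded_sequence(
    x, c(2, 1), batch_first = TRUE
  )
  # Pad back out to a fixed 5 time steps rather than the batch maximum
  # of 2, so every chunk of a split batch yields the same shape.
  out <- nn_utils_rnn_pad_packed_sequence(
    packed, batch_first = TRUE, total_length = 5
  )
  dim(out[[1]])  # 2 x 5
}
```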
if (torch_is_installed()) {
  # Three padded sequences of lengths 2, 1, and 3 (batch_first layout).
  seq <- torch_tensor(rbind(c(1, 2, 0), c(3, 0, 0), c(4, 5, 6)))
  lens <- c(2, 1, 3)
  packed <- nn_utils_rnn_pack_padded_sequence(seq, lens,
    batch_first = TRUE,
    enforce_sorted = FALSE
  )
  packed
  # Recover the padded tensor and the per-sequence lengths.
  nn_utils_rnn_pad_packed_sequence(packed, batch_first = TRUE)
}