View source: R/custom_layers.R
layer_pos_embedding_wrapper    R Documentation

Description

Positional encoding layer with learned embedding.

Usage
layer_pos_embedding_wrapper(
  maxlen = 100,
  vocabulary_size = 4,
  load_r6 = FALSE,
  embed_dim = 64
)
Arguments

maxlen	Length of the predictor sequence.
vocabulary_size	Number of unique characters in the vocabulary.
load_r6	Whether to load the R6 layer class.
embed_dim	Dimension of the token embedding. No embedding if set to 0. Use a nonzero value when the input is an integer sequence rather than one-hot encoded.
Value

A keras layer implementing positional embedding.
Examples

library(keras)
l <- layer_pos_embedding_wrapper()
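A slightly fuller sketch of how the layer might be composed into a model, assuming the returned object can be called on a tensor like any other keras layer (input shapes below are illustrative):

library(keras)

# Integer-encoded input: embed_dim > 0 adds a token embedding before
# the positional embedding.
pos_emb <- layer_pos_embedding_wrapper(maxlen = 100, vocabulary_size = 4,
                                       embed_dim = 64)
input_int <- layer_input(shape = c(100))
output <- pos_emb(input_int)  # assumed callable like a standard keras layer
model <- keras_model(input_int, output)

# One-hot encoded input: set embed_dim = 0 to skip the token embedding.
pos_onehot <- layer_pos_embedding_wrapper(maxlen = 100, vocabulary_size = 4,
                                          embed_dim = 0)
input_oh <- layer_input(shape = c(100, 4))
output_oh <- pos_onehot(input_oh)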