position_embedding
Position embeddings are how BERT-like language models represent the order of input tokens. Each token receives a position embedding vector that is completely determined by its position index. Because these embeddings do not depend on the actual input, the layer is implemented by simply initializing a matrix of weights, one row per position (see the sketch under Details below).
Usage:

position_embedding(embedding_size, max_position_embeddings)
Arguments:

embedding_size: Integer; the dimension of the embedding vectors.

max_position_embeddings: Integer; the maximum number of tokens in each input sequence.
Inputs:
No input tensors. The forward pass accepts an optional seq_len_cap argument to limit the number of positions (tokens) returned.
Output:
(*, max_position_embeddings, embedding_size)
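Details:

As described above, the layer holds no input-dependent state; it is essentially a learnable lookup table indexed by position. The following is a minimal sketch of such a layer written with the torch package for R. The module name simple_position_embedding and its internals are illustrative assumptions for exposition, not necessarily this package's actual implementation.

library(torch)

simple_position_embedding <- nn_module(
  "simple_position_embedding",
  initialize = function(embedding_size, max_position_embeddings) {
    # One learnable embedding vector per position index.
    self$weight <- nn_parameter(
      torch_randn(max_position_embeddings, embedding_size)
    )
  },
  forward = function(seq_len_cap = NULL) {
    # No input tensors: the output depends only on the position indices.
    if (is.null(seq_len_cap)) {
      self$weight
    } else {
      # Return only the first seq_len_cap position embeddings.
      self$weight[1:seq_len_cap, ]
    }
  }
)

# Hypothetical usage of the sketch above:
pos_emb <- simple_position_embedding(
  embedding_size = 3L,
  max_position_embeddings = 2L
)
pos_emb(seq_len_cap = 1)  # one position: a (1, 3) tensor
pos_emb()                 # all positions: a (2, 3) tensor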
Examples:

emb_size <- 3L
mpe <- 2L
model <- position_embedding(
  embedding_size = emb_size,
  max_position_embeddings = mpe
)
# Embedding for the first position only.
model(seq_len_cap = 1)
# Embeddings for all max_position_embeddings positions.
model()