position_embedding: Create Position Embeddings

Description

Position embeddings are how BERT-like language models represent the order of input tokens. Each token gets a position embedding vector that is completely determined by its position index. Because these embeddings do not depend on the actual input, the layer is implemented by simply initializing a matrix of weights, with one row per position.
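As a rough illustration of that idea (not the torchtransformers implementation), such a layer can be sketched with the torch R package as a module holding a single learnable weight matrix; the module name and seq_len argument below are hypothetical:

library(torch)

# Sketch: a learnable weight matrix indexed by position.
simple_position_embedding <- nn_module(
  "simple_position_embedding",
  initialize = function(embedding_size, max_position_embeddings) {
    # One learnable row per position index.
    self$weight <- nn_parameter(
      torch_randn(max_position_embeddings, embedding_size)
    )
  },
  forward = function(seq_len = NULL) {
    # With no argument, return embeddings for every position;
    # otherwise return only the first seq_len positions.
    if (is.null(seq_len)) self$weight else self$weight[1:seq_len, ]
  }
)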

Usage

position_embedding(embedding_size, max_position_embeddings)

Arguments

embedding_size

Integer; the dimension of the embedding vectors.

max_position_embeddings

Integer; maximum number of tokens in each input sequence.

Shape

Inputs:

No input tensors. The optional seq_len_cap parameter (see Examples) limits the number of positions (tokens) considered.

Output:

  • (*, max_position_embeddings, embedding_size)

Examples

library(torchtransformers)

emb_size <- 3L
mpe <- 2L
model <- position_embedding(
  embedding_size = emb_size,
  max_position_embeddings = mpe
)
# Embeddings for the first position only.
model(seq_len_cap = 1)
# Embeddings for all mpe positions.
model()
