layer_pos_embedding_wrapper: Layer for positional embedding

View source: R/custom_layers.R


Layer for positional embedding

Description

Positional encoding layer with learned embedding.

Usage

layer_pos_embedding_wrapper(
  maxlen = 100,
  vocabulary_size = 4,
  load_r6 = FALSE,
  embed_dim = 64
)

Arguments

maxlen

Length of the predictor sequence.

vocabulary_size

Number of unique characters in the vocabulary.

load_r6

Whether to load the R6 layer class.

embed_dim

Dimension of the token embedding. If set to 0, no token embedding is applied. Use a non-zero value when the input is an integer sequence rather than one-hot encoded.

Value

A keras layer implementing positional embedding.

Examples



library(keras)
l <- layer_pos_embedding_wrapper()
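
# A minimal sketch of applying the layer to a symbolic input; the calling
# convention and shapes below are assumptions based on the argument
# descriptions above, not taken from the original documentation.

# Integer-encoded input: token embedding plus positional embedding.
l_int <- layer_pos_embedding_wrapper(maxlen = 100, vocabulary_size = 4,
                                     embed_dim = 64)
inp_int <- layer_input(shape = c(100))  # integer sequence of length maxlen
out_int <- l_int(inp_int)               # assumed output shape: (batch, 100, 64)

# With embed_dim = 0, the input is assumed to be one-hot encoded
# (batch x maxlen x vocabulary_size) and only positional information is added.
l_oh <- layer_pos_embedding_wrapper(maxlen = 100, vocabulary_size = 4,
                                    embed_dim = 0)
inp_oh <- layer_input(shape = c(100, 4))
out_oh <- l_oh(inp_oh)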

