layer_attention_augmentation_2d: Attention augmentation layer (2-D)

View source: R/attentionUtilities.R

layer_attention_augmentation_2d    R Documentation

Attention augmentation layer (2-D)

Description

Wraps the AttentionAugmentation2D layer.

Usage

layer_attention_augmentation_2d(
  object,
  depthOfQueries,
  depthOfValues,
  numberOfHeads,
  isRelative,
  trainable = TRUE
)

Arguments

object

Object to compose the layer with. This is either a keras::keras_model_sequential to add the layer to, or another Layer which this layer will call.

depthOfQueries

Number of filters for the queries.

depthOfValues

Number of filters for the values.

numberOfHeads

Number of attention heads to use. It is required that depthOfQueries / numberOfHeads > 0.

isRelative

Whether or not to use relative positional encodings.

trainable

Whether the layer weights will be updated during training.

Value

A Keras layer tensor.
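
Examples

A minimal sketch of composing the layer, following the usual attention-augmented convolution pattern in which a 1x1 convolution first produces the stacked query/key/value channels (2 * depthOfQueries + depthOfValues). The input shape, filter counts, and surrounding convolutions below are illustrative assumptions rather than part of this function's interface; depthOfQueries is chosen to be evenly divisible by numberOfHeads, as is typical for multi-head attention.

library( keras )
library( ANTsRNet )

inputs <- layer_input( shape = c( 32, 32, 3 ) )

# Standard convolutional branch.
convolutionOutput <- inputs %>%
  layer_conv_2d( filters = 24, kernel_size = c( 3, 3 ), padding = "same" )

# Attention branch:  a 1x1 convolution produces the stacked
# query/key/value channels, which the attention augmentation layer
# then splits across the heads.
attentionOutput <- inputs %>%
  layer_conv_2d( filters = 2 * 8 + 8, kernel_size = c( 1, 1 ) ) %>%
  layer_attention_augmentation_2d(
    depthOfQueries = 8,
    depthOfValues = 8,
    numberOfHeads = 4,
    isRelative = TRUE )

# Concatenate the convolutional and attention features.
outputs <- layer_concatenate( list( convolutionOutput, attentionOutput ) )

model <- keras_model( inputs = inputs, outputs = outputs )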

