layer_attention_augmented_convolution_block_2d: Creates a 2-D attention augmented convolutional block

View source: R/attentionUtilities.R


Description

Creates a 2-D attention augmented convolutional layer as described in the paper by Bello et al., "Attention Augmented Convolutional Networks" (see Details for the reference).

Usage

layer_attention_augmented_convolution_block_2d(
  inputLayer,
  numberOfOutputFilters,
  kernelSize = c(3, 3),
  strides = c(1, 1),
  depthOfQueries = 0.2,
  depthOfValues = 0.2,
  numberOfAttentionHeads = 8,
  useRelativeEncodings = TRUE
)

Arguments

inputLayer

input keras layer.

numberOfOutputFilters

number of output filters.

kernelSize

convolution kernel size.

strides

convolution strides.

depthOfQueries

Specifies the number of filters for the queries/keys, k. Interpreted as an absolute count if >= 1.0; otherwise as a fraction of the output filters, i.e., number of k filters = depthOfQueries * numberOfOutputFilters.

depthOfValues

Specifies the number of filters for the values, v. Interpreted as an absolute count if >= 1.0; otherwise as a fraction of the output filters, i.e., number of v filters = depthOfValues * numberOfOutputFilters.

numberOfAttentionHeads

number of attention heads (default = 8). Note that the resolved key depth must satisfy as.integer(kDepth / numberOfAttentionHeads) > 0, i.e., each head must receive at least one filter.

useRelativeEncodings

boolean for whether to use relative encodings (default = TRUE).
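
For example, with the defaults depthOfQueries = 0.2 and numberOfOutputFilters = 64, the fractional specification resolves to an absolute filter count. The following sketch illustrates the presumed resolution logic (truncation via as.integer is an assumption about the internal implementation):

    numberOfOutputFilters <- 64
    depthOfQueries <- 0.2

    # If depthOfQueries < 1.0, treat it as a fraction of the output filters;
    # otherwise treat it as an absolute filter count.
    kDepth <- if( depthOfQueries < 1.0 )
      {
      as.integer( depthOfQueries * numberOfOutputFilters )   # 0.2 * 64 -> 12
      } else {
      as.integer( depthOfQueries )
      }

    # With 8 heads, each head receives as.integer( 12 / 8 ) = 1 filter,
    # which satisfies the per-head constraint noted above.
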

Details

https://arxiv.org/abs/1904.09925

The implementation was ported from the following repository:

https://github.com/titu1994/keras-attention-augmented-convs

Value

a Keras tensor

Author(s)

Tustison NJ
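
Examples

A minimal usage sketch (not run): inserting the block into a small Keras functional model. This assumes the keras and ANTsRNet packages are installed with a working TensorFlow backend; the input shape and filter counts are illustrative only.

    ## Not run:
    library( keras )
    library( ANTsRNet )

    # Define a 2-D image input and pass it through the
    # attention augmented convolutional block.
    inputs <- layer_input( shape = c( 32, 32, 3 ) )
    outputs <- layer_attention_augmented_convolution_block_2d(
      inputs,
      numberOfOutputFilters = 64,
      kernelSize = c( 3, 3 ),
      strides = c( 1, 1 ),
      depthOfQueries = 0.2,
      depthOfValues = 0.2,
      numberOfAttentionHeads = 8,
      useRelativeEncodings = TRUE )

    model <- keras_model( inputs = inputs, outputs = outputs )
    ## End(Not run)
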


ANTsX/ANTsRNet documentation built on Nov. 21, 2024, 4:07 a.m.