View source: R/attentionUtilities.R
layer_attention_augmented_convolution_block_2d    R Documentation

Description

Creates a 2-D attention-augmented convolutional layer as described in the paper referenced below.

Usage
layer_attention_augmented_convolution_block_2d(
inputLayer,
numberOfOutputFilters,
kernelSize = c(3, 3),
strides = c(1, 1),
depthOfQueries = 0.2,
depthOfValues = 0.2,
numberOfAttentionHeads = 8,
useRelativeEncodings = TRUE
)
Arguments

inputLayer              input keras layer.

numberOfOutputFilters   number of output filters.

kernelSize              convolution kernel size (default = c(3, 3)).

strides                 convolution strides (default = c(1, 1)).

depthOfQueries          defines the number of filters for the queries (and keys). A value between 0 and 1 is interpreted as a fraction of numberOfOutputFilters (default = 0.2).

depthOfValues           defines the number of filters for the values. A value between 0 and 1 is interpreted as a fraction of numberOfOutputFilters (default = 0.2).

numberOfAttentionHeads  number of attention heads (default = 8). Note that the resulting query and value depths must be evenly divisible by the number of heads.

useRelativeEncodings    boolean for whether to use relative encodings (default = TRUE).
References

https://arxiv.org/abs/1904.09925

with the implementation ported from the following repository:

https://github.com/titu1994/keras-attention-augmented-convs
Value

a keras tensor
Author(s)

Tustison NJ
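Examples

A minimal sketch of how this layer might be wired into a model, assuming the keras R package and the package providing this function are installed; the input shape, filter counts, and head count below are illustrative choices, not values from the source.

```r
library(keras)
# The package exporting layer_attention_augmented_convolution_block_2d
# (e.g. via library(...)) is assumed to be loaded.

# Illustrative input: 64 x 64 images with 32 channels.
inputs <- layer_input(shape = c(64, 64, 32))

# Fractional depths are interpreted relative to numberOfOutputFilters:
# 0.25 * 64 = 16 filters for queries/keys and for values, which is
# evenly divisible by the 8 attention heads, as the note above requires.
outputs <- layer_attention_augmented_convolution_block_2d(
  inputs,
  numberOfOutputFilters = 64,
  kernelSize = c(3, 3),
  strides = c(1, 1),
  depthOfQueries = 0.25,
  depthOfValues = 0.25,
  numberOfAttentionHeads = 8,
  useRelativeEncodings = TRUE
)

model <- keras_model(inputs = inputs, outputs = outputs)
```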