layer_attention_2d: Attention layer (2-D)

View source: R/attentionUtilities.R

layer_attention_2d    R Documentation

Attention layer (2-D)

Description

Wraps the AttentionLayer2D adapted from the Python implementations referenced in the Details section.

Usage

layer_attention_2d(
  object,
  numberOfChannels,
  doGoogleBrainVersion = TRUE,
  trainable = TRUE
)

Arguments

object

Object to compose the layer with. This is either a keras::keras_model_sequential to add the layer to, or another Layer which this layer will call. Both styles are sketched below, after this argument list.

numberOfChannels

integer specifying the number of channels (filters) in the input layer.

doGoogleBrainVersion

boolean. Whether to use the variant described at the second URL in the Details section.

trainable

Whether the layer weights will be updated during training.
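
A minimal sketch of the two composition styles accepted by object (assuming only the standard keras R interface; the filter counts and shapes are illustrative):

library( keras )
library( ANTsRNet )

# Style 1: add the layer to a sequential model
model <- keras_model_sequential() %>%
  layer_conv_2d( filters = 32, kernel_size = 3,
                 input_shape = c( 64, 64, 1 ) ) %>%
  layer_attention_2d( numberOfChannels = 32 )

# Style 2: call the layer on another layer's output tensor
input <- layer_input( shape = c( 64, 64, 1 ) )
output <- input %>%
  layer_conv_2d( filters = 32, kernel_size = 3 ) %>%
  layer_attention_2d( numberOfChannels = 32 )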

Details

Python implementations:

https://stackoverflow.com/questions/50819931/self-attention-gan-in-keras

https://github.com/taki0112/Self-Attention-GAN-Tensorflow

Based on the following paper:

https://arxiv.org/abs/1805.08318
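
As a sketch of the self-attention computation in that paper (notation follows arXiv:1805.08318; the layer's internals may differ in detail), the input features x are projected by learned 1x1 convolutions f, g, h, and v:

s_{ij} = f(x_i)^{\top} g(x_j), \qquad
\beta_{j,i} = \frac{ \exp( s_{ij} ) }{ \sum_{i=1}^{N} \exp( s_{ij} ) }

o_j = v\!\left( \sum_{i=1}^{N} \beta_{j,i} \, h(x_i) \right), \qquad
y_i = \gamma \, o_i + x_i

Here \beta_{j,i} is the attention paid to location i when synthesizing location j, and \gamma is a learnable scalar initialized to zero, so the layer starts as an identity mapping and gradually learns to weight non-local evidence.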

Value

A Keras layer tensor.

Examples


## Not run: 
library( keras )
library( ANTsRNet )

# 2-D input, e.g., a 100x100 RGB image
inputShape <- c( 100, 100, 3 )
input <- layer_input( shape = inputShape )

# Convolve to numberOfFilters channels, then apply self-attention
# with a matching number of channels
numberOfFilters <- 64
outputs <- input %>% layer_conv_2d( filters = numberOfFilters, kernel_size = 2 )
outputs <- outputs %>% layer_attention_2d( numberOfFilters )

model <- keras_model( inputs = input, outputs = outputs )

## End(Not run)
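
As a quick shape check (a sketch building on the example above; it assumes the conv layer's default "valid" padding and that the attention layer preserves its input shape, as in the paper's formulation):

x <- array( rnorm( 100 * 100 * 3 ), dim = c( 1, 100, 100, 3 ) )
y <- predict( model, x )
dim( y )  # 1 99 99 64: spatial size set by the conv layer, unchanged by attention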
