compute_attention_component

View source: R/attention-layers.R, R/attention-utils.R

Description

antecedent: Tensor with shape [batch, length, channels]

depth: integer specifying the depth of the projection layer

filter_width: width of the convolution used to compute the attention component

padding: one of c("valid", "same", "left")

Usage

compute_attention_component(
  antecedent,
  depth,
  filter_width = 1L,
  padding = "valid",
  name = "c",
  vars_3d_num_heads = 0L
)

ifrit98/transformR documentation built on Nov. 26, 2019, 2:14 a.m.