create_attention_mask_from_input_mask: Create 3D attention mask from a 2D tensor mask

View source: R/modeling.R

create_attention_mask_from_input_mask    R Documentation

Create 3D attention mask from a 2D tensor mask

Description

An attention mask is used to zero out specific elements of an attention matrix. (For example, to prevent the model from "paying attention to the answer" in certain training tasks.) The returned mask contains 1 at positions where attention is permitted (i.e. where to_mask is 1) and 0 at positions that should be masked out.

Usage

create_attention_mask_from_input_mask(from_tensor, to_mask)

Arguments

from_tensor

2D or 3D Tensor of shape [batch_size, from_seq_length, ...].

to_mask

int32 Tensor of shape [batch_size, to_seq_length].

Value

float Tensor of shape [batch_size, from_seq_length, to_seq_length].
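Conceptually, the function broadcasts to_mask from [batch_size, to_seq_length] up to [batch_size, from_seq_length, to_seq_length]: every "from" position receives a copy of the batch entry's to_mask row. The base-R sketch below illustrates that broadcasting on plain arrays (make_mask is a hypothetical helustrative helper, not part of RBERT, which operates on TensorFlow tensors):

```r
# Illustrative re-implementation of the broadcasting on base-R arrays.
# make_mask is a hypothetical helper, not the RBERT function itself.
make_mask <- function(from_seq_length, to_mask) {
  batch_size <- nrow(to_mask)
  to_seq_length <- ncol(to_mask)
  mask <- array(0, dim = c(batch_size, from_seq_length, to_seq_length))
  for (b in seq_len(batch_size)) {
    for (f in seq_len(from_seq_length)) {
      # every "from" position gets a copy of this batch entry's to_mask row
      mask[b, f, ] <- as.numeric(to_mask[b, ])
    }
  }
  mask
}

# two sequences: the first has 2 real tokens, the second has 1
to_mask <- matrix(c(1L, 1L, 0L,
                    1L, 0L, 0L), nrow = 2, byrow = TRUE)
m <- make_mask(from_seq_length = 4, to_mask = to_mask)
dim(m)  # 2 4 3
```

Multiplying such a mask (or adding a large negative number where it is 0) into the raw attention scores prevents attention to padded or otherwise forbidden positions.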

Examples

## Not run: 
with(tensorflow::tf$variable_scope("examples",
  reuse = tensorflow::tf$AUTO_REUSE
), {
  from_tensor <- tensorflow::tf$get_variable("ften",
    dtype = "float", shape = c(10, 20)
  )
  to_mask <- tensorflow::tf$get_variable("mask",
    dtype = "int32", shape = c(10, 30)
  )
})
create_attention_mask_from_input_mask(from_tensor, to_mask)

## End(Not run)

jonathanbratt/RBERT documentation built on Jan. 26, 2023, 4:15 p.m.