create_attention_mask_from_input_mask    R Documentation
Description

An attention mask is used to zero out specific elements of an attention matrix, for example to prevent the model from "paying attention to the answer" in certain training tasks.
Usage

create_attention_mask_from_input_mask(from_tensor, to_mask)
Arguments

from_tensor    2D or 3D Tensor of shape [batch_size, from_seq_length, ...].

to_mask    int32 Tensor of shape [batch_size, to_seq_length].
Value

float Tensor of shape [batch_size, from_seq_length, to_seq_length].
Examples

## Not run:
with(tensorflow::tf$variable_scope("examples",
  reuse = tensorflow::tf$AUTO_REUSE
), {
  from_tensor <- ids <- tensorflow::tf$get_variable("ften",
    dtype = "float",
    shape = c(10, 20)
  )
  to_mask <- ids <- tensorflow::tf$get_variable("mask",
    dtype = "int32",
    shape = c(10, 30)
  )
})
create_attention_mask_from_input_mask(from_tensor, to_mask)
## End(Not run)
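Conceptually, the returned mask is just to_mask converted to float and broadcast across the from_seq_length dimension, so every query position sees the same 0/1 pattern over the key positions. A minimal NumPy sketch of that computation (an illustration of the shape arithmetic only, not the package's actual TensorFlow implementation; the function name is made up for this sketch):

```python
import numpy as np

def attention_mask_sketch(from_tensor, to_mask):
    # from_tensor: array of shape [batch, from_seq, ...] (only its
    # first two dimensions matter here).
    # to_mask: int array of shape [batch, to_seq] with 0/1 entries.
    batch, from_seq = from_tensor.shape[0], from_tensor.shape[1]
    to_seq = to_mask.shape[1]
    # Float mask reshaped to [batch, 1, to_seq].
    mask = to_mask.astype(np.float32).reshape(batch, 1, to_seq)
    # Ones over the query positions: [batch, from_seq, 1].
    ones = np.ones((batch, from_seq, 1), dtype=np.float32)
    # Broadcasting yields [batch, from_seq, to_seq].
    return ones * mask
```

Each slice result[b, i, :] equals to_mask[b, :], so masked-out key positions are zeroed identically for every query position i.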