BertModel: Construct object of class BertModel

View source: R/modeling.R

BertModel                R Documentation

Construct object of class BertModel

Description

An object of class BertModel has several elements:

embedding_output

float Tensor of shape [batch_size, seq_length, hidden_size] corresponding to the output of the embedding layer, after summing the word embeddings with the positional embeddings and the token type embeddings, then performing layer normalization. This is the input to the transformer.

embedding_table

The table for the token embeddings.

all_encoder_layers

A list of float Tensors of shape [batch_size, seq_length, hidden_size], one per transformer layer, giving the hidden states at each layer of the encoder.

sequence_output

float Tensor of shape [batch_size, seq_length, hidden_size] corresponding to the final hidden layer of the transformer encoder.

pooled_output

The output of a dense layer applied to the final hidden state of the first token; often used as a fixed-length summary of the sequence for classification tasks.

Usage

BertModel(
  config,
  is_training,
  input_ids,
  input_mask = NULL,
  token_type_ids = NULL,
  scope = NULL
)

Arguments

config

BertConfig instance.

is_training

Logical; TRUE to build the model for training, FALSE for evaluation. Controls whether dropout is applied.

input_ids

Int32 Tensor of shape [batch_size, seq_length].

input_mask

(optional) Int32 Tensor of shape [batch_size, seq_length], with 1 indicating real tokens and 0 indicating padding.

token_type_ids

(optional) Int32 Tensor of shape [batch_size, seq_length], giving the segment (token type) index for each token.

scope

(optional) Character; name for variable scope. Defaults to "bert".

Value

An object of class BertModel.

Examples

## Not run: 
with(tensorflow::tf$variable_scope("examples",
  reuse = tensorflow::tf$AUTO_REUSE
), {
  input_ids <- tensorflow::tf$constant(list(
    list(31L, 51L, 99L),
    list(15L, 5L, 0L)
  ))

  input_mask <- tensorflow::tf$constant(list(
    list(1L, 1L, 1L),
    list(1L, 1L, 0L)
  ))
  token_type_ids <- tensorflow::tf$constant(list(
    list(0L, 0L, 1L),
    list(0L, 2L, 0L)
  ))
  config <- BertConfig(
    vocab_size = 32000L,
    hidden_size = 768L,
    num_hidden_layers = 8L,
    num_attention_heads = 12L,
    intermediate_size = 1024L
  )
  model <- BertModel(
    config = config,
    is_training = TRUE,
    input_ids = input_ids,
    input_mask = input_mask,
    token_type_ids = token_type_ids
  )
})
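
# The returned object exposes the elements described above; a sketch of
# accessing them (shapes assume the inputs constructed in this example):
# model$sequence_output     # float Tensor, shape [2, 3, 768]: final hidden layer
# model$pooled_output       # float Tensor, shape [2, 768]: first-token summary
# model$all_encoder_layers  # list of per-layer hidden-state Tensors
# model$embedding_table     # token embedding table, shape [32000, 768]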

## End(Not run)

jonathanbratt/RBERT documentation built on Jan. 26, 2023, 4:15 p.m.