Help topic | Description |
--- | --- |
attention_bert | BERT-Style Attention |
available_berts | Available BERT Models |
config_bert | BERT Model Parameters |
dataset_bert | BERT Dataset |
dataset_bert_pretrained | BERT Pretrained Dataset |
dot-combine_segments | Combine a pair of segments |
dot-concatenate_qkv_weights | Concatenate Attention Weights |
dot-default_tokenizer | Shortcut to make sure we're using wordpiece |
dot-default_vocab | Shortcut to make sure we're using wordpiece |
dot-download_weights | Download and Cache Weights |
dot-error_on_tokenizer_mismatch | Error Helper Function for Mismatches |
dot-finalize_bert_tokens | Clean and Return BERT Tokens |
dot-get_tokenizer | Look Up Tokenizer Function |
dot-get_tokenizer_name | Look Up Tokenizer Name |
dot-get_token_vocab | Look Up Token Vocabulary |
dot-get_vocab_name | Look Up Vocabulary Name |
dot-maybe_alert | Filter Alerts through a Verbose Flag |
dot-process_downloaded_weights | Process Downloaded Weights |
dot-rename_state_dict_variables | Clean up Parameter Names |
dot-standardize_bert_dataset_outcome | Standardize BERT Dataset Outcome |
dot-standardize_bert_dataset_predictors | Standardize BERT Dataset Predictors |
dot-tokenize_bert_single | Tokenize a single vector of text |
dot-validate_n_tokens | Make Sure the Number of Tokens Makes Sense |
dot-validate_tokenizer_metadata | Choose Tokenizer Metadata |
dot-validate_tokenizer_scheme | Make Sure Tokenizer Schemes are Recognized |
embeddings_bert | Create BERT Embeddings |
increment_list_index | Convert from Python Standard to torch |
luz_callback_bert_tokenize | BERT Tokenization Callback |
model_bert | Construct a BERT Model |
model_bert_pretrained | Construct a Pretrained BERT Model |
position_embedding | Create Position Embeddings |
proj_add_norm | Project, Add, and Normalize |
simplify_bert_token_list | Simplify Token List to Matrix |
tokenize_bert | Prepare Text for a BERT Model |
torchtransformers-package | torchtransformers: Transformer Models in Torch |
transformer_encoder_bert | Transformer Stack |
transformer_encoder_single_bert | Single Transformer Layer |
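The topics above span the package's end-to-end workflow: tokenize text, construct or download a BERT model, and wrap tokenization into a training callback. A minimal usage sketch follows; the specific model name and argument forms are assumptions, so consult each topic's help page (`?tokenize_bert`, `?model_bert_pretrained`) for the actual signatures:

```r
library(torchtransformers)

# List the pretrained BERT variants the package knows how to download
# (see the "Available BERT Models" topic).
available_berts()

# Prepare text for a BERT model using the package's wordpiece tokenizer
# (argument names are illustrative; see ?tokenize_bert).
tokens <- tokenize_bert(c("The first sentence.", "And another."))

# Construct a pretrained BERT model; weights are downloaded and cached
# on first use (see ?model_bert_pretrained). The model name here is an
# assumption taken from the general BERT-variant naming convention.
model <- model_bert_pretrained("bert_tiny_uncased")
```

The remaining `dot-` topics document internal helpers (weight downloading, tokenizer validation, state-dict renaming) that these user-facing functions call behind the scenes.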