Man pages for macmillancontentscience/torchtransformers
Transformer Models in Torch

attention_bert: BERT-Style Attention
available_berts: Available BERT Models
config_bert: BERT Model Parameters
dataset_bert: BERT Dataset
dataset_bert_pretrained: BERT Pretrained Dataset
.combine_segments: Combine a pair of segments
.concatenate_qkv_weights: Concatenate Attention Weights
.default_tokenizer: Shortcut to make sure we're using wordpiece
.default_vocab: Shortcut to make sure we're using wordpiece
.download_weights: Download and Cache Weights
.error_on_tokenizer_mismatch: Error Helper Function for Mismatches
.finalize_bert_tokens: Clean and Return BERT Tokens
.get_tokenizer: Look Up Tokenizer Function
.get_tokenizer_name: Look Up Tokenizer Name
.get_token_vocab: Look Up Token Vocabulary
.get_vocab_name: Look Up Vocabulary Name
.maybe_alert: Filter Alerts through a Verbose Flag
.process_downloaded_weights: Process Downloaded Weights
.rename_state_dict_variables: Clean up Parameter Names
.standardize_bert_dataset_outcome: Standardize BERT Dataset Outcome
.standardize_bert_dataset_predictors: Standardize BERT Dataset Predictors
.tokenize_bert_single: Tokenize a single vector of text
.validate_n_tokens: Make Sure the Number of Tokens Makes Sense
.validate_tokenizer_metadata: Choose Tokenizer Metadata
.validate_tokenizer_scheme: Make Sure Tokenizer Schemes are Recognized
embeddings_bert: Create BERT Embeddings
increment_list_index: Convert from Python Standard to torch
luz_callback_bert_tokenize: BERT Tokenization Callback
model_bert: Construct a BERT Model
model_bert_pretrained: Construct a Pretrained BERT Model
position_embedding: Create Position Embeddings
proj_add_norm: Project, Add, and Normalize
simplify_bert_token_list: Simplify Token List to Matrix
tokenize_bert: Prepare Text for a BERT Model
torchtransformers-package: torchtransformers: Transformer Models in Torch
transformer_encoder_bert: Transformer Stack
transformer_encoder_single_bert: Single Transformer Layer
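
As a quick orientation, the sketch below strings together a few of the exported topics from the index (available_berts, model_bert_pretrained, tokenize_bert). It is a minimal sketch, not the package's documented workflow: the exact argument names (bert_type, text, n_tokens) and the model name "bert_tiny_uncased" are assumptions here, so check the individual man pages above before relying on them.

    # Minimal sketch of the torchtransformers workflow; argument names
    # and the model identifier are assumptions -- see the man pages.
    library(torchtransformers)

    # available_berts() lists the pretrained model names that
    # model_bert_pretrained() knows how to fetch.
    available_berts()

    # Construct a small pretrained BERT. On first use the weights are
    # downloaded and cached (see .download_weights and
    # .process_downloaded_weights in the index above).
    model <- model_bert_pretrained(bert_type = "bert_tiny_uncased")

    # Prepare raw text for the model: tokenize_bert() returns token ids
    # and related metadata, padded or truncated to n_tokens.
    tokens <- tokenize_bert(
      text = c("The torch package is great!", "BERT models are handy."),
      n_tokens = 32L
    )
    str(tokens)

For training loops, luz_callback_bert_tokenize (also listed above) wraps this tokenization step as a luz callback so it happens automatically during fitting.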