| Topic | Title |
|---|---|
| all_tokenized | Role Selection |
| count_functions | List of all feature counting functions |
| emoji_samples | Sample sentences with emojis |
| pipe | Pipe operator |
| reexports | Objects exported from other packages |
| required_pkgs.step | S3 methods for tracking which additional packages are needed... |
| show_tokens | Show token output of recipe |
| step_clean_levels | Clean Categorical Levels |
| step_clean_names | Clean Variable Names |
| step_dummy_hash | Indicator Variables via Feature Hashing |
| step_lda | Calculate LDA Dimension Estimates of Tokens |
| step_lemma | Lemmatization of Token Variables |
| step_ngram | Generate n-grams From Token Variables |
| step_pos_filter | Part of Speech Filtering of Token Variables |
| step_sequence_onehot | Positional One-Hot Encoding of Tokens |
| step_stem | Stemming of Token Variables |
| step_stopwords | Filtering of Stop Words for Token Variables |
| step_textfeature | Calculate Set of Text Features |
| step_texthash | Feature Hashing of Tokens |
| step_text_normalization | Normalization of Character Variables |
| step_tf | Term Frequency of Tokens |
| step_tfidf | Term Frequency-Inverse Document Frequency of Tokens |
| step_tokenfilter | Filter Tokens Based on Term Frequency |
| step_tokenize | Tokenization of Character Variables |
| step_tokenize_bpe | BPE Tokenization of Character Variables |
| step_tokenize_sentencepiece | SentencePiece Tokenization of Character Variables |
| step_tokenize_wordpiece | WordPiece Tokenization of Character Variables |
| step_tokenmerge | Combine Multiple Token Variables Into One |
| step_untokenize | Untokenization of Token Variables |
| step_word_embeddings | Pretrained Word Embeddings of Tokens |
| textrecipes-package | textrecipes: Extra 'Recipes' for Text Processing |
| tidy.recipe | Tidy the Result of a Recipe |
| tokenlist | Create Token Object |
| tunable_textrecipes | tunable methods for textrecipes |
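
The steps listed above compose with `recipes::recipe()` in the usual pipeline style. Below is a minimal sketch combining several of them; the data frame `reviews` and its columns `score` and `text` are hypothetical stand-ins, not objects shipped with the package.

```r
library(recipes)
library(textrecipes)

# `reviews` is a hypothetical data frame with a numeric outcome `score`
# and a character predictor `text`.
rec <- recipe(score ~ text, data = reviews) %>%
  step_tokenize(text) %>%                       # split text into word tokens
  step_stopwords(text) %>%                      # remove common stop words
  step_tokenfilter(text, max_tokens = 100) %>%  # keep the 100 most frequent tokens
  step_tfidf(text)                              # weight tokens by tf-idf

# Estimate the steps from the data, then apply them.
baked <- prep(rec) %>% bake(new_data = NULL)
```

To inspect intermediate tokens while building such a pipeline, `show_tokens()` (listed above) can be applied to a recipe that contains a tokenizing step.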