Tokenizer: Tokenizer

View source: R/text_core.R


Tokenizer

Description

Provides a consistent 'Transform' interface to tokenizers operating on 'DataFrame's and folders.

Usage

Tokenizer(
  tok,
  rules = NULL,
  counter = NULL,
  lengths = NULL,
  mode = NULL,
  sep = " "
)

Arguments

tok

the tokenizer object to wrap, e.g. the result of 'WordTokenizer()'

rules

custom preprocessing rules applied to the texts before tokenization; when 'NULL', the default fastai rules are used

counter

a token counter used to collect token frequencies

lengths

precomputed lengths of the texts

mode

tokenization mode

sep

separator used to join tokens back into a single string (default '" "')

Value

A 'Tokenizer' object that can be used as a 'Transform'
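
Examples

A minimal sketch of wrapping a word-level tokenizer; it assumes the fastai R package and its Python backend are installed, and that 'WordTokenizer()' is available in the same package.

## Not run:
library(fastai)

# Create a base word tokenizer and wrap it in the Transform interface
# (WordTokenizer() is assumed to be exported by the fastai R package)
tok <- Tokenizer(tok = WordTokenizer())

## End(Not run)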


fastai documentation built on March 21, 2022, 9:07 a.m.