Tokenizer: Tokenizer

View source: R/text_core.R

Tokenizer                R Documentation

Tokenizer

Description

Provides a consistent 'Transform' interface to tokenizers operating on 'DataFrame's and folders.

Usage

Tokenizer(
  tok,
  rules = NULL,
  counter = NULL,
  lengths = NULL,
  mode = NULL,
  sep = " "
)

Arguments

tok

the tokenizer to wrap (e.g. a word- or subword-level tokenizer)

rules

preprocessing rules applied to the texts before tokenization

counter

a counter of token frequencies collected during tokenization

lengths

the lengths of the tokenized texts

mode

tokenization mode (set automatically when constructing from a folder or a 'DataFrame')

sep

separator used to join tokens (default " ")

Value

None
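
Examples

A minimal sketch of typical usage, assuming the fastai R package is installed with its Python backend, and that 'WordTokenizer' is exposed by the package as in the underlying fastai library:

## Not run: 
library(fastai)

# wrap a word-level tokenizer in the consistent 'Transform' interface
tkn = Tokenizer(tok = WordTokenizer())

## End(Not run)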


EagerAI/fastai documentation built on April 16, 2024, 12:01 p.m.