tokenize_files: Tokenize Files

View source: R/text_core.R

tokenize_files {fastai}    R Documentation

Tokenize Files

Description

Tokenize the text 'files' in parallel using 'n_workers' workers and write the tokenized output to 'output_dir'.

Usage

tokenize_files(
  files,
  path,
  output_dir,
  output_names = NULL,
  n_workers = 6,
  rules = NULL,
  tok = NULL,
  encoding = "utf8",
  skip_if_exists = FALSE
)

Arguments

files

The text files to tokenize.

path

Path to the folder containing the files.

output_dir

Directory in which the tokenized files are written.

output_names

Optional names for the tokenized output files.

n_workers

Number of parallel workers.

rules

Custom preprocessing rules to apply around tokenization (package defaults are used when NULL).

tok

Tokenizer to use (a default tokenizer is used when NULL).

encoding

Encoding of the input files.

skip_if_exists

If TRUE, skip tokenization when the output already exists.

Value

None. The tokenized files are written to 'output_dir' as a side effect.
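
Examples

A minimal usage sketch (not run): the folder names and file pattern are illustrative, and the default tokenizer and preprocessing rules are assumed.

## Not run: 
library(fastai)

# Illustrative paths: a folder of plain-text files to tokenize.
path <- "texts"
files <- list.files(path, pattern = "\\.txt$", full.names = TRUE)

# Tokenize all files in parallel with 2 workers and write the
# tokenized copies to "texts_tok", keeping the default tokenizer
# and preprocessing rules (tok = NULL, rules = NULL).
tokenize_files(
  files = files,
  path = path,
  output_dir = "texts_tok",
  n_workers = 2,
  skip_if_exists = TRUE
)

## End(Not run)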

