tokenize_files: Tokenize text files in parallel


View source: R/text_core.R

Description

Tokenize the text 'files' in parallel using 'n_workers' worker processes.

Usage

tokenize_files(
  files,
  path,
  output_dir,
  output_names = NULL,
  n_workers = 6,
  rules = NULL,
  tok = NULL,
  encoding = "utf8",
  skip_if_exists = FALSE
)

Arguments

files

character vector of text files to tokenize

path

path containing the files

output_dir

directory in which the tokenized output is written

output_names

optional names for the output files (default NULL)

n_workers

number of parallel workers (default 6)

rules

optional tokenization rules to apply (default NULL)

tok

tokenizer to use (default NULL)

encoding

file encoding (default "utf8")

skip_if_exists

whether to skip files whose tokenized output already exists (default FALSE)

Value

None. The function is called for its side effect of writing tokenized output to 'output_dir'.
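
Examples

A minimal sketch of a call, assuming the fastai package is installed and attached and that "texts/doc1.txt" and "texts/doc2.txt" are hypothetical UTF-8 text files; the output directory name "texts_tok" is likewise illustrative.

# Not run: requires the underlying fastai Python library
library(fastai)

# Hypothetical input files living under a "texts" directory
files <- c("texts/doc1.txt", "texts/doc2.txt")

tokenize_files(
  files = files,
  path = "texts",
  output_dir = "texts_tok",
  n_workers = 2,
  encoding = "utf8",
  skip_if_exists = TRUE
)
# Writes one tokenized file per input into "texts_tok" and returns nothing.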

