tests/testthat/test-Tokenizer.R

context("Tokenizers")

test_that("scan_tokenizer works with character vectors", {
  tokens <-
      c("a", "character", "vector", "consisting", "of", "multiple", "elements")
  expect_equal(scan_tokenizer(c(paste0(tokens[1:3], collapse = " "),
                                paste0(tokens[4:5], collapse = " "),
                                paste0(tokens[6:7], collapse = " "))), tokens)
})
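
# A complementary check (a minimal sketch, not part of the original file),
# assuming scan_tokenizer() coerces non-character input with as.character():
# a PlainTextDocument built from the same joined strings should then
# tokenize back to the original vector as well.
test_that("scan_tokenizer works with PlainTextDocument objects", {
  tokens <-
      c("a", "character", "vector", "consisting", "of", "multiple", "elements")
  doc <- PlainTextDocument(c(paste0(tokens[1:3], collapse = " "),
                             paste0(tokens[4:5], collapse = " "),
                             paste0(tokens[6:7], collapse = " ")))
  expect_equal(scan_tokenizer(doc), tokens)
})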
