tokenize | R Documentation
Turns input into a character vector. Usually the tokenization is done purely in C++, and never exposed to R (because that requires a copy). This function is useful for testing, or when a file doesn't parse correctly and you want to see the underlying tokens.
tokenize(file, tokenizer = tokenizer_csv(), skip = 0, n_max = -1L)
file | Either a path to a file, a connection, or literal data (either a single string or a raw vector). Files ending in .gz, .bz2, .xz, or .zip will be automatically uncompressed. Literal data is most useful for examples and tests. To be recognised as literal data, the input must be either wrapped with I(), be a string containing at least one new line, or be a vector containing at least one string with a new line. Using a value of clipboard() will read from the system clipboard. |
tokenizer | A tokenizer specification. |
skip | Number of lines to skip before reading data. |
n_max | Optionally, the maximum number of rows to tokenize. |
tokenize("1,2\n3,4,5\n\n6")
# Only tokenize first two lines
tokenize("1,2\n3,4,5\n\n6", n_max = 2)
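As a sketch of how the skip argument combines with literal data (the input string here is illustrative, not from the original examples):

```r
library(readr)

# Skip the first line before tokenizing; only "3,4,5", the blank
# line, and "6" are seen by the tokenizer.
tokenize("1,2\n3,4,5\n\n6", skip = 1)

# A non-default tokenizer specification can be passed explicitly,
# e.g. a tab-separated tokenizer for TSV-style input.
tokenize("1\t2\n3\t4", tokenizer = tokenizer_tsv())
```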