count_tokens: counts the frequency of unique tokens


View source: R/count_tokens.R

Description

Calls the word tokenizer, then counts the frequency of each unique word.
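The internals live in R/count_tokens.R and are not reproduced here; a minimal sketch of the described behavior (split into word tokens, then tabulate), assuming stringr's `str_split` with `boundary("word")` as the tokenizer, might look like:

```r
library(stringr)

# Hypothetical sketch of count_tokens: tokenize on word boundaries,
# then count each unique token. Not the package's actual implementation.
count_tokens_sketch <- function(x) {
  stopifnot(is.character(x), length(x) == 1)
  words <- str_split(x, boundary("word"))[[1]]  # split x into word tokens
  table(words)                                  # frequency of each unique token
}

count_tokens_sketch("one fish two fish")
```

Note that `boundary("word")` drops punctuation, which is why a trailing "is." in the Examples section below still counts toward "is".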

Usage

count_tokens(x)

Arguments

x

A character vector of length 1.

Value

A table of word frequencies.

See Also

str_split, boundary, tokenize

Examples

count_tokens("If this function is working right, it is going to return a count of three for the word, 'is.'")

markallenthornton/affectr documentation built on May 17, 2019, 2:15 a.m.