Utilizes tokenization and dictionary lookup for lemmatization of text. Lemmatization is defined as "grouping together the inflected forms of a word so they can be analysed as a single item" (Wikipedia). While dictionary lookup of tokens is not a true morphological analysis, this style of lemma replacement is fast and typically robust enough for many applications.
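As an illustration of the approach, here is a minimal sketch of dictionary-lookup lemmatization in base R. The `lemma_dict` table is a toy example for demonstration only, not the dictionary this package ships with:

```r
# Toy lemma dictionary: inflected form -> lemma (illustrative only)
lemma_dict <- c(ran = "run", running = "run", geese = "goose", better = "good")

lemmatize <- function(text) {
  tokens <- unlist(strsplit(tolower(text), "\\s+"))  # tokenize on whitespace
  lemmas <- ifelse(tokens %in% names(lemma_dict),
                   lemma_dict[tokens],                # replace known inflected forms
                   tokens)                            # keep unknown tokens as-is
  paste(lemmas, collapse = " ")
}

lemmatize("The geese ran better")
#> "the goose run good"
```

Because each token is resolved by a single lookup rather than morphological parsing, the cost is roughly linear in the number of tokens, which is what makes this style of lemma replacement fast.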
| Package details | |
| --- | --- |
| Maintainer | Tyler Rinker <tyler.rinker@gmail.com> |
| License | GPL-2 |
| Version | 0.0.1 |
| Package repository | View on GitHub |
Installation

Install the latest version of this package by entering the following in R:
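Since the package repository is hosted on GitHub, installation would typically look like the sketch below. The repository path is an assumption (Tyler Rinker's GitHub handle is `trinker`, but substitute the actual path from the "View on GitHub" link above):

```r
# Assumed repository path -- replace "trinker/<package-name>" with the
# actual path from the "View on GitHub" link.
install.packages("remotes")  # if not already installed
remotes::install_github("trinker/<package-name>")
```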