Utilizes tokenization and dictionary lookup to lemmatize text. Lemmatization is "grouping together the inflected forms of a word so they can be analysed as a single item" (Wikipedia). While dictionary lookup of tokens is not a true morphological analysis, this style of lemma replacement is fast and typically robust enough for many applications.
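The approach above can be sketched as follows. This is a minimal illustration, not the package's actual implementation: the dictionary entries and function name are hypothetical, and real lemma dictionaries contain many thousands of inflected forms.

```python
import re

# Hypothetical lemma dictionary: inflected form -> lemma.
# Real dictionaries are far larger and typically loaded from data files.
LEMMA_DICT = {
    "ran": "run", "running": "run", "runs": "run",
    "geese": "goose", "better": "good", "analysed": "analyse",
}

def lemmatize(text):
    """Tokenize on word characters, then replace each token with its
    dictionary lemma when one exists; unknown tokens pass through
    unchanged (no morphological analysis is attempted)."""
    tokens = re.findall(r"\w+", text.lower())
    return [LEMMA_DICT.get(tok, tok) for tok in tokens]

print(lemmatize("The geese ran and are running"))
# ['the', 'goose', 'run', 'and', 'are', 'run']
```

Because each token is handled by a single hash lookup, the whole pass is linear in the number of tokens, which is why this style is fast compared with rule-based or model-based morphological analyzers.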