Wraps the transformers Python module described in "HuggingFace's Transformers: State-of-the-art Natural Language Processing" by Wolf T. et al. (2020) <arXiv:1910.03771> in order to easily obtain contextualised embeddings of sentences. This work was done to ease the building of predictive models on top of BERT-like embeddings. BERT stands for "Bidirectional Encoder Representations from Transformers", as described in Devlin J. et al. (2018) <arXiv:1810.04805>. It is a deep learning model which, for each word piece in a text, provides a set of numbers called an embedding. These embeddings capture the meaning of the words and depend on the context in which each word appears. The package provides an interface to BERT-like "Transformer" models, so that R users can simply download a pretrained model and retrieve the embeddings of a text with a straightforward call to predict.
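Since the package's own function names are not shown on this page, the sketch below illustrates the underlying mechanism the package wraps: calling the transformers Python module from R via reticulate to obtain contextualised embeddings. The model name "bert-base-uncased" and the mean-pooling step are illustrative assumptions, not the package's API.

```r
# Minimal sketch of what the package automates: calling the Python
# transformers module from R via reticulate. Model name and pooling
# choice are illustrative assumptions.
library(reticulate)
transformers <- import("transformers")

# Download/load a pretrained BERT tokenizer and model (cached after first call)
tokenizer <- transformers$AutoTokenizer$from_pretrained("bert-base-uncased")
model     <- transformers$AutoModel$from_pretrained("bert-base-uncased")

# Tokenise a sentence into word pieces and run it through the model
enc <- tokenizer("R users can retrieve contextualised embeddings easily.",
                 return_tensors = "pt")
out <- model(enc$input_ids, attention_mask = enc$attention_mask)

# out$last_hidden_state holds one embedding vector per word piece;
# averaging over word pieces gives a simple sentence embedding
token_embeddings   <- out$last_hidden_state$detach()$numpy()[1, , ]
sentence_embedding <- colMeans(token_embeddings)
str(sentence_embedding)  # numeric vector of length 768 for bert-base models
```

The package hides these steps behind its model objects, so that in practice a single call to predict on a downloaded model returns such embeddings.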
| Package details | |
|---|---|
| Maintainer | Jan Wijffels <jwijffels@bnosac.be> |
| License | MPL-2.0 |
| Version | 0.2.0 |
| Package repository | View on GitHub |
Installation

Install the latest version of this package by entering the following in R:
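The original install snippet was not preserved on this page. A typical pattern for a GitHub-hosted R package looks like the sketch below, where "owner/repo" is a placeholder for the repository linked above, not the actual path.

```r
# Placeholder sketch: substitute the actual "owner/repo" path from the
# "View on GitHub" link above. remotes is a CRAN package for GitHub installs.
install.packages("remotes")
remotes::install_github("owner/repo")
```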