Helper functions for the Self-Attention algorithm, together with demonstration vignettes of increasing depth that show how to construct it.
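To illustrate what the package helps construct, here is a minimal sketch of scaled dot-product self-attention in base R. The function names, argument names, and structure below are illustrative assumptions, not the package's actual API.

```r
# Illustrative sketch only -- not the attention package's own functions.

# row-wise softmax helper
softmax <- function(x) exp(x) / sum(exp(x))

self_attention <- function(X, W_q, W_k, W_v) {
  Q <- X %*% W_q                          # queries
  K <- X %*% W_k                          # keys
  V <- X %*% W_v                          # values
  scores  <- Q %*% t(K) / sqrt(ncol(K))   # scaled dot products
  weights <- t(apply(scores, 1, softmax)) # attention weights, rows sum to 1
  weights %*% V                           # weighted sum of values
}

# toy example: 3 tokens, embedding dimension 4
set.seed(1)
X   <- matrix(rnorm(12), nrow = 3)
W_q <- matrix(rnorm(16), nrow = 4)
W_k <- matrix(rnorm(16), nrow = 4)
W_v <- matrix(rnorm(16), nrow = 4)
self_attention(X, W_q, W_k, W_v)  # 3 x 4 matrix of attended representations
```

Each row of the output is a mixture of the value vectors, weighted by how strongly that token's query matches every token's key.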
The package can be installed from CRAN using:
install.packages('attention')
The development version, to be used at your own peril, can be installed from GitHub using the remotes package:
if (!require('remotes')) install.packages('remotes')
remotes::install_github('bquast/attention')
Development takes place on the GitHub page: https://github.com/bquast/attention
Bugs can be filed on the GitHub issues page: https://github.com/bquast/attention/issues