attention: Self-Attention Algorithm

Helper functions and demonstration vignettes of increasing depth showing how to construct the Self-Attention algorithm. The package is based on Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>, Dan Jurafsky and James H. Martin (2022, ISBN:978-0131873216) <https://web.stanford.edu/~jurafsky/slp3/> "Speech and Language Processing (3rd ed.)", and Alex Graves (2020) <https://www.youtube.com/watch?v=AIiwuClvH6k> "Attention and Memory in Deep Learning".
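
As an illustrative sketch only (not the package's own API), the scaled dot-product self-attention of Vaswani et al. (2017) can be written in a few lines of base R; the input matrix X below is a small hypothetical example.

    # Row-wise softmax, subtracting the row maximum for numerical stability
    softmax_rows <- function(x) {
      e <- exp(x - apply(x, 1, max))
      e / rowSums(e)
    }

    # Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V
    self_attention <- function(Q, K, V) {
      d_k    <- ncol(K)
      scores <- Q %*% t(K) / sqrt(d_k)  # similarity of each query with each key
      softmax_rows(scores) %*% V        # weighted sum of the value vectors
    }

    # Tiny hypothetical example: 3 tokens with embedding dimension 4
    set.seed(1)
    X <- matrix(rnorm(12), nrow = 3, ncol = 4)
    self_attention(Q = X, K = X, V = X)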

Package details

Author: Bastiaan Quast [aut, cre] (<https://orcid.org/0000-0002-2951-3577>)
Maintainer: Bastiaan Quast <bquast@gmail.com>
License: GPL (>= 3)
Version: 0.4.0
Package repository: CRAN
Installation

Install the latest version of this package by entering the following in R:
install.packages("attention")
