attention: Self-Attention Algorithm

Helper functions and demonstration vignettes of increasing depth showing how to construct the Self-Attention algorithm. Based on Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>, Dan Jurafsky and James H. Martin (2022, ISBN:978-0131873216) "Speech and Language Processing (3rd ed.)", and Alex Graves (2020) "Attention and Memory in Deep Learning".
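The computation the vignettes build up is the scaled dot-product self-attention of Vaswani et al. (2017). A minimal sketch of that computation is given below, written in Python/NumPy for illustration only; the package itself is written in R, and the function and variable names here are illustrative, not the package's API:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / np.sum(e, axis=axis, keepdims=True)

def self_attention(X, W_Q, W_K, W_V):
    """Scaled dot-product self-attention (Vaswani et al., 2017).

    X: (n_tokens, d_model) input embeddings
    W_Q, W_K, W_V: (d_model, d_k) projection matrices
    """
    Q = X @ W_Q  # queries
    K = X @ W_K  # keys
    V = X @ W_V  # values
    d_k = K.shape[-1]
    # Attention scores: every token attends to every token,
    # scaled by sqrt(d_k) to keep the dot products in range
    scores = Q @ K.T / np.sqrt(d_k)      # (n_tokens, n_tokens)
    weights = softmax(scores, axis=-1)   # rows sum to 1
    return weights @ V                   # (n_tokens, d_k)
```

Each output row is a weighted average of the value vectors, with weights given by the softmax-normalized query-key similarities.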

Package details

Author: Bastiaan Quast [aut, cre]
Maintainer: Bastiaan Quast
License: GPL (>= 3)
Repository: CRAN
Installation: install the latest version of this package by entering the following in R:

install.packages('attention')


attention documentation built on Nov. 10, 2023, 9:09 a.m.