Description

Define the architecture of a network.
Arguments

brain
An object of class `brain`.

layers
A single integer or a vector of integers specifying the size of each layer.

input, pool, output
The size of the layer as an integer.

connections
The number of random connections.

gates
The number of random gates among the connections.
Functions

perceptron: Creates multilayer perceptrons, also known as feed-forward neural networks. They consist of a sequence of layers, each fully connected to the next.

lstm: The long short-term memory architecture is well suited to learning from experience to classify, process, and predict time series when there are very long time lags of unknown size between important events.

liquid: Creates Liquid State Machines. In these networks, neurons are randomly connected to each other; the recurrent nature of the connections turns the time-varying input into a spatio-temporal pattern of activations across the network nodes.

hopfield: Serves as content-addressable memory. The network is trained to remember patterns; when fed a new pattern, it returns the most similar of the patterns it was trained to remember.
Note

Hopfield networks are trained and run with different functions; see hopfield.
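The other architectures are built the same way as the perceptron shown in the Examples section: pipe a `brain` object into the constructor. The sketch below is illustrative only; beyond `perceptron()`, the exact constructor signatures are not shown on this page, so the argument names here are assumptions based on the Arguments list above. Check the package help before relying on them.

```r
library(brain) # assumed package name; assumed to provide brain() and the %>% pipe

# Liquid State Machine: layer sizes plus random connections and gates,
# mirroring the input/pool/output/connections/gates arguments above
# (assumed signature).
lsm <- brain() %>%
  liquid(input = 2, pool = 20, output = 1, connections = 30, gates = 10)

# Hopfield content-addressable memory; the pattern size is assumed to be
# passed as a single integer (assumed signature).
hop <- brain() %>%
  hopfield(10)
```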
See Also

squash to set activation functions.
Examples

brain() %>%
  perceptron(c(2, 3, 1))