Description

Word embedding + (bidirectional) long short-term memory + deep dense layers
Usage

keras_deep_lstm2(input_dim, embed_dim = 128, seq_len = 50,
  lstm_dim = 32, lstm_drop = 0.2, bidirectional = F,
  hidden_dims = c(32, 32, 32), output_dim = 2,
  output_fun = "softmax")
Arguments

input_dim      Number of unique vocabulary tokens
embed_dim      Dimension of the word embedding vectors (default 128)
seq_len        Length of the input sequences (default 50)
lstm_dim       Number of LSTM units (default 32)
lstm_drop      Dropout rate of the LSTM layer (default 0.2)
bidirectional  Whether to wrap the LSTM in a bidirectional layer (default F)
hidden_dims    Number of neurons per dense layer, as a vector of integers (default c(32, 32, 32))
output_dim     Number of neurons in the output layer (default 2)
output_fun     Output activation function (default "softmax")
Details

Taken from https://www.kaggle.com/gidutz/text2score-keras-rnn-word-embedding
Value

A keras model
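The function body is not shown on this page. A minimal sketch of the architecture the arguments describe, assuming the keras R package with a working TensorFlow backend (the actual keras_deep_lstm2() implementation in the package may differ), could look like:

```r
library(keras)

# Sketch only: embedding -> (bidirectional) LSTM -> deep dense block -> output.
# Argument names and defaults mirror the usage section above.
build_deep_lstm <- function(input_dim, embed_dim = 128, seq_len = 50,
                            lstm_dim = 32, lstm_drop = 0.2, bidirectional = F,
                            hidden_dims = c(32, 32, 32), output_dim = 2,
                            output_fun = "softmax") {
  model <- keras_model_sequential() %>%
    layer_embedding(input_dim = input_dim, output_dim = embed_dim,
                    input_length = seq_len)

  # Optionally wrap the LSTM in a bidirectional layer
  if (bidirectional) {
    model <- model %>%
      bidirectional(layer_lstm(units = lstm_dim, dropout = lstm_drop))
  } else {
    model <- model %>%
      layer_lstm(units = lstm_dim, dropout = lstm_drop)
  }

  # Deep dense block: one hidden layer per entry of hidden_dims
  for (h in hidden_dims) {
    model <- model %>% layer_dense(units = h, activation = "relu")
  }

  model %>% layer_dense(units = output_dim, activation = output_fun)
}
```

A model built this way would still need to be compiled (e.g. with compile(loss = "categorical_crossentropy", optimizer = "adam")) before fitting.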