keras_deep_lstm2: keras deep lstm 2

Description Usage Arguments Details Value

Description

Word embedding + (bidirectional) long short-term memory (LSTM) + deep dense layers

Usage

keras_deep_lstm2(input_dim, embed_dim = 128, seq_len = 50,
  lstm_dim = 32, lstm_drop = 0.2, bidirectional = F,
  hidden_dims = c(32, 32, 32), output_dim = 2,
  output_fun = "softmax")

Arguments

input_dim

Number of unique vocabulary tokens

embed_dim

Dimension of the word embedding vectors (default 128)

seq_len

Length of the input sequences

lstm_dim

Number of LSTM neurons (default 32)

lstm_drop

Dropout rate of the LSTM layer (default 0.2)

bidirectional

Whether to use a bidirectional LSTM (default FALSE)

hidden_dims

Number of neurons per layer as vector of integers

output_dim

Number of neurons of the output layer

output_fun

Output activation function

Details

Taken from https://www.kaggle.com/gidutz/text2score-keras-rnn-word-embedding

Value

keras model
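
A minimal usage sketch, assuming the textlearnR and keras packages are installed with a working TensorFlow backend; all hyperparameter values shown are illustrative, not recommendations:

```r
library(textlearnR)

# Build the model: embedding -> (bidirectional) LSTM -> stacked dense layers
model <- keras_deep_lstm2(
  input_dim = 10000,          # vocabulary size of the tokenizer
  embed_dim = 128,            # word embedding dimension
  seq_len = 50,               # length inputs are padded/truncated to
  lstm_dim = 32,              # LSTM units
  lstm_drop = 0.2,            # LSTM dropout rate
  bidirectional = TRUE,       # wrap the LSTM in a bidirectional layer
  hidden_dims = c(64, 32),    # one dense layer per entry
  output_dim = 2,             # e.g. binary classification
  output_fun = "softmax"
)

summary(model)
```

The returned keras model still needs to be compiled (e.g. with `keras::compile()`) and trained with `keras::fit()` on integer-encoded, padded sequences of shape `(n_samples, seq_len)`.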


systats/textlearnR documentation built on May 6, 2019, 8:31 p.m.