keras_deep_lstm: Keras deep LSTM


Description

Word embedding followed by deep (optionally bidirectional) long short-term memory layers.

Usage

keras_deep_lstm(input_dim, embed_dim = 128, seq_len = 50,
  hidden_dims = c(128, 64, 32), bidirectional = F,
  output_fun = "softmax", output_dim = 2)
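For example, a minimal call might look as follows (the vocabulary size and layer sizes are illustrative assumptions, not values taken from the package):

library(keras)
library(textlearnR)

# Two stacked bidirectional LSTM layers over a vocabulary of
# 10,000 tokens and padded sequences of length 50 (illustrative values)
model <- keras_deep_lstm(
  input_dim = 10000,
  embed_dim = 128,
  seq_len = 50,
  hidden_dims = c(128, 64),
  bidirectional = TRUE,
  output_fun = "softmax",
  output_dim = 2
)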

Arguments

input_dim

Number of unique vocabulary tokens

embed_dim

Dimensionality of the word embedding vectors

seq_len

Length of the input sequences

hidden_dims

Number of neurons per LSTM layer, given as a vector of integers, e.g. c(256, 128, 64)

bidirectional

Whether to wrap each LSTM layer in a bidirectional wrapper; default is FALSE

output_fun

Output activation function

output_dim

Number of neurons of the output layer

Details

Stacks LSTM modules of different sizes on top of a word-embedding layer.
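In Keras terms this roughly corresponds to one layer_lstm() (optionally wrapped in layer_bidirectional()) per entry of hidden_dims, with all but the last layer returning full sequences so that the next LSTM receives 3D input. A rough sketch of the idea, not the package's exact implementation:

library(keras)

hidden_dims <- c(128, 64, 32)
model <- keras_model_sequential() %>%
  layer_embedding(input_dim = 10000, output_dim = 128, input_length = 50)
for (i in seq_along(hidden_dims)) {
  # all but the last LSTM layer return sequences for the next layer
  model <- model %>%
    layer_lstm(units = hidden_dims[i],
               return_sequences = i < length(hidden_dims))
}
model <- model %>% layer_dense(units = 2, activation = "softmax")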

Value

A keras model object.
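The returned model still has to be compiled and fitted; a minimal sketch (the loss, optimizer, and the x_train/y_train objects are assumptions):

model %>% compile(
  loss = "categorical_crossentropy",
  optimizer = "adam",
  metrics = "accuracy"
)
# x_train: integer matrix of shape (n, seq_len); y_train: one-hot labels
model %>% fit(x_train, y_train, epochs = 5, batch_size = 32)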

