keras_deep_bi_lstm: Word embedding + deep (bidirectional) LSTM


View source: R/keras_models.R

Description

Word embedding layer followed by a deep (stacked, bidirectional) long short-term memory (LSTM) network.

Usage

keras_deep_bi_lstm(input_dim, embed_dim = 128, seq_len = 50,
  hidden_dims = c(128, 64, 32), output_fun = "softmax",
  output_dim = 1)

Arguments

input_dim

Number of unique vocabulary tokens (vocabulary size)

embed_dim

Dimension of the word embedding vectors

seq_len

Length of the input sequences

hidden_dims

Number of units per LSTM layer, given as a vector of integers, e.g. c(256, 128, 64)

output_fun

Output activation function

output_dim

Number of neurons of the output layer

Details

Stacks bidirectional LSTM layers of different sizes, as specified by hidden_dims, on top of a word embedding layer.
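
A minimal sketch of how such a stacked architecture could be assembled with the keras R API; this is an illustration under assumptions, not the package's exact implementation, and the helper name build_deep_bi_lstm is hypothetical:

library(keras)

# Illustrative sketch: embedding followed by stacked bidirectional LSTMs
build_deep_bi_lstm <- function(input_dim, embed_dim = 128, seq_len = 50,
                               hidden_dims = c(128, 64, 32),
                               output_fun = "softmax", output_dim = 1) {
  model <- keras_model_sequential() %>%
    layer_embedding(input_dim = input_dim, output_dim = embed_dim,
                    input_length = seq_len)
  # every LSTM layer except the last returns full sequences so the
  # next stacked layer can consume them
  for (i in seq_along(hidden_dims)) {
    model <- model %>%
      bidirectional(layer_lstm(units = hidden_dims[i],
                               return_sequences = i < length(hidden_dims)))
  }
  model %>% layer_dense(units = output_dim, activation = output_fun)
}

The final dense layer maps the last LSTM's output to output_dim units with the chosen activation.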

Value

A keras model.
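
As an illustration, a call for a binary classification task might look like this; the vocabulary size, activation, and compile settings are example choices, not package defaults:

library(deeplyr)
library(keras)

model <- keras_deep_bi_lstm(
  input_dim   = 10000,         # vocabulary size from your tokenizer
  embed_dim   = 128,
  seq_len     = 50,
  hidden_dims = c(128, 64, 32),
  output_fun  = "sigmoid",     # e.g. for a binary target
  output_dim  = 1
)

model %>% compile(
  loss      = "binary_crossentropy",
  optimizer = "adam",
  metrics   = "accuracy"
)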

