nn_embedded_lstm: Neural network with a long short-term memory (LSTM) layer and an embedding layer


Description

This function is a wrapper for a long short-term memory (LSTM) neural network written using the keras package.

Usage

nn_embedded_lstm(Text, Codes, Words = 3000, Seed = 17,
  Train_prop = 0.5, Epochs = 10, Batch = 32, MaxSentencelen = 60,
  WordEmbedDim = 50, ValSplit = 0.1, Units_lstm = 64,
  Dropout = 0.2, Recurrent_dropout = 0.2, CM = TRUE, Model = FALSE)
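
A minimal usage sketch (my_corpus is a hypothetical data frame with one
document and one hand-assigned code per row; the remaining values simply
echo the defaults above):

results <- nn_embedded_lstm(
  Text  = my_corpus$text,   # character vector of documents
  Codes = my_corpus$code,   # outcome code for each document
  Words = 3000, Train_prop = 0.5, Epochs = 10, Batch = 32,
  CM = TRUE, Model = FALSE
)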

Arguments

Text

The text that will be used as training and test data.

Codes

The codes that will be used as outcomes to be predicted by the NN model.

Words

The number of top words included in the document-feature matrices used as training and test data.

Seed

The seed used in the model. Defaults to 17.

Train_prop

The proportion of the data used to train the model. The remainder is used as test data.

Epochs

The number of epochs used in the NN model.

Batch

The batch size (number of samples per gradient update) used when training the NN model.

MaxSentencelen

All sentences will be truncated (or padded) to this length before being input into the LSTM model.

WordEmbedDim

The number of word embedding dimensions produced by the embedding layer.

ValSplit

The fraction of the training data held out for validation during LSTM model training.

Units_lstm

The number of network nodes (units) used in the LSTM layer.

Dropout

A floating-point value between 0 and 1. It determines the fraction of units to drop for the linear transformation of the inputs.

Recurrent_dropout

A floating-point value between 0 and 1. It determines the fraction of units to drop for the linear transformation of the recurrent state.

CM

A logical value indicating whether a confusion matrix will be output by the function.

Model

A logical value indicating whether the trained model should be included in the function's output.
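
For orientation, the sketch below shows the kind of keras architecture these
arguments presumably parameterize. It is an assumption about the wrapper's
internals, not a copy of its code: the actual optimizer, loss, and output
layer used by simpleNN may differ, and n_codes is a hypothetical name for the
number of distinct values in Codes.

library(keras)

# Assumed architecture: an embedding layer feeding a single LSTM layer,
# followed by a softmax classifier over the outcome codes.
model <- keras_model_sequential() %>%
  layer_embedding(input_dim = Words, output_dim = WordEmbedDim,
                  input_length = MaxSentencelen) %>%
  layer_lstm(units = Units_lstm, dropout = Dropout,
             recurrent_dropout = Recurrent_dropout) %>%
  layer_dense(units = n_codes, activation = "softmax")

model %>% compile(optimizer = "adam",
                  loss = "categorical_crossentropy",
                  metrics = "accuracy")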

