LayerWrapper: Layer wrappers


Layer wrappers

Description

Apply a layer to every temporal slice of an input (TimeDistributed), or wrap a recurrent layer so that it processes the input in both the forward and backward directions (Bidirectional).

Usage

TimeDistributed(layer)

Bidirectional(layer, merge_mode = "concat")

Arguments

layer

A layer instance. For the bidirectional case it must be a recurrent layer (e.g., LSTM or GRU).

merge_mode

Mode by which the outputs of the forward and backward RNNs are combined. One of 'sum', 'mul', 'concat', 'ave', or NULL (forwarded to Keras as Python None). If NULL, the outputs are not combined; they are returned as a list.
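
For instance (a minimal sketch, assuming an LSTM from kerasR): with the default merge_mode = "concat", wrapping an LSTM with 16 units yields 32 output features, whereas "sum", "mul", and "ave" keep 16:

Bidirectional(LSTM(16))                      # 32 output features (concat)
Bidirectional(LSTM(16), merge_mode = "sum")  # 16 output features
Bidirectional(LSTM(16), merge_mode = NULL)   # list of two 16-feature outputs
                                             # (assumes NULL is passed through as None)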

Author(s)

Taylor B. Arnold, taylor.arnold@acm.org

References

Chollet, François. 2015. Keras: Deep Learning library for Theano and TensorFlow. https://keras.io

See Also

Other layers: Activation, ActivityRegularization, AdvancedActivation, BatchNormalization, Conv, Dense, Dropout, Embedding, Flatten, GaussianNoise, LocallyConnected, Masking, MaxPooling, Permute, RNN, RepeatVector, Reshape, Sequential

Examples

if (keras_available()) {
  # 100 sequences, each of length 100, with integer tokens in 0-19
  X_train <- matrix(sample(0:19, 100 * 100, replace = TRUE), ncol = 100)
  Y_train <- rnorm(100)

  mod <- Sequential()
  # map each token to a 10-dimensional embedding
  mod$add(Embedding(input_dim = 20, output_dim = 10,
                    input_length = 100))
  mod$add(Dropout(0.5))

  # run an LSTM over the sequence in both directions; with the default
  # merge_mode = "concat", the output has 2 * 16 = 32 features
  mod$add(Bidirectional(LSTM(16)))
  mod$add(Dense(1))
  mod$add(Activation("sigmoid"))

  keras_compile(mod, loss = "mse", optimizer = RMSprop())
  keras_fit(mod, X_train, Y_train, epochs = 3, verbose = 0)
}
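
The example above covers Bidirectional only. As a minimal sketch of TimeDistributed (not part of the original example; it reuses the same token setup as above), the wrapped Dense layer is applied independently at each of the 100 time steps, so the model emits one value per step:

if (keras_available()) {
  mod_td <- Sequential()
  mod_td$add(Embedding(input_dim = 20, output_dim = 10,
                       input_length = 100))
  # return_sequences = TRUE keeps the per-step LSTM outputs so there
  # is a temporal slice for TimeDistributed to act on
  mod_td$add(LSTM(16, return_sequences = TRUE))
  # the same Dense(1) weights are shared across all 100 time steps
  mod_td$add(TimeDistributed(Dense(1)))
  keras_compile(mod_td, loss = "mse", optimizer = RMSprop())
}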
