LayerWrapper: Layer wrappers


Description

Apply a layer to every temporal slice of an input (TimeDistributed), or wrap a recurrent layer so that it processes its input in both directions (Bidirectional).

Usage

TimeDistributed(layer)

Bidirectional(layer, merge_mode = "concat")

Arguments

layer

a layer instance (must be a recurrent layer for the bi-directional case)

merge_mode

Mode by which the outputs of the forward and backward RNNs will be combined. One of 'sum', 'mul', 'concat', 'ave', or None. If None, the outputs are not combined; they are returned as a list.
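
For example (a minimal sketch; the layer size is illustrative): wrapping LSTM(16) with the default merge_mode = "concat" stacks the two directions into 32 output features per sample, while the elementwise modes keep 16:

Bidirectional(LSTM(16))                       # concat: 16 + 16 = 32 features
Bidirectional(LSTM(16), merge_mode = "sum")   # elementwise sum: 16 features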

Author(s)

Taylor B. Arnold, taylor.arnold@acm.org

References

Chollet, Francois. 2015. Keras: Deep Learning library for Theano and TensorFlow.

See Also

Other layers: Activation, ActivityRegularization, AdvancedActivation, BatchNormalization, Conv, Dense, Dropout, Embedding, Flatten, GaussianNoise, LocallyConnected, Masking, MaxPooling, Permute, RNN, RepeatVector, Reshape, Sequential

Examples

if(keras_available()) {
  # 100 integer sequences of length 100, with values in 0..19
  X_train <- matrix(sample(0:19, 100 * 100, TRUE), ncol = 100)
  # one numeric target per sequence
  Y_train <- rnorm(100)

  mod <- Sequential()
  mod$add(Embedding(input_dim = 20, output_dim = 10,
                    input_length = 100))
  mod$add(Dropout(0.5))

  # run the LSTM over the sequence in both directions; with the
  # default merge_mode = "concat" this yields 32 features per sample
  mod$add(Bidirectional(LSTM(16)))
  mod$add(Dense(1))
  mod$add(Activation("sigmoid"))

  keras_compile(mod, loss = "mse", optimizer = RMSprop())
  keras_fit(mod, X_train, Y_train, epochs = 3, verbose = 0)
}
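
The example above covers Bidirectional only. Below is a minimal sketch of TimeDistributed, assuming the same toy data sizes: an LSTM with return_sequences = TRUE emits one hidden state per timestep, and the wrapped Dense layer is applied independently to each of them, giving one prediction per timestep.

if(keras_available()) {
  X_train <- matrix(sample(0:19, 100 * 100, TRUE), ncol = 100)
  # one target per timestep of each sequence
  Y_train <- array(rnorm(100 * 100), dim = c(100, 100, 1))

  mod <- Sequential()
  mod$add(Embedding(input_dim = 20, output_dim = 10,
                    input_length = 100))
  # return_sequences = TRUE keeps the output at every timestep
  mod$add(LSTM(16, return_sequences = TRUE))
  # apply the same Dense layer to each timestep independently
  mod$add(TimeDistributed(Dense(1)))

  keras_compile(mod, loss = "mse", optimizer = RMSprop())
  keras_fit(mod, X_train, Y_train, epochs = 3, verbose = 0)
}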
