time_distributed: Apply a layer to every temporal slice of an input.

View source: R/layer-wrappers.R

Description

The input should be at least 3D, and the dimension of index one will be considered to be the temporal dimension.

Usage

time_distributed(
  object,
  layer,
  input_shape = NULL,
  batch_input_shape = NULL,
  batch_size = NULL,
  dtype = NULL,
  name = NULL,
  trainable = NULL,
  weights = NULL
)

Arguments

object

Model or layer object

layer

A layer instance.

input_shape

Dimensionality of the input (an integer vector, not including the samples axis). This argument is required when using this layer as the first layer in a model.

batch_input_shape

Shape of the input, including the batch size. For instance, batch_input_shape = c(10, 32) indicates that the expected input will be batches of 10 32-dimensional vectors, while batch_input_shape = list(NULL, 32) indicates batches of an arbitrary number of 32-dimensional vectors. See the sketch following this list.

batch_size

Fixed batch size for the layer.

dtype

The data type expected by the input, as a string (float32, float64, int32...)

name

An optional name string for the layer. Should be unique in a model (do not reuse the same name twice). It will be autogenerated if it isn't provided.

trainable

Whether the layer weights will be updated during training.

weights

Initial weights for the layer.
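
A minimal sketch of the two ways to fix the input shape, assuming the keras R package is loaded and a backend is available (the timestep, vector, and unit counts are arbitrary, chosen only to echo the c(10, 32) example above):

library(keras)

# input_shape omits the samples axis: sequences of 5 timesteps,
# each a 32-dimensional vector; the batch size stays flexible.
m1 <- keras_model_sequential() %>%
  time_distributed(layer_dense(units = 4), input_shape = c(5, 32))

# batch_input_shape includes the batch size, fixing it to 10 here.
m2 <- keras_model_sequential() %>%
  time_distributed(layer_dense(units = 4), batch_input_shape = c(10, 5, 32))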

Details

Consider a batch of 32 samples, where each sample is a sequence of 10 vectors of 16 dimensions. The batch input shape of the layer is then (32, 10, 16), and the input_shape, not including the samples dimension, is (10, 16). You can then use time_distributed to apply a layer_dense to each of the 10 timesteps independently; the same layer instance (and hence the same weights) is used at every timestep.
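
A minimal sketch of that example, assuming the keras R package is loaded and a TensorFlow backend is available (units = 8 is an arbitrary choice):

library(keras)

model <- keras_model_sequential() %>%
  time_distributed(layer_dense(units = 8), input_shape = c(10, 16))

# The same dense layer is applied to each of the 10 timesteps,
# so the output shape is (batch, 10, 8).
model$output_shape  # list(NULL, 10, 8)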

See Also

Other layer wrappers: bidirectional()

