layer_zero_padding_1d: Zero-padding layer for 1D input (e.g. temporal sequence).

View source: R/layers-convolutional.R

layer_zero_padding_1d    R Documentation

Zero-padding layer for 1D input (e.g. temporal sequence).

Description

Zero-padding layer for 1D input (e.g. temporal sequence).

Usage

layer_zero_padding_1d(
  object,
  padding = 1L,
  batch_size = NULL,
  name = NULL,
  trainable = NULL,
  weights = NULL
)

Arguments

object

What to compose the new Layer instance with. Typically a Sequential model or a Tensor (e.g., as returned by layer_input()). The return value depends on object (a short sketch follows the argument list). If object is:

  • missing or NULL, the Layer instance is returned.

  • a Sequential model, the model with an additional layer is returned.

  • a Tensor, the output tensor from layer_instance(object) is returned.

padding

int, or list of int (length 2)

  • If int: How many zeros to add at the beginning and end of the padding dimension (axis 1).

  • If list of int (length 2): How many zeros to add at the beginning and at the end of the padding dimension, in the form (left_pad, right_pad).

batch_size

Fixed batch size for the layer.

name

An optional name string for the layer. Should be unique in a model (do not reuse the same name twice). It will be autogenerated if it isn't provided.

trainable

Whether the layer weights will be updated during training.

weights

Initial weights for the layer.
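
The return value depends on what layer_zero_padding_1d() is composed with, as described under object above. A minimal sketch (the input shape and padding values below are illustrative, not part of this page):

library(keras)

# Called with no object: the bare Layer instance is returned.
pad_layer <- layer_zero_padding_1d(padding = 1L)

# Composed with a Sequential model: the model, with the padding
# layer appended, is returned.
model <- keras_model_sequential() %>%
  layer_zero_padding_1d(padding = c(2L, 3L))  # (left_pad, right_pad)

# Composed with a tensor (e.g. from layer_input()): the padded
# output tensor is returned.
inputs <- layer_input(shape = c(10, 8))
outputs <- inputs %>% layer_zero_padding_1d(padding = 1L)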

Input shape

3D tensor with shape (batch, axis_to_pad, features)

Output shape

3D tensor with shape (batch, padded_axis, features)
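
A short shape sketch (the 10-step, 8-feature input is an assumed example): with padding = c(1L, 2L), the padded axis grows from 10 to 1 + 10 + 2 = 13.

library(keras)

inputs <- layer_input(shape = c(10, 8))  # (batch, axis_to_pad = 10, features = 8)
outputs <- inputs %>% layer_zero_padding_1d(padding = c(1L, 2L))

model <- keras_model(inputs, outputs)
summary(model)
# Reported output shape: (None, 13, 8); the padded axis is 1 + 10 + 2 = 13.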

See Also

Other convolutional layers: layer_conv_1d(), layer_conv_1d_transpose(), layer_conv_2d(), layer_conv_2d_transpose(), layer_conv_3d(), layer_conv_3d_transpose(), layer_conv_lstm_2d(), layer_cropping_1d(), layer_cropping_2d(), layer_cropping_3d(), layer_depthwise_conv_1d(), layer_depthwise_conv_2d(), layer_separable_conv_1d(), layer_separable_conv_2d(), layer_upsampling_1d(), layer_upsampling_2d(), layer_upsampling_3d(), layer_zero_padding_2d(), layer_zero_padding_3d()

