LocallyConnected | R Documentation |
The LocallyConnected layers work similarly to the Conv layers, except that weights are unshared; that is, a different set of filters is applied at each patch of the input.
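Because weights are unshared, a LocallyConnected layer has far more parameters than the corresponding Conv layer. A minimal base-R sketch of the difference (the sizes below are illustrative assumptions, not values from this page):

```r
# Illustrative sketch (not part of kerasR): parameter counts for a 1D case.
# Assumed setup: input length 10 with 2 channels, kernel_size = 3,
# strides = 1, filters = 4, padding = "valid".
input_len <- 10; in_ch <- 2; kernel <- 3; stride <- 1; filters <- 4

# Output length under "valid" padding
out_len <- floor((input_len - kernel) / stride) + 1

# Conv1D shares one kernel (plus one bias per filter) across all patches
conv_params <- kernel * in_ch * filters + filters

# LocallyConnected1D learns a separate kernel and bias per output position
local_params <- out_len * (kernel * in_ch * filters + filters)

cat("conv:", conv_params, " locally connected:", local_params, "\n")
```

Here the locally connected layer uses `out_len` times as many parameters as the shared-weight convolution, which is why these layers are typically applied to small feature maps.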
LocallyConnected1D(filters, kernel_size, strides = 1, padding = "valid",
  activation = NULL, use_bias = TRUE, kernel_initializer = "glorot_uniform",
  bias_initializer = "zeros", kernel_regularizer = NULL,
  bias_regularizer = NULL, activity_regularizer = NULL,
  kernel_constraint = NULL, bias_constraint = NULL, input_shape = NULL)

LocallyConnected2D(filters, kernel_size, strides = c(1, 1), padding = "valid",
  data_format = NULL, activation = NULL, use_bias = TRUE,
  kernel_initializer = "glorot_uniform", bias_initializer = "zeros",
  kernel_regularizer = NULL, bias_regularizer = NULL,
  activity_regularizer = NULL, kernel_constraint = NULL,
  bias_constraint = NULL, input_shape = NULL)
filters |
Integer, the dimensionality of the output space (i.e. the number of output filters in the convolution). |
kernel_size |
An integer (1D) or pair of integers (2D) specifying the dimensions of the convolution window. |
strides |
An integer (1D) or pair of integers (2D) specifying the stride length of the convolution. |
padding |
One of "valid", "causal" or "same" (case-insensitive); note that Keras currently supports only "valid" padding for locally connected layers. |
activation |
Activation function to use. |
use_bias |
Boolean, whether the layer uses a bias vector. |
kernel_initializer |
Initializer for the kernel weights matrix. |
bias_initializer |
Initializer for the bias vector. |
kernel_regularizer |
Regularizer function applied to the kernel weights matrix. |
bias_regularizer |
Regularizer function applied to the bias vector. |
activity_regularizer |
Regularizer function applied to the output of the layer (its "activation"). |
kernel_constraint |
Constraint function applied to the kernel matrix. |
bias_constraint |
Constraint function applied to the bias vector. |
input_shape |
Only needed when this is the first layer of a model; sets the input shape of the data. |
data_format |
A string, one of channels_last (default) or channels_first. The ordering of the dimensions in the inputs. |
Taylor B. Arnold, taylor.arnold@acm.org
Chollet, Francois. 2015. Keras: Deep Learning library for Theano and TensorFlow.
Other layers: Activation, ActivityRegularization, AdvancedActivation, BatchNormalization, Conv, Dense, Dropout, Embedding, Flatten, GaussianNoise, LayerWrapper, Masking, MaxPooling, Permute, RNN, RepeatVector, Reshape, Sequential
if (keras_available()) {
  X_train <- array(rnorm(100 * 28 * 28), dim = c(100, 28, 28, 1))
  Y_train <- to_categorical(matrix(sample(0:2, 100, TRUE), ncol = 1), 3)

  mod <- Sequential()
  mod$add(Conv2D(filters = 2, kernel_size = c(2, 2),
                 input_shape = c(28, 28, 1)))
  mod$add(Activation("relu"))
  mod$add(MaxPooling2D(pool_size = c(2, 2)))
  mod$add(LocallyConnected2D(filters = 2, kernel_size = c(2, 2)))
  mod$add(Activation("relu"))
  mod$add(MaxPooling2D(pool_size = c(2, 2)))
  mod$add(Dropout(0.25))
  mod$add(Flatten())
  mod$add(Dropout(0.5))
  mod$add(Dense(3, activation = "softmax"))

  keras_compile(mod, loss = "categorical_crossentropy", optimizer = RMSprop())
  keras_fit(mod, X_train, Y_train, verbose = 0)
}