Dense
Description

Dense implements the operation: output = activation(dot(input, kernel) + bias), where activation is the element-wise activation function passed as the activation argument, kernel is a weights matrix created by the layer, and bias is a bias vector created by the layer (only applicable if use_bias is TRUE).

Note: if the input to the layer has a rank greater than 2, it is flattened prior to the initial dot product with kernel.
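To make the computation concrete, here is a minimal sketch in base R of what a Dense layer computes for one batch of inputs. The sizes, the weights W and bias b, and the choice of relu as the activation are illustrative assumptions, not values taken from the package:

    # Minimal base-R sketch of output = activation(dot(input, kernel) + bias)
    # Assumed, illustrative sizes: 4 samples, 3 input features, 2 units.
    X <- matrix(rnorm(4 * 3), nrow = 4)    # input: (batch, features)
    W <- matrix(rnorm(3 * 2), nrow = 3)    # kernel: (features, units)
    b <- c(0.1, -0.2)                      # bias: one entry per unit
    relu <- function(z) pmax(z, 0)         # element-wise activation
    output <- relu(sweep(X %*% W, 2, b, `+`))  # add bias per column, then activate
    dim(output)                            # 4 x 2: (batch, units)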
Usage

Dense(units, activation = "linear", use_bias = TRUE,
  kernel_initializer = "glorot_uniform", bias_initializer = "zeros",
  kernel_regularizer = NULL, bias_regularizer = NULL,
  activity_regularizer = NULL, kernel_constraint = NULL,
  bias_constraint = NULL, input_shape = NULL)
Arguments

units: Positive integer, dimensionality of the output space.
activation: The activation function to use.
use_bias: Boolean, whether the layer uses a bias vector.
kernel_initializer: Initializer for the kernel weights matrix.
bias_initializer: Initializer for the bias vector.
kernel_regularizer: Regularizer function applied to the kernel weights matrix.
bias_regularizer: Regularizer function applied to the bias vector.
activity_regularizer: Regularizer function applied to the output of the layer (its "activation").
kernel_constraint: Constraint function applied to the kernel weights matrix.
bias_constraint: Constraint function applied to the bias vector.
input_shape: Only needed when this is the first layer of a model; sets the input shape of the data.
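As a brief illustration of these arguments (a hedged sketch; the particular values are arbitrary, and "he_normal" is simply one of the standard Keras initializer strings), a layer with several non-default settings could be constructed as:

    # Illustrative construction; values are arbitrary choices, not defaults.
    layer <- Dense(units = 64, activation = "relu", use_bias = TRUE,
                   kernel_initializer = "he_normal", bias_initializer = "zeros")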
Author(s)

Taylor B. Arnold, taylor.arnold@acm.org
References

Chollet, Francois. 2015. Keras: Deep Learning library for Theano and TensorFlow.
See Also

Other layers: Activation, ActivityRegularization, AdvancedActivation, BatchNormalization, Conv, Dropout, Embedding, Flatten, GaussianNoise, LayerWrapper, LocallyConnected, Masking, MaxPooling, Permute, RNN, RepeatVector, Reshape, Sequential
Examples

if (keras_available()) {
  X_train <- matrix(rnorm(100 * 10), nrow = 100)
  Y_train <- to_categorical(matrix(sample(0:2, 100, TRUE), ncol = 1), 3)

  # Build the model one layer at a time with $add()
  mod <- Sequential()
  mod$add(Dense(units = 50, input_shape = dim(X_train)[2]))
  mod$add(Dropout(rate = 0.5))
  mod$add(Activation("relu"))
  mod$add(Dense(units = 3))
  mod$add(ActivityRegularization(l1 = 1))
  mod$add(Activation("softmax"))
  keras_compile(mod, loss = 'categorical_crossentropy', optimizer = RMSprop())
  keras_fit(mod, X_train, Y_train, batch_size = 32, epochs = 5,
            verbose = 0, validation_split = 0.2)

  # You can also add layers directly as arguments to Sequential()
  mod <- Sequential(
    Dense(units = 50, input_shape = ncol(X_train)),
    Dropout(rate = 0.5),
    Activation("relu"),
    Dense(units = 3),
    ActivityRegularization(l1 = 1),
    Activation("softmax")
  )
  keras_compile(mod, loss = 'categorical_crossentropy', optimizer = RMSprop())
  keras_fit(mod, X_train, Y_train, batch_size = 32, epochs = 5,
            verbose = 0, validation_split = 0.2)
}