Description
Dense implements the operation:
output = activation(dot(input, kernel) + bias)
where activation is the element-wise activation function
passed as the activation argument, kernel is a weights matrix
created by the layer, and bias is a bias vector created by the layer
(only applicable if use_bias is TRUE).
Note: if the input to the layer has a rank greater than 2, then
it is flattened prior to the initial dot product with kernel.
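For intuition, the forward pass of a single dense layer can be written out directly in base R. This is a minimal sketch, not package code; the shapes, bias values, and the choice of relu are illustrative:

input  <- matrix(rnorm(4 * 3), nrow = 4)          # shape (batch = 4, features = 3)
kernel <- matrix(rnorm(3 * 2), nrow = 3)          # shape (features = 3, units = 2)
bias   <- c(0.1, -0.2)                            # one bias per output unit
z      <- sweep(input %*% kernel, 2, bias, "+")   # dot(input, kernel) + bias
output <- pmax(z, 0)                              # element-wise relu activation
dim(output)                                       # 4 2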
Usage

Dense(units, activation = "linear", use_bias = TRUE,
  kernel_initializer = "glorot_uniform", bias_initializer = "zeros",
  kernel_regularizer = NULL, bias_regularizer = NULL,
  activity_regularizer = NULL, kernel_constraint = NULL,
  bias_constraint = NULL, input_shape = NULL)
Arguments

units                  Positive integer, dimensionality of the output space.
activation             The activation function to use.
use_bias               Boolean, whether the layer uses a bias vector.
kernel_initializer     Initializer for the kernel weights matrix.
bias_initializer       Initializer for the bias vector.
kernel_regularizer     Regularizer function applied to the kernel weights matrix.
bias_regularizer       Regularizer function applied to the bias vector.
activity_regularizer   Regularizer function applied to the output of the layer (its "activation").
kernel_constraint      Constraint function applied to the kernel weights matrix.
bias_constraint        Constraint function applied to the bias vector.
input_shape            Only needed when this is the first layer of a model; sets the input shape of the data.
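For example, a layer that overrides several of these arguments could be constructed as follows. The specific values are illustrative only; initializer names are passed as strings following the underlying Keras conventions:

library(kerasR)

# Illustrative: a 64-unit layer with explicit initializers as the
# first layer of a model taking 10-dimensional input.
layer <- Dense(units = 64, activation = "relu",
               kernel_initializer = "glorot_uniform",
               bias_initializer = "zeros",
               input_shape = 10)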
Author(s)

Taylor B. Arnold, taylor.arnold@acm.org
References

Chollet, Francois. 2015. Keras: Deep Learning library for Theano and TensorFlow.
See Also

Other layers: Activation, ActivityRegularization, AdvancedActivation,
BatchNormalization, Conv, Dropout, Embedding, Flatten, GaussianNoise,
LayerWrapper, LocallyConnected, Masking, MaxPooling, Permute, RNN,
RepeatVector, Reshape, Sequential
Examples

if (keras_available()) {
  # Simulated data: 100 samples with 10 features and 3 classes
  X_train <- matrix(rnorm(100 * 10), nrow = 100)
  Y_train <- to_categorical(matrix(sample(0:2, 100, TRUE), ncol = 1), 3)

  # Build the model layer by layer
  mod <- Sequential()
  mod$add(Dense(units = 50, input_shape = dim(X_train)[2]))
  mod$add(Dropout(rate = 0.5))
  mod$add(Activation("relu"))
  mod$add(Dense(units = 3))
  mod$add(ActivityRegularization(l1 = 1))
  mod$add(Activation("softmax"))
  keras_compile(mod, loss = 'categorical_crossentropy', optimizer = RMSprop())

  keras_fit(mod, X_train, Y_train, batch_size = 32, epochs = 5,
            verbose = 0, validation_split = 0.2)

  # You can also add layers directly as arguments to Sequential()
  mod <- Sequential(
    Dense(units = 50, input_shape = ncol(X_train)),
    Dropout(rate = 0.5),
    Activation("relu"),
    Dense(units = 3),
    ActivityRegularization(l1 = 1),
    Activation("softmax")
  )
  keras_compile(mod, loss = 'categorical_crossentropy', optimizer = RMSprop())
  keras_fit(mod, X_train, Y_train, batch_size = 32, epochs = 5,
            verbose = 0, validation_split = 0.2)
}
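Once fitted, predictions can be obtained with keras_predict(). A short sketch, reusing the training data purely for illustration:

if (keras_available()) {
  Y_hat <- keras_predict(mod, X_train)   # predicted class probabilities
  dim(Y_hat)                             # 100 x 3 (softmax output)
}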