AdvancedActivation    R Documentation
Advanced activation layers

Description

Advanced activation layers for use in Keras models: LeakyReLU, PReLU, ELU, and ThresholdedReLU.

Usage
LeakyReLU(alpha = 0.3, input_shape = NULL)

PReLU(input_shape = NULL)

ELU(alpha = 1, input_shape = NULL)

ThresholdedReLU(theta = 1, input_shape = NULL)
Arguments

alpha
    float >= 0. Negative slope coefficient in LeakyReLU; scale for the negative factor in ELU.

input_shape
    only needed when this is the first layer of a model; sets the input shape of the data.

theta
    float >= 0. Threshold location of the activation in ThresholdedReLU.
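PReLU behaves like LeakyReLU except that the negative-slope coefficient is learned during training rather than fixed. As a point of reference, the element-wise functions the fixed-parameter layers compute can be written in plain R; the sketch below is illustrative only, and the function names are hypothetical, not part of the kerasR API:

    # Illustrative definitions only; not part of kerasR
    leaky_relu_fn       <- function(x, alpha = 0.3) ifelse(x >= 0, x, alpha * x)
    elu_fn              <- function(x, alpha = 1)   ifelse(x >= 0, x, alpha * (exp(x) - 1))
    thresholded_relu_fn <- function(x, theta = 1)   ifelse(x > theta, x, 0)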
Author(s)

Taylor B. Arnold, taylor.arnold@acm.org
References

Chollet, Francois. 2015. Keras: Deep Learning library for Theano and TensorFlow.
See Also

Other layers: Activation, ActivityRegularization, BatchNormalization, Conv, Dense, Dropout, Embedding, Flatten, GaussianNoise, LayerWrapper, LocallyConnected, Masking, MaxPooling, Permute, RNN, RepeatVector, Reshape, Sequential
Examples

library(kerasR)

if (keras_available()) {
  # Simulated data: 100 observations, 10 features, 3 classes
  X_train <- matrix(rnorm(100 * 10), nrow = 100)
  Y_train <- to_categorical(matrix(sample(0:2, 100, TRUE), ncol = 1), 3)

  # Stack dense layers, each followed by a different advanced activation
  mod <- Sequential()
  mod$add(Dense(units = 50, input_shape = dim(X_train)[2]))
  mod$add(LeakyReLU(alpha = 0.4))
  mod$add(Dense(units = 50))
  mod$add(ELU(alpha = 0.5))
  mod$add(Dense(units = 50))
  mod$add(ThresholdedReLU(theta = 1.1))
  mod$add(Dense(units = 3))
  mod$add(Activation("softmax"))

  # Compile and fit
  keras_compile(mod, loss = 'categorical_crossentropy', optimizer = RMSprop())
  keras_fit(mod, X_train, Y_train, batch_size = 32, epochs = 5, verbose = 0)
}
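After fitting, class predictions can be obtained with the package's prediction helper (assuming keras_predict_classes from kerasR is available):

    # Predicted class labels (0, 1, or 2) for the training inputs
    keras_predict_classes(mod, X_train)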