AdvancedActivation: Advanced activation layers

Description

Advanced activation layers for Keras models: LeakyReLU, PReLU, ELU, and ThresholdedReLU. Each applies a more flexible variant of the rectified linear unit to the output of the preceding layer.

Usage

LeakyReLU(alpha = 0.3, input_shape = NULL)

PReLU(input_shape = NULL)

ELU(alpha = 1, input_shape = NULL)

ThresholdedReLU(theta = 1, input_shape = NULL)

Arguments

alpha

float >= 0. Negative slope coefficient in LeakyReLU and the scale for the negative factor in ELU (see the sketch after this argument list).

input_shape

only needed when this is the first layer of the model; sets the expected shape of the input data

theta

float >= 0. Threshold location of activation in ThresholdedReLU.
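
For reference, the elementwise transformations these layers apply can be written in plain R. This is a sketch of the underlying formulas only, not the layer implementations; the function names are illustrative. PReLU computes the same expression as LeakyReLU, except that alpha is learned during training rather than fixed.

leaky_relu <- function(x, alpha = 0.3) ifelse(x > 0, x, alpha * x)
elu <- function(x, alpha = 1) ifelse(x > 0, x, alpha * (exp(x) - 1))
thresholded_relu <- function(x, theta = 1) ifelse(x > theta, x, 0)

leaky_relu(c(-2, 0.5))           # -0.60  0.50
elu(c(-2, 0.5))                  # -0.86  0.50  (alpha * (exp(-2) - 1))
thresholded_relu(c(0.5, 1.5))    #  0.0   1.5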

Author(s)

Taylor B. Arnold, taylor.arnold@acm.org

References

Chollet, Francois. 2015. Keras: Deep Learning library for Theano and TensorFlow.

See Also

Other layers: Activation, ActivityRegularization, BatchNormalization, Conv, Dense, Dropout, Embedding, Flatten, GaussianNoise, LayerWrapper, LocallyConnected, Masking, MaxPooling, Permute, RNN, RepeatVector, Reshape, Sequential

Examples

if(keras_available()) {
  # simulate training data: 100 observations, 10 features, 3 classes
  X_train <- matrix(rnorm(100 * 10), nrow = 100)
  Y_train <- to_categorical(matrix(sample(0:2, 100, TRUE), ncol = 1), 3)
  
  # use a different advanced activation after each hidden Dense layer
  mod <- Sequential()
  mod$add(Dense(units = 50, input_shape = dim(X_train)[2]))
  mod$add(LeakyReLU(alpha = 0.4))
  mod$add(Dense(units = 50))
  mod$add(ELU(alpha = 0.5))
  mod$add(Dense(units = 50))
  mod$add(ThresholdedReLU(theta = 1.1))
  mod$add(Dense(units = 3))
  mod$add(Activation("softmax"))
  keras_compile(mod, loss = 'categorical_crossentropy', optimizer = RMSprop())
  
  keras_fit(mod, X_train, Y_train, batch_size = 32, epochs = 5, verbose = 0)
}
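
PReLU is not exercised above because its negative-slope coefficients are learned rather than set by hand. A minimal sketch of substituting it into a model, reusing X_train and Y_train from the example (mod2 is an illustrative name):

if(keras_available()) {
  mod2 <- Sequential()
  mod2$add(Dense(units = 50, input_shape = dim(X_train)[2]))
  mod2$add(PReLU())   # alpha coefficients are fit during training
  mod2$add(Dense(units = 3))
  mod2$add(Activation("softmax"))
  keras_compile(mod2, loss = 'categorical_crossentropy', optimizer = RMSprop())
  keras_fit(mod2, X_train, Y_train, batch_size = 32, epochs = 5, verbose = 0)
}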