Flatten: Flattens the input. Does not affect the batch size.


View source: R/layers.core.R

Description

Flattens the input. Does not affect the batch size.

Usage

Flatten(input_shape = NULL)

Arguments

input_shape

only needed when this is the first layer of a model; sets the input shape of the data
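
For instance, because Flatten has no weights of its own, input_shape only matters when Flatten opens the model. A minimal sketch (the layer sizes are hypothetical, and keras_available() is used exactly as in the Examples below):

if (keras_available()) {
  # Flatten as the first layer: input_shape is required here and
  # turns each 28 x 28 x 1 input into a vector of length 784
  mod <- Sequential()
  mod$add(Flatten(input_shape = c(28, 28, 1)))
  mod$add(Dense(units = 3, activation = "softmax"))
}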

Author(s)

Taylor B. Arnold, taylor.arnold@acm.org

References

Chollet, Francois. 2015. Keras: Deep Learning library for Theano and TensorFlow.

See Also

Other layers: Activation, ActivityRegularization, AdvancedActivation, BatchNormalization, Conv, Dense, Dropout, Embedding, GaussianNoise, LayerWrapper, LocallyConnected, Masking, MaxPooling, Permute, RNN, RepeatVector, Reshape, Sequential

Examples

if (keras_available()) {
  # small fully connected model on random data
  X_train <- matrix(rnorm(100 * 10), nrow = 100)
  Y_train <- to_categorical(matrix(sample(0:2, 100, TRUE), ncol = 1), 3)

  mod <- Sequential()
  mod$add(Dense(units = 50, input_shape = dim(X_train)[2]))
  mod$add(Dropout(rate = 0.5))
  mod$add(Activation("relu"))
  mod$add(Dense(units = 3))
  mod$add(ActivityRegularization(l1 = 1))
  mod$add(Activation("softmax"))
  keras_compile(mod, loss = 'categorical_crossentropy', optimizer = RMSprop())

  keras_fit(mod, X_train, Y_train, batch_size = 32, epochs = 5,
            verbose = 0, validation_split = 0.2)
  
  # You can also add layers directly as arguments to Sequential()

  mod <- Sequential(
    Dense(units = 50, input_shape = ncol(X_train)),
    Dropout(rate = 0.5),
    Activation("relu"),
    Dense(units = 3),
    ActivityRegularization(l1 = 1),
    Activation("softmax")
  )
  keras_compile(mod, loss = 'categorical_crossentropy', optimizer = RMSprop())
  
  keras_fit(mod, X_train, Y_train, batch_size = 32, epochs = 5,
            verbose = 0, validation_split = 0.2)
  
}

if (keras_available()) {
  X_train <- array(rnorm(100 * 28 * 28), dim = c(100, 28, 28, 1))
  Y_train <- to_categorical(matrix(sample(0:2, 100, TRUE), ncol = 1), 3)
  
  # convolutional feature extractor for 28 x 28 x 1 inputs
  mod <- Sequential()
  mod$add(Conv2D(filters = 2, kernel_size = c(2, 2),
                 input_shape = c(28, 28, 1)))
  mod$add(Activation("relu"))
  mod$add(MaxPooling2D(pool_size = c(2, 2)))
  mod$add(LocallyConnected2D(filters = 2, kernel_size = c(2, 2)))
  mod$add(Activation("relu"))
  mod$add(MaxPooling2D(pool_size = c(2, 2)))
  mod$add(Dropout(0.25))
  
  # Flatten collapses the pooled feature maps into a vector for the dense classifier
  mod$add(Flatten())
  mod$add(Dropout(0.5))
  mod$add(Dense(3, activation = "softmax"))
  
  keras_compile(mod, loss = 'categorical_crossentropy', optimizer = RMSprop())
  keras_fit(mod, X_train, Y_train, verbose = 0)
}
