Description
These functions each return a named list with elements X_train, X_test, Y_train, and Y_test. The first time a function is called it downloads the dataset locally; thereafter the data are loaded from the keras cache directory.
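A minimal sketch of the loading and caching behaviour, using the MNIST loader (requires the underlying keras Python module to be installed; the shapes noted in the comments are the standard MNIST splits):

```r
library(kerasR)

if (keras_available()) {
  # the first call downloads the files; subsequent calls hit the keras cache
  mnist <- load_mnist()

  names(mnist)           # the four list elements described above
  dim(mnist$X_train)     # 60000 x 28 x 28 training images
  length(mnist$Y_train)  # one integer label per training image
}
```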
Usage

load_cifar10()
load_cifar100(label_mode = "fine")
load_imdb(num_words = NULL, skip_top = 0, maxlen = NULL, seed = 113,
start_char = 1, oov_char = 2, index_from = 3)
load_reuters(num_words = NULL, skip_top = 0, maxlen = 1000,
test_split = 0.2, seed = 113, start_char = 1, oov_char = 2,
index_from = 3)
load_mnist()
load_boston_housing()
Arguments

label_mode
    either "fine" or "coarse"; how to construct the labels for load_cifar100.

num_words
    integer or NULL. Number of most frequent words to keep; any less frequent word will be replaced by the oov_char value in the sequence data.

skip_top
    integer. Number of most frequent words to ignore (they will be replaced by the oov_char value in the sequence data).

maxlen
    integer. Maximum sequence length; any longer sequence will be truncated.

seed
    integer. Seed for reproducible data shuffling.

start_char
    integer. The start of each sequence is marked with this character. Defaults to 1 because 0 is usually the padding character.

oov_char
    integer. Words that were cut out because of the num_words or skip_top limits will be replaced with this character.

index_from
    integer. Actual words are indexed from this value upward; lower indices are reserved for the special characters above.

test_split
    float. Fraction of the dataset to hold out for testing.
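To make the text arguments concrete: with the defaults start_char = 1, oov_char = 2, and index_from = 3, every sequence begins with token 1, any word removed by the num_words or skip_top limits shows up as token 2, and actual words are numbered from 4 upward. A minimal sketch (assuming the keras backend is available and that the sequences come back as a list of integer vectors):

```r
library(kerasR)

if (keras_available()) {
  # keep only the 500 most frequent words and drop the 10 most common;
  # everything outside that band is mapped to oov_char
  imdb <- load_imdb(num_words = 500, skip_top = 10)

  review <- imdb$X_train[[1]]
  review[1]          # the start_char marker at the front of the sequence
  mean(review == 2)  # share of tokens replaced by oov_char
}
```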
Author(s)

Taylor B. Arnold, taylor.arnold@acm.org
References

Chollet, Francois. 2015. Keras: Deep Learning library for Theano and TensorFlow.
Examples

if (keras_available()) {
  boston <- load_boston_housing()

  # normalize the predictors; the responses stay on their original scale
  X_train <- normalize(boston$X_train, 0)
  Y_train <- boston$Y_train
  X_test <- normalize(boston$X_test, 0)
  Y_test <- boston$Y_test

  # a small dense network with two hidden layers
  mod <- Sequential()
  mod$add(Dense(units = 200, input_shape = 13))
  mod$add(Activation("relu"))
  mod$add(Dense(units = 200))
  mod$add(Activation("relu"))
  mod$add(Dense(units = 1))

  keras_compile(mod, loss = 'mse', optimizer = SGD())
  keras_fit(mod, X_train, Y_train,
            batch_size = 32, epochs = 20,
            verbose = 1, validation_split = 0.1)
}
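Predictions from the fitted model can then be checked against the held-out data; a sketch using kerasR's keras_predict, with X_test and Y_test as prepared in the example above:

```r
if (keras_available()) {
  # predict house prices for the test set and compute the mean squared error
  Y_hat <- keras_predict(mod, X_test)
  mean((Y_test - Y_hat)^2)
}
```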