
Description

Training of general classification and regression neural networks using gradient descent. Special features include a function for training autoencoders. Multiple activation and cost functions (including Huber and pseudo-Huber) are supported, as well as L1 and L2 regularization, momentum, early stopping and the option to specify a learning rate schedule. The package contains a vectorized gradient descent implementation which facilitates faster training through batch learning.
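The Huber and pseudo-Huber cost functions mentioned above can be written down in a few lines of base R. This is an illustrative sketch of the loss functions themselves, not the package's internal implementation:

```
# Huber loss: quadratic for |error| <= delta, linear beyond that
huber <- function(a, delta = 1) {
  ifelse(abs(a) <= delta, 0.5 * a^2, delta * (abs(a) - 0.5 * delta))
}

# Pseudo-Huber loss: a smooth (everywhere differentiable) approximation
pseudo_huber <- function(a, delta = 1) {
  delta^2 * (sqrt(1 + (a / delta)^2) - 1)
}

# The two losses behave similarly: quadratic near zero, linear in the tails
a <- seq(-3, 3, by = 0.1)
summary(huber(a) - pseudo_huber(a))
```

Both losses grow only linearly for large errors, which makes training less sensitive to outliers than the usual squared-error cost.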

Details

Package for training neural networks, with special options for detecting and plotting anomalies using autoencoding neural networks.
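The anomaly-detection approach works by training an autoencoder on normal data and flagging observations with a large reconstruction error. The sketch below illustrates the principle in base R, using a linear autoencoder (a one-component PCA) as a stand-in for a trained network; all names here are plain base R, not this package's API:

```
set.seed(1)
# 'Normal' observations lie near a one-dimensional subspace of 2-D space
x <- rnorm(200)
normal <- cbind(x = x, y = 2 * x + rnorm(200, sd = 0.1))

# Fit a linear 'autoencoder': encode onto the first principal component,
# then decode back into the original space
pca <- prcomp(normal, rank. = 1)
reconstruct_linear <- function(X, pca) {
  scores <- scale(X, center = pca$center, scale = FALSE) %*% pca$rotation
  sweep(scores %*% t(pca$rotation), 2, pca$center, `+`)
}

# Reconstruction error: small for normal points, large for anomalies
rec_err <- function(X) rowSums((X - reconstruct_linear(X, pca))^2)
threshold <- quantile(rec_err(normal), 0.99)

anomaly <- rbind(c(5, -5))   # a point far from the learned subspace
rec_err(anomaly) > threshold # TRUE: flagged as anomalous
```

An autoencoding neural network generalizes this idea to nonlinear encodings, but the detection rule is the same: reconstruct each observation and flag those whose error exceeds a threshold estimated from normal data.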

Author(s)

Bart Lammers

Maintainer: Bart Lammers <[email protected]>

References

LeCun, Y.A., Bottou, L., Orr, G.B. and Müller, K.-R. (1998). Efficient BackProp.

Examples

```
# Example on iris dataset:
# Plot full data
plot(iris, pch = as.numeric(iris$Species))
# Prepare test and train sets
random_draw <- sample(1:nrow(iris), size = 100)
X_train <- iris[random_draw, 1:4]
Y_train <- iris[random_draw, 5]
X_test <- iris[setdiff(1:nrow(iris), random_draw), 1:4]
Y_test <- iris[setdiff(1:nrow(iris), random_draw), 5]
# Train neural network on classification task
NN <- neuralnetwork(X = X_train, Y = Y_train, hidden.layers = c(5, 5),
optim.type = 'adam', learn.rates = 0.01, val.prop = 0)
# Plot the loss during training
plot(NN)
# Make predictions
Y_pred <- predict(NN, newdata = X_test)
# Plot predictions
plot(X_test, pch = as.numeric(Y_test), col = (Y_test == Y_pred$predictions) + 2)
```
