knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>",
  fig.path = "man/figures/README-",
  out.width = "100%"
)

easy.mlp

The goal of easy.mlp is to quickly and easily build a neural network to fit tabular data.

Installation

You can install the development version of easy.mlp from GitHub with:

library(devtools)

install_github("Greg-Hallenbeck/easy.mlp")

Example

The following code initializes a neural network based on Fisher's iris data.

library(easy.mlp)

data(iris)
set.seed(8675309)
net <- create.mlp(Species ~ ., data=iris, hidden=c(5,5,5), type="classification")
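
A quick look at the returned object can confirm it was built as expected. This is a generic sketch using base R's str(); it assumes nothing about easy.mlp beyond create.mlp() returning an ordinary R object:

# Peek at the top-level components of the untrained network object
# (what appears here depends on how create.mlp() stores the model)
str(net, max.level = 1)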

The model must then be trained. Training can be repeated multiple times, each call training for another n.epochs epochs.

n.epochs <- 1200

net <- train(net, n.epochs)
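
For example, if the loss is still decreasing, the same call can be issued again to continue training from the current weights (a sketch reusing the train() call above, which assumes train() returns the updated network as shown there):

# Train the same network for another n.epochs epochs
net <- train(net, n.epochs)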

Around 1200 epochs, the training and validation losses begin to diverge:

par(mfrow=c(1,2))
options(repr.plot.width=10, repr.plot.height=5.5)

plot(net, ylim=c(0.03, 2))
plot(net, metric="accuracy", ylim=c(0,1))

The predict() function makes predictions for the training or validation data as well as for new data. For classification tasks, it can return either numeric probabilities for each class or predicted labels, as requested.

new.obs <- data.frame(Sepal.Length=4.5, Sepal.Width=4.0, Petal.Length=2.0, Petal.Width=1.5)

predict(net, newdata=new.obs, type="labels")
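
To get class probabilities instead of labels, request a different type. The option name below is an assumption (only type="labels" appears above); check the help for predict() in easy.mlp for the value the package actually uses:

# Hypothetical call: "probabilities" is an assumed option name, not confirmed
# by the example above, which only demonstrates type="labels"
predict(net, newdata=new.obs, type="probabilities")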

