NeuralNetwork: Create a NeuralNetwork for further visualization

View source: R/neural_net_class.R

Description

NeuralNetwork returns the trained neural network.

Usage

NeuralNetwork(f, data, layers, scale = FALSE, options = list(store =
  FALSE, nrepetitions = 1000, probs = c(0.05, 0.95), parallel = TRUE), ...)

Arguments

f

A formula describing the model to be fitted. Categorical, binary and numerical data are supported. Specify each predictor column separately, or use all remaining columns with y ~ . .

data

The data that should be used for training the neural network.

layers

Vector specifying the number of neurons in each hidden layer, e.g. c(5, 3) creates two hidden layers with 5 and 3 neurons.

scale

Boolean indicating whether the data should be scaled before training.

options

List specifying whether bootstrap sampling should be run directly during model creation (store), along with the number of repetitions (nrepetitions), the quantiles to compute (probs) and whether to sample in parallel (parallel). The stored data can then be used for creating the partial dependence plots.

...

Further parameters passed on to neuralnet; see the neuralnet documentation.

Details

This is an S3 class. It defines a neural network and provides the plot_partial_dependencies method for plotting marginal effects. Additionally, the plot, predict and summary methods are available.
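As a sketch of how these methods might be called on a fitted model (the formula and data mirror the numeric example below; the predict newdata argument and the no-argument call to plot_partial_dependencies are assumptions, not confirmed signatures):

```r
library(MASS)  # for the Boston housing data

model <- NeuralNetwork(medv ~ ., data = Boston, layers = c(5, 3),
                       scale = TRUE, linear.output = TRUE)

summary(model)                    # overview of the fitted network
plot(model)                       # network plot
predict(model, newdata = Boston)  # predictions (newdata argument assumed)
plot_partial_dependencies(model)  # marginal effect plots (defaults assumed)
```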

Value

NeuralNetwork class containing the neuralnet, type of dependent variable, name of dependent variable, layers, min and max of each numeric column, additional parameters provided and stored data if specified.

Examples

## Not run: 
# Example: Numeric
library(MASS)
neural_network <- NeuralNetwork(f = medv ~ ., data = Boston,
                                layers = c(5, 3), scale = TRUE,
                                linear.output = TRUE)

# Example: Categoric
library(datasets)
model <- NeuralNetwork(
   Species ~ Sepal.Length + Sepal.Width + Petal.Length + Petal.Width,
   data = iris, layers = c(10, 10), rep = 5, err.fct = "ce",
   linear.output = FALSE, lifesign = "minimal", stepmax = 1000000,
   threshold = 0.001, scale = TRUE)

## End(Not run)

AlexAfanasev/NeuralNetworkVisualization documentation built on Sept. 23, 2019, 2:29 a.m.