nnet    R Documentation

Description

Fit a feed forward neural network using MCMC.

Usage
BayesNnet(formula,
          hidden.layers,
          niter,
          data,
          subset,
          prior = NULL,
          expected.model.size = Inf,
          drop.unused.levels = TRUE,
          contrasts = NULL,
          ping = niter / 10,
          seed = NULL)

HiddenLayer(number.of.nodes, prior = NULL, expected.model.size = Inf)
Arguments

formula
    A formula describing the model to be fit. The formula should be
    additive. The network will figure out any interactions or
    nonlinearities.
hidden.layers
    A list of objects created by HiddenLayer defining the structure of
    the network. The first hidden layer takes the predictors from
    'formula' as inputs; each subsequent layer takes the outputs of the
    preceding layer as inputs. See the sketch following this argument
    list for a multi-layer specification.
niter
    The number of MCMC iterations to run. Be sure to include enough so
    you can throw away a burn-in set.
data
    An optional data frame, list or environment (or object coercible by
    'as.data.frame' to a data frame) containing the variables in the
    model. If not found in 'data', the variables are taken from
    'environment(formula)', typically the environment from which
    'BayesNnet' is called.
subset
    An optional vector specifying a subset of observations to be used
    in the fitting process.
prior
    When passed to 'BayesNnet', the prior distribution for the
    regression coefficients in the terminal layer. When passed to
    'HiddenLayer', the prior distribution for the coefficients of the
    logistic regressions comprising that hidden layer. In either case,
    if the prior is NULL a default prior is constructed.
expected.model.size
    When a default prior is constructed (i.e. when 'prior' is NULL),
    this parameter is used to set the prior inclusion probabilities for
    the coefficients. If 'expected.model.size' is infinite (the
    default) then all coefficients are included with prior probability
    1; a finite value encourages a sparser model. See the sketch
    following this argument list.
drop.unused.levels
    Logical indicating whether unobserved factor levels should be
    dropped when forming the model matrix.
contrasts
    An optional list. See the 'contrasts.arg' argument of
    'model.matrix.default'.
ping
    The frequency with which to print status update messages to the
    screen. For example, if 'ping == 10' then a status update will be
    printed every 10 MCMC iterations.
seed
    An integer to use as the random seed for the underlying C++ code.
    If NULL the seed will be set using the clock.
number.of.nodes
    The number of nodes in this hidden layer. This must be a positive
    scalar integer.
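The 'hidden.layers' argument is simply a list of HiddenLayer objects, so
deeper networks are specified by listing several layers. A minimal sketch
(the layer sizes and the finite 'expected.model.size' value are arbitrary
choices for illustration, not package defaults):

## Two hidden layers: 8 nodes feeding into 4 nodes.  The second layer uses
## a finite expected.model.size to encourage sparsity among its logistic
## regression coefficients; the first keeps every input with prior
## inclusion probability 1.
hidden.layers <- list(
  HiddenLayer(8, expected.model.size = Inf),
  HiddenLayer(4, expected.model.size = 2))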
Details

The model is a feedforward neural network regression. The model is fit
using an MCMC algorithm based on data augmentation. Each hidden node is
randomly assigned a 0/1 value from its full conditional distribution.
Then, conditional on the imputed data, an MCMC draw is made for each
latent logistic regression and for the regression model defining the
terminal node.
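Because the fit is a collection of posterior draws rather than a point
estimate, downstream summaries should discard an initial burn-in (see the
'niter' argument). A minimal sketch, assuming a fitted object named
'model' as in the Examples below; 'residual.sd' is described under Value
and the burn-in size is an arbitrary choice that should scale with niter:

burn <- 10   # Arbitrary; use a larger burn-in with a longer run.
## Trace plot of the residual standard deviation after discarding burn-in.
## A flat, rapidly mixing trace suggests the chain has settled.
plot(model$residual.sd[-(1:burn)], type = "l",
     xlab = "MCMC iteration", ylab = "residual SD")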
Value

The returned object is a list with class 'BayesNnet'. It contains the
following objects.
residual.sd
    The standard deviation of the residuals from the model.

hidden.layer.coefficients
    A list, with one element per hidden layer, giving the posterior
    draws of the hidden layer coefficients for that layer. Each list
    element is a 3-way array with dimensions corresponding to
    1. MCMC iteration.
    2. Input node. For the first hidden layer each 'input node' is a
       predictor variable.
    3. Output node.
    You can think of hidden.layer.coefficients[[i]][, , j] as the
    posterior distribution of the logistic regression model defining
    node 'j' in hidden layer 'i'. See the indexing sketch after this
    list.

terminal.layer.coefficients
    A matrix containing the MCMC draws of the model coefficients for
    the terminal layer.

Other list elements are needed to implement various methods (predict,
plot, etc.).
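As a concrete illustration of working with these elements, the following
minimal sketch assumes a fitted object named 'model' with at least one
hidden layer, as in the Examples:

## Posterior draws for the logistic regression defining node 1 of hidden
## layer 1: a matrix with one row per MCMC iteration and one column per
## input (for the first layer, one column per predictor variable).
node1.draws <- model$hidden.layer.coefficients[[1]][, , 1]
dim(node1.draws)

## Posterior means of the terminal (output) layer coefficients.
colMeans(model$terminal.layer.coefficients)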
Author(s)

Steven L. Scott
See Also

plot.BayesNnet, predict.BayesNnet.

Examples
if (require(mlbench)) {
  data(BostonHousing)
  hidden.layers <- list(
    HiddenLayer(10, expected.model.size = Inf))

  ## In real life you'd want more than 50 MCMC draws.
  model <- BayesNnet(medv ~ .,
                     hidden.layers = hidden.layers,
                     niter = 50,
                     data = BostonHousing)

  par(mfrow = c(1, 2))
  plot(model)              # Plots predicted vs actual.
  plot(model, "residual")  # Plots residuals vs predicted values.

  par(mfrow = c(1, 1))
  plot(model, "structure") # Shows the structure of the network.

  ## Examine all partial dependence plots.
  plot(model, "partial", pch = ".")

  ## Examine a single partial dependence plot.
  par(mfrow = c(1, 1))
  plot(model, "lstat", pch = ".")

  ## Check out the mixing performance.
  PlotManyTs(model$terminal.layer.coefficients)
  PlotMacf(model$terminal.layer.coefficients)

  ## Get the posterior distribution of the function values for the
  ## training data.
  pred <- predict(model)

  ## Get predictions for data at new points (though in this example I'm
  ## reusing old points).
  pred2 <- predict(model, newdata = BostonHousing[1:12, ])
} else {
  cat("The Boston housing data from 'mlbench' is needed for this example.")
}