avNNet: Neural Networks Using Model Averaging

Description

Aggregate several neural network models

Usage

## Default S3 method:
avNNet(x, y, repeats = 5, bag = FALSE, allowParallel = TRUE, ...)
## S3 method for class 'formula'
avNNet(formula, data, weights, ..., 
        repeats = 5, bag = FALSE, allowParallel = TRUE,
        subset, na.action, contrasts = NULL)

## S3 method for class 'avNNet'
predict(object, newdata, type = c("raw", "class", "prob"), ...)

Arguments

formula

A formula of the form class ~ x1 + x2 + ...

x

matrix or data frame of x values for examples.

y

matrix or data frame of target values for examples.

weights

(case) weights for each example – if missing defaults to 1.

repeats

the number of neural networks fit, each with a different random number seed

bag

a logical: should bagging (bootstrap aggregation) be used for each repeat?

allowParallel

if a parallel backend is loaded and available, should the function use it?

data

Data frame from which variables specified in formula are preferentially to be taken.

subset

An index vector specifying the cases to be used in the training sample. (NOTE: If given, this argument must be named.)

na.action

A function to specify the action to be taken if NAs are found. The default action is for the procedure to fail. An alternative is na.omit, which leads to rejection of cases with missing values on any required variable. (NOTE: If given, this argument must be named.)

contrasts

a list of contrasts to be used for some or all of the factors appearing as variables in the model formula.

object

an object of class avNNet as returned by avNNet.

newdata

matrix or data frame of test examples. A vector is considered to be a row vector comprising a single case.

type

Type of output, either: raw for the raw outputs, class for the predicted class or prob for the class probabilities.

...

arguments passed to nnet

Details

Following Ripley (1996), the same neural network model is fit using different random number seeds. All of the resulting models are used for prediction. For regression, the outputs of the individual networks are averaged. For classification, the model scores are first averaged, then translated to predicted classes. Bagging can also be used to create the models.
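
The averaging step can be illustrated with a short sketch outside of avNNet itself (this is not the package's internal code; the object names fits and preds, the choice of mtcars and the settings are illustrative assumptions only):

## Fit the same nnet model several times with different seeds,
## then average the raw outputs for a regression target.
library(nnet)

fits <- lapply(1:5, function(i) {
  set.seed(i)   # a different random seed per repeat
  nnet(mpg ~ ., data = mtcars, size = 3, linout = TRUE, trace = FALSE)
})

preds <- sapply(fits, predict, newdata = mtcars)   # one column per network
rowMeans(preds)                                    # averaged prediction per case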

If a parallel backend is registered, the foreach package is used to train the networks in parallel.
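
For example, a doParallel cluster can be registered before the call (a usage sketch; any foreach-compatible backend works, and the two-worker cluster size is an arbitrary choice):

## Register a parallel backend so the repeats are trained in parallel.
## Assumes library(caret) and data(BloodBrain), as in the Examples below.
library(doParallel)
cl <- makeCluster(2)
registerDoParallel(cl)

fit <- avNNet(bbbDescr, logBBB, size = 5, linout = TRUE, trace = FALSE)

stopCluster(cl)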

Value

For avNNet, an object of class "avNNet" or "avNNet.formula". Items of interest in the output are:

model

a list of the models generated from nnet

repeats

an echo of the model input

names

if any predictors had only one distinct value, a character vector of the remaining column names; otherwise NULL
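
For instance, with the modelFit object created in the Examples section below, these components can be inspected directly (a brief illustration, assuming that example has been run):

modelFit$repeats          # echo of repeats = 5
length(modelFit$model)    # one fitted nnet object per repeat
modelFit$model[[1]]       # the first underlying nnet fit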

Author(s)

These are heavily based on the nnet code from Brian Ripley.

References

Ripley, B. D. (1996) Pattern Recognition and Neural Networks. Cambridge University Press.

See Also

nnet, preProcess

Examples

data(BloodBrain)
## Not run: 
modelFit <- avNNet(bbbDescr, logBBB, size = 5, linout = TRUE, trace = FALSE)
modelFit

predict(modelFit, bbbDescr)
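
## A classification sketch (an added illustration, not part of the original
## example): the formula method on iris plus predict() with type = "class"
## or type = "prob".
clsFit <- avNNet(Species ~ ., data = iris, repeats = 5, size = 2, trace = FALSE)
head(predict(clsFit, iris, type = "class"))
head(predict(clsFit, iris, type = "prob"))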

## End(Not run)

Example output

Loading required package: lattice
Loading required package: ggplot2
Warning message:
executing %dopar% sequentially: no parallel backend registered 
Model Averaged Neural Network with 5 Repeats  

a 134-5-1 network with 681 weights
options were - linear output units 

           1            2            3            4            5            6 
 0.118045585 -0.146477030  0.081804352  0.183756025  0.183756025  0.183756025 
 [... averaged predictions for the remaining 202 compounds omitted ...]
