
View source: R/dbn_dnn_train.R

Train a deep neural network with weights initialized by a Deep Belief Network (DBN).

```
dbn.dnn.train(x, y, hidden = c(1), activationfun = "sigm",
learningrate = 0.8, momentum = 0.5, learningrate_scale = 1,
output = "sigm", numepochs = 3, batchsize = 100, hidden_dropout = 0,
visible_dropout = 0, cd = 1)
```

| Argument | Description |
|---|---|
| `x` | matrix of x values for examples |
| `y` | vector or matrix of target values for examples |
| `hidden` | vector of the numbers of units in the hidden layers. Default is c(1). |
| `activationfun` | activation function of the hidden units. Can be "sigm", "linear" or "tanh". Default is "sigm", the logistic function. |
| `learningrate` | learning rate for gradient descent. Default is 0.8. |
| `momentum` | momentum for gradient descent. Default is 0.5. |
| `learningrate_scale` | the learning rate is multiplied by this scale after every iteration. Default is 1. |
| `output` | function of the output units; can be "sigm", "linear" or "softmax". Default is "sigm". |
| `numepochs` | number of full passes over the training samples. Default is 3. |
| `batchsize` | size of each mini-batch. Default is 100. |
| `hidden_dropout` | dropout fraction for the hidden layers. Default is 0. |
| `visible_dropout` | dropout fraction for the input layer. Default is 0. |
| `cd` | number of iterations of Gibbs sampling in the contrastive divergence (CD) algorithm. Default is 1. |
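To make the effect of `learningrate_scale` concrete, here is a short base-R sketch (illustration only, not deepnet code) of how the effective learning rate decays when the scale is applied repeatedly, as the argument description states. The value 0.9 below is a hypothetical choice; the default of 1 means no decay.

```r
## Illustration only: decay of the effective learning rate
## when learningrate_scale < 1 is applied after each iteration.
learningrate <- 0.8
learningrate_scale <- 0.9            # hypothetical value; default is 1 (no decay)
rates <- learningrate * learningrate_scale^(0:4)
round(rates, 4)                      # 0.8000 0.7200 0.6480 0.5832 0.5249
```

With the default scale of 1 the learning rate stays constant at 0.8 throughout training.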

Author(s): Xiao Rong

```
Var1 <- c(rnorm(50, 1, 0.5), rnorm(50, -0.6, 0.2))
Var2 <- c(rnorm(50, -0.8, 0.2), rnorm(50, 2, 1))
x <- matrix(c(Var1, Var2), nrow = 100, ncol = 2)
y <- c(rep(1, 50), rep(0, 50))
dnn <- dbn.dnn.train(x, y, hidden = c(5, 5))
## predict by dnn
test_Var1 <- c(rnorm(50, 1, 0.5), rnorm(50, -0.6, 0.2))
test_Var2 <- c(rnorm(50, -0.8, 0.2), rnorm(50, 2, 1))
test_x <- matrix(c(test_Var1, test_Var2), nrow = 100, ncol = 2)
nn.test(dnn, test_x, y)
```
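`nn.test` above returns a classification error rate. As a rough sketch of that computation (thresholding the network outputs at 0.5 and counting mismatches), using made-up predicted probabilities rather than real network output:

```r
## Sketch of an error-rate computation like the one nn.test reports
## (illustration only; pred is a hypothetical vector of network outputs).
pred <- c(0.9, 0.8, 0.3, 0.6)        # hypothetical predicted probabilities
truth <- c(1, 1, 0, 0)               # true class labels
labels <- ifelse(pred > 0.5, 1, 0)   # threshold at 0.5
error_rate <- mean(labels != truth)
error_rate                           # 0.25: one of four examples misclassified
```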

DimitriF/DLC documentation built on May 27, 2018, 9:04 a.m.
