DBN: Deep Belief Network


Description

Trains a deep belief network. First, a stack of RBMs is greedily pretrained (unsupervised) with the StackRBM function; DBN then adds a supervised output layer and finetunes the stacked RBM on the supervised criterion using backpropagation.

Usage

DBN(x, y, n.iter = 300, nodes = c(30, 40, 30), learning.rate = 0.5,
  size.minibatch = 10, n.iter.pre = 30, learning.rate.pre = 0.1,
  verbose = FALSE)

Arguments

x

A matrix with binary features of shape samples * features.

y

A matrix with labels for the data (always required when training the DBN).

n.iter

The number of epochs to run backpropagation.

nodes

A vector with the number of hidden nodes at each layer.

learning.rate

Learning rate for supervised finetuning of the stacked RBM.

size.minibatch

The size of the minibatches used for training.

n.iter.pre

Passed on to the StackRBM function; defines how many epochs are used to pretrain each RBM layer.

learning.rate.pre

The pretraining learning rate, passed on to the StackRBM function.

verbose

Whether to print the training error at each epoch; printing will slow down the fitting.

Value

Returns the finetuned DBN model that can be used in the PredictDBN function.

Examples

# Load the MNIST data
data(MNIST)

# Train the DBN model
modDBN <- DBN(MNIST$trainX, MNIST$trainY, n.iter = 500, nodes = c(500, 300, 150),
  learning.rate = 0.5, size.minibatch = 10, n.iter.pre = 300,
  learning.rate.pre = 0.1, verbose = FALSE)

# Turn verbose on to monitor the learning progress
modDBN <- DBN(MNIST$trainX, MNIST$trainY, n.iter = 500, nodes = c(500, 300, 150),
  learning.rate = 0.5, size.minibatch = 10, n.iter.pre = 300,
  learning.rate.pre = 0.1, verbose = TRUE)
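The Value section notes that the fitted model is meant for the PredictDBN function. A plausible follow-up is sketched below; the argument names and order (test data, labels, model, number of layers) are assumptions, not confirmed by this page, so consult ?PredictDBN for the authoritative signature.

```r
# Hedged sketch: score the finetuned DBN on the MNIST test set.
# Argument names/order are assumed -- check ?PredictDBN before use.
predDBN <- PredictDBN(MNIST$testX, MNIST$testY, modDBN, layers = 4)
```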

TimoMatzen/RBM documentation built on June 1, 2019, 8:35 a.m.