sae.dnn.train: Training a Deep neural network with weights initialized by Stacked AutoEncoder

View source: R/sae_dnn_train.R

sae.dnn.train {deepnet}    R Documentation

Training a Deep neural network with weights initialized by Stacked AutoEncoder

Description

Train a deep neural network whose weights are initialized by greedy layer-wise pre-training of a stacked autoencoder and then fine-tuned on the supervised targets (x, y).

Usage

sae.dnn.train(x, y, hidden = c(1), activationfun = "sigm", learningrate = 0.8, 
    momentum = 0.5, learningrate_scale = 1, output = "sigm", sae_output = "linear", 
    numepochs = 3, batchsize = 100, hidden_dropout = 0, visible_dropout = 0)

Arguments

x

matrix of input (x) values, one row per training example

y

vector or matrix of target values for the training examples

hidden

vector giving the number of units in each hidden layer, e.g. hidden = c(5, 5) for two hidden layers of five units each. Default is c(1).

activationfun

activation function of the hidden units. Can be "sigm", "linear" or "tanh". Default is "sigm", the logistic function.

learningrate

learning rate for gradient descent. Default is 0.8.

momentum

momentum for gradient descent. Default is 0.5.

learningrate_scale

the learning rate will be multiplied by this scale after every iteration, so values below 1 decay the rate as training proceeds. Default is 1 (no decay); see the illustrative snippet after this argument list.

numepochs

number of iterations (epochs) over the training samples. Default is 3.

batchsize

size of mini-batch. Default is 100.

output

function of the output unit. Can be "sigm", "linear" or "softmax". Default is "sigm".

sae_output

function of the autoencoder output unit used during pre-training. Can be "sigm", "linear" or "softmax". Default is "linear".

hidden_dropout

dropout fraction for the hidden layers. Default is 0.

visible_dropout

dropout fraction for the input (visible) layer. Default is 0.
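
For intuition, here is a minimal sketch of the geometric learning-rate schedule implied by learningrate_scale. It assumes, per the description above, that the scale is applied once per iteration; the function does this internally, so the snippet is illustrative only:

learningrate <- 0.8
learningrate_scale <- 0.99                  # values below 1 decay the rate
round(learningrate * learningrate_scale^(0:4), 4)
## 0.8000 0.7920 0.7841 0.7762 0.7685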

Author(s)

Xiao Rong

Examples

library(deepnet)
## two Gaussian clusters as a toy binary classification task
Var1 <- c(rnorm(50, 1, 0.5), rnorm(50, -0.6, 0.2))
Var2 <- c(rnorm(50, -0.8, 0.2), rnorm(50, 2, 1))
x <- matrix(c(Var1, Var2), nrow = 100, ncol = 2)
y <- c(rep(1, 50), rep(0, 50))
dnn <- sae.dnn.train(x, y, hidden = c(5, 5))
## evaluate the trained network on freshly generated data
test_Var1 <- c(rnorm(50, 1, 0.5), rnorm(50, -0.6, 0.2))
test_Var2 <- c(rnorm(50, -0.8, 0.2), rnorm(50, 2, 1))
test_x <- matrix(c(test_Var1, test_Var2), nrow = 100, ncol = 2)
nn.test(dnn, test_x, y)  # returns the misclassification rate on the test set
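
To obtain class labels rather than an error rate, nn.predict (also in deepnet) returns the network's raw outputs, which can be thresholded by hand. The 0.5 cutoff below is an illustrative choice for this sigmoid output, not a requirement:

out <- nn.predict(dnn, test_x)      # raw sigmoid outputs for the test set
pred <- as.integer(out > 0.5)       # illustrative 0.5 cutoff for hard labels
mean(pred == y)                     # fraction of test examples classified correctly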
