SVRG: Stochastic Variance Reduced Gradient (SVRG) Method Learning Function


Description

A function to build prediction model using SVRG method.

Usage

SVRG(dataTrain, alpha = 0.1, maxIter = 10, innerIter = 10, option = 2,
  seed = NULL)

Arguments

dataTrain

a data.frame representing the training data (m × n), where m is the number of instances and n is the number of variables; the last column is the output variable. dataTrain must have at least two columns and ten rows of data that contain only numbers (integer or float).

alpha

a float value representing the learning rate. The default value is 0.1.

maxIter

the maximum number of iterations in the outer loop.

innerIter

the maximum number of iterations in the inner loop.

option

an option for setting theta at the end of each outer loop. Option 1 sets theta to the last theta computed in the inner loop. Option 2 sets theta to a theta chosen at random from the inner-loop iterations (1 to the last inner iteration).

seed

an integer value used as the random seed. The default value is NULL, which means the function does not fix the random seed.

Details

This function is based on SGD, with an optimization that accelerates convergence by reducing the variance of the stochastic gradients used in SGD.
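
For intuition, the sketch below shows the SVRG scheme for linear regression with a squared-error loss: each outer iteration takes a snapshot of theta and its full gradient, and each inner step corrects a single-instance stochastic gradient with that snapshot gradient, which reduces its variance. This is an illustrative assumption, not the package's implementation; the function name svrg_sketch and its internals are made up for this sketch, and only the argument names mirror the ones above.

## minimal SVRG sketch for linear regression with squared-error loss (illustrative only)
svrg_sketch <- function(X, y, alpha = 0.1, maxIter = 10, innerIter = 10, option = 2) {
  X <- cbind(1, as.matrix(X))              # add an intercept column
  n <- nrow(X)
  grad <- function(theta, idx) {           # squared-error gradient on rows idx
    Xi <- X[idx, , drop = FALSE]
    crossprod(Xi, Xi %*% theta - y[idx]) / length(idx)
  }
  theta <- matrix(0, ncol(X), 1)
  for (s in seq_len(maxIter)) {            # outer loop: take a snapshot of theta
    mu <- grad(theta, seq_len(n))          # full gradient at the snapshot
    w <- theta
    inner <- vector("list", innerIter)
    for (t in seq_len(innerIter)) {        # inner loop: variance-reduced SGD steps
      i <- sample.int(n, 1)                # pick one instance at random
      v <- grad(w, i) - grad(theta, i) + mu
      w <- w - alpha * v
      inner[[t]] <- w
    }
    ## option 1: keep the last inner iterate; option 2: pick a random inner iterate
    theta <- if (option == 1) w else inner[[sample.int(innerIter, 1)]]
  }
  theta
}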

Value

a matrix of theta (the coefficients) for the linear model.

References

Rie Johnson, Tong Zhang. Accelerating Stochastic Gradient Descent using Predictive Variance Reduction. Advances in Neural Information Processing Systems, pp. 315-323 (2013).

See Also

SSGD, SARAH, SARAHPlus

Examples

##################################
## Learning and Build Model with SVRG
## load R Package data
data(gradDescentRData)
## get z-factor data
dataSet <- gradDescentRData$CompressilbilityFactor
## split dataset
splitedDataSet <- splitData(dataSet)
## build model with SVRG
SVRGmodel <- SVRG(splitedDataSet$dataTrain)
## show result
print(SVRGmodel)
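
## A possible follow-up, sketched under two assumptions: that splitData() also
## returns a dataTest component whose last column is the output variable, and
## that the returned theta stores the intercept first, followed by coefficients
## in the same order as the input columns.
dataTest <- splitedDataSet$dataTest
testInput <- as.matrix(dataTest[, -ncol(dataTest)])
predicted <- cbind(1, testInput) %*% matrix(as.numeric(SVRGmodel), ncol = 1)
## compare predictions against the true output values
head(data.frame(predicted = as.numeric(predicted), actual = dataTest[, ncol(dataTest)]))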
