SARAHPlus: Stochastic Recursive Gradient Algorithm+ (SARAH+) Method...


Description

A function to build a prediction model using the SARAH+ method.

Usage

SARAHPlus(dataTrain, alpha = 0.1, maxIter = 10, innerIter = 10,
  gammaS = 0.125, seed = NULL)

Arguments

dataTrain

a data.frame representing the training data (m × n), where m is the number of instances and n is the number of variables; the last column is the output variable. dataTrain must have at least two columns and ten rows of data containing only numbers (integer or float).

alpha

a float value representing the learning rate. The default value is 0.1.

maxIter

the maximal number of iterations in the outer loop.

innerIter

the maximal number of iterations in the inner loop.

gammaS

a float value that sets the sufficient-reduction threshold for early termination of the inner loop. The default value is 0.125.

seed

an integer value used to seed the random number generator, making results reproducible. The default value is NULL, which means the function will not set a seed.
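For illustration, a call that sets every argument explicitly might look like the sketch below; the values chosen here are arbitrary, and dataTrain stands for any numeric data.frame meeting the requirements above:

## hypothetical call with all arguments set explicitly; seed fixes
## the random sampling so repeated runs give the same model
model <- SARAHPlus(dataTrain, alpha = 0.05, maxIter = 20,
                   innerIter = 15, gammaS = 0.125, seed = 42)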

Details

This function is a practical variant of SARAH. SARAH+ provides the possibility of earlier termination and removes the need for a careful choice of the maximum inner-loop size; it also covers classical gradient descent when gammaS = 1 is set (since the inner while loop does not proceed). A sketch of the update is given below.
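The following is a minimal, self-contained sketch of the SARAH+ update for linear regression with squared-error loss, written to illustrate the early-termination test. sarahPlusSketch is a hypothetical illustration, not the package's internal code; X is assumed to be a numeric matrix of inputs and y a numeric output vector.

## hypothetical sketch of SARAH+ for squared-error linear regression;
## not the package internals
sarahPlusSketch <- function(X, y, alpha = 0.1, maxIter = 10,
                            innerIter = 10, gammaS = 0.125, seed = NULL) {
  if (!is.null(seed)) set.seed(seed)
  X <- cbind(1, as.matrix(X))            # prepend intercept column
  n <- nrow(X)
  w <- matrix(0, nrow = ncol(X), ncol = 1)
  for (s in seq_len(maxIter)) {          # outer loop
    v <- t(X) %*% (X %*% w - y) / n      # full gradient v_0
    v0 <- sum(v^2)
    wPrev <- w
    w <- w - alpha * v
    k <- 1
    ## SARAH+ early-termination test: stop the inner loop once the
    ## recursive gradient estimate is sufficiently reduced; with
    ## gammaS = 1 the condition fails at once, recovering classical
    ## gradient descent
    while (sum(v^2) > gammaS * v0 && k < innerIter) {
      i <- sample(n, 1)                  # sample one instance
      gNew <- as.numeric(X[i, ] %*% w - y[i]) * X[i, ]
      gOld <- as.numeric(X[i, ] %*% wPrev - y[i]) * X[i, ]
      v <- gNew - gOld + v               # recursive gradient update
      wPrev <- w
      w <- w - alpha * v
      k <- k + 1
    }
  }
  t(w)                                   # coefficients as a row matrix
}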

Value

a one-row matrix of theta (coefficients) for the linear model.
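For instance, assuming the returned theta is a row matrix whose first entry is the intercept and whose remaining entries pair with the input columns, predictions can be formed as in this hypothetical helper:

## hypothetical helper, assuming theta is a row matrix with the
## intercept in its first column
predictFromTheta <- function(theta, newInput) {
  cbind(1, as.matrix(newInput)) %*% t(theta)
}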

References

Lam M. Nguyen, Jie Lu, Katya Scheinberg, Martin Takac. SARAH: A Novel Method for Machine Learning Problems Using Stochastic Recursive Gradient. arXiv preprint arXiv:1703.00102 (2017).

See Also

SVRG, SSGD, SARAH

Examples

##################################
## Learning and Build Model with SARAH+
## load R Package data
data(gradDescentRData)
## get z-factor data
dataSet <- gradDescentRData$CompressilbilityFactor
## split dataset
splitedDataSet <- splitData(dataSet)
## build model with SARAH+
SARAHPlusmodel <- SARAHPlus(splitedDataSet$dataTrain)
## show result
print(SARAHPlusmodel)
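As a follow-up, the fitted model can be evaluated on the held-out split. The sketch below assumes the prediction and RMSE helpers exported elsewhere in gradDescent:

## predict on the test split (input columns only)
dataTestInput <- splitedDataSet$dataTest[, -ncol(splitedDataSet$dataTest),
                                         drop = FALSE]
predicted <- prediction(SARAHPlusmodel, dataTestInput)
## compare the predicted output column with the true one
errors <- RMSE(splitedDataSet$dataTest[, ncol(splitedDataSet$dataTest)],
               predicted[, ncol(predicted)])
print(errors)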
