An implementation of various Gradient Descent-based learning algorithms for regression tasks. The implemented variants are:

- Mini-Batch Gradient Descent (MBGD), which uses only part of the training data per update to reduce the computation load.
- Stochastic Gradient Descent (SGD), which uses a single randomly chosen example per update to reduce the computation load drastically.
- Stochastic Average Gradient (SAG), an SGD-based algorithm that averages stored per-example gradients to reduce the variance of the stochastic step.
- Momentum Gradient Descent (MGD), which adds a momentum term to speed up gradient descent learning.
- Accelerated Gradient Descent (AGD), which applies acceleration to gradient descent learning.
- Adagrad, a gradient-descent-based algorithm that accumulates previous gradient information to adapt the learning rate.
- Adadelta, a gradient-descent-based algorithm that uses a Hessian approximation to adapt the learning rate.
- RMSprop, a gradient-descent-based algorithm that combines the adaptive-learning abilities of Adagrad and Adadelta.
- Adam, a gradient-descent-based algorithm that uses mean and variance moment estimates of the gradient to adapt the learning rate.
- Stochastic Variance Reduced Gradient (SVRG), an SGD-based algorithm that accelerates convergence by reducing the variance of the gradient estimate.
- Semi-Stochastic Gradient Descent (SSGD), an SGD-based algorithm that combines GD and SGD, choosing one of the gradients at a time, to accelerate convergence.
- Stochastic Recursive Gradient Algorithm (SARAH), an algorithm similar to SVRG that accelerates convergence using accumulated stochastic gradient information.
- SARAH+ (SARAHPlus), a practical variant of SARAH that accelerates convergence and provides the possibility of earlier termination.
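As a rough illustration of two of the update rules listed above (this is a generic sketch for linear regression, not the package's actual API, whose function names are not shown on this page), the SGD and Momentum updates could be written in R as:

```r
# Sketch only: one epoch of SGD for linear regression y ~ X %*% theta.
# alpha is the learning rate; rows are visited in random order.
sgd_epoch <- function(X, y, theta, alpha = 0.01) {
  for (i in sample(nrow(X))) {
    xi <- X[i, , drop = FALSE]
    grad <- t(xi) %*% (xi %*% theta - y[i])  # gradient on a single example
    theta <- theta - alpha * grad
  }
  theta
}

# Sketch only: one Momentum step using the full-batch gradient.
# v is the velocity carried between steps; mu is the momentum coefficient.
momentum_step <- function(X, y, theta, v, alpha = 0.01, mu = 0.9) {
  grad <- t(X) %*% (X %*% theta - y) / nrow(X)  # mean squared-error gradient
  v <- mu * v - alpha * grad                    # accumulate velocity
  list(theta = theta + v, v = v)
}
```

The other variants differ mainly in how the gradient estimate is formed (mini-batch, averaged, or variance-reduced) and in how the step size is adapted per coordinate.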
Package details 


Author: Galih Praja Wijaya, Dendi Handian, Imam Fachmi Nasrulloh, Lala Septem Riza, Rani Megasari, Enjun Junaeti
Date of publication: 2018-01-25 13:33:54 UTC
Maintainer: Lala Septem Riza <[email protected]>
License: GPL (>= 2) | file LICENSE
Version: 3.0
URL: https://github.com/drizzersilverberg/gradDescentR
Installation 
Install the latest version of this package by entering the following in R:
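The installation command itself is missing from this page. Assuming the package is published on CRAN (the CRAN package name is not confirmed here), the standard command would be:

```r
# Assumes the CRAN package name is "gradDescent" -- not confirmed on this page
install.packages("gradDescent")
```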
