An implementation of various learning algorithms based on gradient descent for regression tasks. The implemented variants of the gradient descent algorithm are:

- MiniBatch Gradient Descent (MBGD): an optimization that uses only a portion of the training data per update to reduce the computational load.
- Stochastic Gradient Descent (SGD): an optimization that uses a single randomly chosen example per update to reduce the computational load drastically.
- Stochastic Average Gradient (SAG): an SGD-based algorithm that averages past stochastic gradient steps to smooth the updates.
- Momentum Gradient Descent (MGD): an optimization that adds a momentum term to speed up gradient descent learning.
- Accelerated Gradient Descent (AGD): an optimization to accelerate gradient descent learning.
- Adagrad: a gradient-descent-based algorithm that accumulates previous gradients to adapt the learning rate per parameter.
- Adadelta: a gradient-descent-based algorithm that uses a Hessian approximation to perform adaptive learning.
- RMSprop: a gradient-descent-based algorithm that combines the adaptive-learning abilities of Adagrad and Adadelta.
- Adam: a gradient-descent-based algorithm that uses estimates of the mean and variance (first and second moments) of the gradient to perform adaptive learning.
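
To make the difference between the variants concrete, here is a minimal sketch (in plain R, not the package's own API) of two of the update rules above, plain batch gradient descent and momentum, applied to simple linear regression; all variable names are illustrative:

```r
# Conceptual sketch only -- not the gradDescentR API.
# Fit y = w*x + b by minimizing mean squared error.
set.seed(1)
x <- runif(100)
y <- 3 * x + 2 + rnorm(100, sd = 0.1)

# Plain (batch) gradient descent
w <- 0; b <- 0; lr <- 0.1
for (i in 1:2000) {
  err <- (w * x + b) - y
  w <- w - lr * mean(err * x)   # gradient of MSE w.r.t. w
  b <- b - lr * mean(err)       # gradient of MSE w.r.t. b
}

# Momentum: accumulate a velocity term to speed up learning
w2 <- 0; b2 <- 0; vw <- 0; vb <- 0; gamma <- 0.9
for (i in 1:2000) {
  err <- (w2 * x + b2) - y
  vw <- gamma * vw + lr * mean(err * x)
  vb <- gamma * vb + lr * mean(err)
  w2 <- w2 - vw
  b2 <- b2 - vb
}
```

The stochastic variants (SGD, MBGD, SAG) differ only in computing the gradient over one example or a subset of examples instead of the full data, and the adaptive methods (Adagrad, Adadelta, RMSprop, Adam) additionally rescale the step size per parameter.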
Package details

Author: Dendi Handian, Imam Fachmi Nasrulloh, Lala Septem Riza
Maintainer: Lala Septem Riza <[email protected]>
License: GPL (>= 2) | file LICENSE
Version: 2.0.1
URL: https://github.com/drizzersilverberg/gradDescentR
Repository: CRAN
Installation 
Install the latest version of this package by entering the following in R:
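
The snippet itself is missing from this page; assuming the package is published on CRAN under the name `gradDescent` (the GitHub repository above is named gradDescentR), the standard commands would be:

```r
# Package name "gradDescent" is assumed from CRAN; the GitHub repo
# is gradDescentR. Install from CRAN, then load the package:
install.packages("gradDescent")
library(gradDescent)
```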
