gradDescentR: Gradient Descent for Regression Tasks

An implementation of various learning algorithms based on gradient descent for regression tasks. The variants of the gradient descent algorithm are:

- Mini-Batch Gradient Descent (MBGD), which uses only part of the training data in each update to reduce the computation load.
- Stochastic Gradient Descent (SGD), which uses a single randomly chosen training example per update to reduce the computation load drastically.
- Stochastic Average Gradient (SAG), an SGD-based algorithm that averages the stochastic steps to reduce their variance.
- Momentum Gradient Descent (MGD), an optimization that speeds up gradient descent learning by accumulating a velocity term from past updates.
- Accelerated Gradient Descent (AGD), an optimization that accelerates gradient descent learning.
- Adagrad, a gradient-descent-based algorithm that accumulates past squared gradients to adapt the learning rate.
- Adadelta, a gradient-descent-based algorithm that uses a Hessian approximation (a running average of squared updates) to adapt the learning rate.
- RMSprop, a gradient-descent-based algorithm that combines the adaptive-learning ideas of Adagrad and Adadelta.
- Adam, a gradient-descent-based algorithm that uses first- and second-moment (mean and variance) estimates of the gradients to adapt the learning rate.
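To make the difference between the plain and the stochastic update concrete, here is a minimal, self-contained sketch in base R. It is not the gradDescentR implementation and does not use the package's API; names such as theta, alpha, and maxIter are illustrative choices for a simple linear-regression problem.

# Illustrative batch vs. stochastic gradient descent for linear regression
# (a conceptual sketch, not code from the gradDescentR package).
set.seed(42)

# Synthetic data: y = 2 + 3*x + noise
n <- 200
x <- runif(n)
y <- 2 + 3 * x + rnorm(n, sd = 0.1)
X <- cbind(1, x)                       # design matrix with intercept column

alpha   <- 0.1                         # learning rate
maxIter <- 5000

# Batch gradient descent: every update uses the full data set.
theta_gd <- c(0, 0)
for (i in seq_len(maxIter)) {
  grad     <- as.vector(t(X) %*% (X %*% theta_gd - y)) / n  # MSE gradient
  theta_gd <- theta_gd - alpha * grad
}

# Stochastic gradient descent: every update uses one random example,
# which is much cheaper per iteration but noisier.
theta_sgd <- c(0, 0)
for (i in seq_len(maxIter)) {
  j         <- sample(n, 1)
  xi        <- X[j, ]                  # one random training example
  err       <- sum(xi * theta_sgd) - y[j]
  theta_sgd <- theta_sgd - alpha * err * xi
}

# Both should end up near the true coefficients (2, 3); SGD fluctuates more.
round(cbind(batch = theta_gd, stochastic = theta_sgd), 3)

The other variants listed above differ only in how the step is formed from these gradients: momentum and AGD accumulate past update directions, while Adagrad, Adadelta, RMSprop, and Adam rescale each step by running statistics of past gradients.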


Package details

Author: Dendi Handian, Imam Fachmi Nasrulloh, Lala Septem Riza
Maintainer: Lala Septem Riza <lala.s.riza@upi.edu>
License: GPL (>= 2) | file LICENSE
Version: 2.0.1
URL: https://github.com/drizzersilverberg/gradDescentR
Package repository: CRAN
Installation: Install the latest version of this package by entering the following in R:
install.packages("gradDescentR")
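Once installed, the package can be loaded and its exported objects inspected to find the learner you need. The sketch below uses only base R; the exact names of the exported learning functions (for example, whether each variant such as SGD or ADAM is a separate function) are not assumed here and should be checked against the package's own index and help pages.

# Load the package and list what it exports, then consult the manual
# for the learning function that matches the variant you want to use.
library(gradDescentR)
ls("package:gradDescentR")
help(package = "gradDescentR")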

