optimg: General-Purpose Gradient-Based Optimization

Provides general-purpose tools to help users implement steepest gradient descent methods for function optimization; for details, see Ruder (2016) <arXiv:1609.04747v2>. Two methods are currently implemented: Steepest 2-Groups Gradient Descent and Adaptive Moment Estimation (Adam). Further methods may be added in the future.
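The Adam update rule surveyed in Ruder (2016) can be sketched in plain R. This is an illustrative, self-contained implementation, not the package's internal code; `adam_optimize` and its argument names are hypothetical.

```r
# Illustrative Adam sketch (not optimg's internals).
# Maintains exponentially decayed first (m) and second (v) moment
# estimates of the gradient, with bias correction, per Ruder (2016).
adam_optimize <- function(par, fn, grad, lr = 0.1, beta1 = 0.9,
                          beta2 = 0.999, eps = 1e-8, maxit = 1000) {
  m <- v <- numeric(length(par))
  for (t in seq_len(maxit)) {
    g <- grad(par)
    m <- beta1 * m + (1 - beta1) * g      # biased first moment
    v <- beta2 * v + (1 - beta2) * g^2    # biased second moment
    m_hat <- m / (1 - beta1^t)            # bias-corrected moments
    v_hat <- v / (1 - beta2^t)
    par <- par - lr * m_hat / (sqrt(v_hat) + eps)
  }
  list(par = par, value = fn(par))
}

# Minimize a simple quadratic with its minimum at (1, 2)
fn   <- function(p) sum((p - c(1, 2))^2)
grad <- function(p) 2 * (p - c(1, 2))
res <- adam_optimize(c(0, 0), fn, grad)
```

Near the optimum the gradients alternate in sign, so `m_hat` shrinks toward zero while `v_hat` stays positive, and the steps decay.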

Getting started
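A minimal usage sketch, assuming the package exposes an `optim()`-like `optimg()` function taking starting values, an objective function, and a `method` argument; the exact interface may differ, so consult `?optimg` after installing.

```r
# Hypothetical usage sketch -- argument names assumed to follow an
# optim()-like convention; see ?optimg for the authoritative interface.
library(optimg)

fn <- function(par) sum((par - c(1, 2))^2)  # quadratic, minimum at (1, 2)
fit <- optimg(par = c(0, 0), fn = fn, method = "ADAM")
fit$par    # estimated minimizer
fit$value  # objective value at the estimate
```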

Package details

Author: Vithor Rosa Franco <vithorfranco@gmail.com>
Maintainer: Vithor Rosa Franco <vithorfranco@gmail.com>
URL: https://github.com/vthorrf/optimg
Package repository: CRAN
Installation: the latest release can be installed from CRAN from within an R session.
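For example (the GitHub path mirrors the URL above; the development-version line assumes the `remotes` package is available):

```r
# Latest release from CRAN:
install.packages("optimg")

# Or the development version from GitHub (requires the remotes package):
# remotes::install_github("vthorrf/optimg")
```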


optimg documentation built on Oct. 7, 2021, 5:09 p.m.