Provides general-purpose tools for implementing steepest gradient descent methods for function optimization; for details see Ruder (2016) <arXiv:1609.04747v2>. Currently, the implemented methods are Steepest 2-Groups Gradient Descent and Adaptive Moment Estimation (Adam); other methods will be added in future versions.
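To illustrate the kind of method the package implements, here is a minimal sketch of the Adam update rule as described in Ruder (2016). This is an illustrative implementation only, not the package's internal code; the function name `adam` and its arguments are hypothetical.

```r
# Illustrative Adam optimizer (Ruder, 2016), not optimg's internal code.
adam <- function(fn, gr, par, lr = 0.01, beta1 = 0.9, beta2 = 0.999,
                 eps = 1e-8, maxit = 1000) {
  m <- v <- numeric(length(par))          # first- and second-moment estimates
  for (t in seq_len(maxit)) {
    g <- gr(par)                          # gradient at current parameters
    m <- beta1 * m + (1 - beta1) * g      # update biased first moment
    v <- beta2 * v + (1 - beta2) * g^2    # update biased second moment
    m_hat <- m / (1 - beta1^t)            # bias-corrected first moment
    v_hat <- v / (1 - beta2^t)            # bias-corrected second moment
    par <- par - lr * m_hat / (sqrt(v_hat) + eps)
  }
  par
}

# Example: minimize f(x) = sum(x^2), whose gradient is 2x
adam(fn = function(x) sum(x^2), gr = function(x) 2 * x, par = c(3, -2))
```

The bias-correction terms compensate for the moment estimates being initialized at zero, which otherwise biases early steps toward zero.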
| Package details | |
|---|---|
| Author | Vithor Rosa Franco <vithorfranco@gmail.com> |
| Maintainer | Vithor Rosa Franco <vithorfranco@gmail.com> |
| License | GPL-3 |
| Version | 0.1.2 |
| URL | https://github.com/vthorrf/optimg |
| Package repository | View on CRAN |
Installation
Install the latest version of this package by entering the following in R:
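The standard CRAN installation command, assuming the package is published on CRAN under the name `optimg` (as the repository link above indicates):

```r
install.packages("optimg")
```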