esmail-abdulfattah/smartGrad: Optimization using Smart Numerical Differentiation

A limited-memory technique for improving the accuracy of a numerically computed gradient by exploiting (1) a coordinate transformation of the gradient and (2) the history of previously taken descent directions. The method is verified empirically by extensive experimentation on both test functions and real data applications.
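The core idea can be sketched in a few lines of R. Note that this is an illustrative sketch only: the function name `central_diff_grad` and its signature are not the smartGrad package API. It shows the transformation part of the idea, differencing along the columns of a direction matrix `M` (which smartGrad would build from the history of past descent directions) and mapping the directional derivatives back to canonical coordinates:

```r
# Illustrative sketch, NOT the smartGrad API: finite differences taken along
# the columns of a direction matrix M instead of the canonical axes.
central_diff_grad <- function(f, x, M = diag(length(x)), h = 1e-5) {
  # Directional central differences: d[j] approximates t(M[, j]) %*% grad f(x).
  d <- sapply(seq_len(ncol(M)), function(j) {
    (f(x + h * M[, j]) - f(x - h * M[, j])) / (2 * h)
  })
  # Recover the gradient in the original coordinates from t(M) %*% g = d.
  solve(t(M), d)
}

# With M = identity this reduces to ordinary central differences:
f <- function(x) sum(x^2)            # true gradient is 2 * x
g <- central_diff_grad(f, c(1, 2))   # approximately c(2, 4)
```

Choosing `M` from previous descent directions (rather than the identity) lets the differencing adapt to the local geometry of the objective, which is where the accuracy gain of the method comes from.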

Getting started

Package details

Maintainer
License: unspecified
Version: 0.0.0.9000
Package repository: View on GitHub
Installation

Install the latest version of this package by entering the following in R:
install.packages("remotes")
remotes::install_github("esmail-abdulfattah/smartGrad")
esmail-abdulfattah/smartGrad documentation built on March 19, 2022, 3:01 p.m.