Implements the Hierarchical Incremental GRAdient Descent (HiGrad) algorithm, a first-order method for finding the minimizer of a function in the online-learning setting, much like stochastic gradient descent (SGD). In addition, the method attaches confidence intervals to assess the uncertainty of its predictions. See Su and Zhu (2018) <arXiv:1802.04876> for details.
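As a rough illustration of how the package might be used, the sketch below fits a linear model with HiGrad on simulated data and then requests predictions with confidence intervals. The function name `higrad()` and the `predict()` arguments (`newx`, `alpha`) are assumptions based on common R modeling interfaces and are not confirmed by this page; consult the package manual after installing.

``` r
# Sketch only: assumes the package exports higrad() and a predict() method;
# argument names are assumptions -- check ?higrad after installing.
library(higrad)

set.seed(1)
n <- 10000; p <- 5
x <- matrix(rnorm(n * p), n, p)          # simulated streaming design matrix
theta <- rep(1, p)                       # true coefficients
y <- as.numeric(x %*% theta + rnorm(n))  # linear-model responses

# Fit with HiGrad (SGD-like updates organized into hierarchical splits)
fit <- higrad(x, y, model = "lm")

# Predictions at new points, with confidence intervals quantifying uncertainty
newx <- matrix(rnorm(5 * p), 5, p)
pred <- predict(fit, newx, alpha = 0.05)
print(pred)
```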
| Package details | |
|---|---|
| Author | Weijie Su [aut], Yuancheng Zhu [aut, cre] |
| Maintainer | Yuancheng Zhu <yuancheng.zhu@gmail.com> |
| License | GPL-3 |
| Version | 0.1.0 |
| Package repository | View on CRAN |
| Installation | Install the latest version of this package by entering the following in R (see the snippet below the table): |
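The exact install command is not shown on this page; a likely form, assuming the package is published on CRAN under the name "higrad" (inferred from the algorithm name, not stated above):

``` r
# Assumes the CRAN package name is "higrad"
install.packages("higrad")
```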