tathagatabasu/SubgradOptim: Optimisation of piecewise differentiable functions.

This package provides optimisation methods for piecewise differentiable functions. It implements three separate methods: the subgradient method, the proximal gradient method, and a coordinate descent algorithm. The proximal gradient method also has an accelerated variant, which can be selected via a flag in the input parameters.
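The following is a minimal sketch, in base R, of the proximal gradient method and its accelerated (FISTA-style) variant applied to the lasso problem, i.e. minimising 0.5 * ||A %*% x - b||^2 + lambda * ||x||_1. It does not use this package's API; the function names and the accelerated argument below are illustrative assumptions, shown only to convey how an acceleration flag of this kind works.

# Illustrative sketch only; not the SubgradOptim API.
# Proximal gradient descent for the lasso:
#   minimise 0.5 * ||A %*% x - b||^2 + lambda * ||x||_1
# Soft-thresholding is the proximal operator of the L1 penalty.
soft_threshold <- function(x, t) sign(x) * pmax(abs(x) - t, 0)

prox_grad <- function(A, b, lambda, step, iters = 500, accelerated = FALSE) {
  x <- rep(0, ncol(A))
  y <- x        # extrapolation point (equals the iterate when accelerated = FALSE)
  t_old <- 1    # momentum bookkeeping for the accelerated (FISTA) variant
  for (k in seq_len(iters)) {
    grad  <- t(A) %*% (A %*% y - b)                        # gradient of the smooth part
    x_new <- soft_threshold(y - step * grad, step * lambda) # proximal step
    if (accelerated) {
      t_new <- (1 + sqrt(1 + 4 * t_old^2)) / 2
      y     <- x_new + ((t_old - 1) / t_new) * (x_new - x)  # extrapolate past the new iterate
      t_old <- t_new
    } else {
      y <- x_new
    }
    x <- x_new
  }
  as.vector(x)
}

# Small example: recover a sparse coefficient vector
set.seed(1)
A <- matrix(rnorm(100 * 20), 100, 20)
x_true <- c(3, -2, rep(0, 18))
b <- A %*% x_true + rnorm(100, sd = 0.1)
step <- 1 / max(eigen(t(A) %*% A)$values)  # 1/L, with L the Lipschitz constant of the gradient
prox_grad(A, b, lambda = 0.5, step = step, accelerated = TRUE)

The step size 1/L, with L the largest eigenvalue of t(A) %*% A, guarantees convergence for both variants; the accelerated variant improves the worst-case rate from O(1/k) to O(1/k^2).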

Getting started

Package details

Maintainer:
License: GPL-3
Version: 1.0.0.0000
Package repository: View on GitHub
Installation

Install the latest version of this package by entering the following in R:
install.packages("remotes")
remotes::install_github("tathagatabasu/SubgradOptim")
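Once installed, the package can be loaded in the usual way:

library(SubgradOptim)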