tathagatabasu/nonsmoothOptim: Optimisation with piecewise differentiable functions.

This package provides optimisation routines for piecewise differentiable (nonsmooth) functions. It implements three separate methods: the subgradient method, the proximal gradient method, and coordinate descent. The proximal gradient method also has an accelerated variant, which can be enabled via a flag in the input parameters.
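To illustrate the idea behind the proximal gradient method and its accelerated (FISTA-style) variant, here is a minimal sketch in Python rather than R. It minimises the lasso objective 0.5*||Ax - b||^2 + lam*||x||_1; all function and parameter names (including the `accelerated` flag) are hypothetical and do not mirror the package's actual R API.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, n_iter=500, accelerated=False):
    """Minimise 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient descent.

    When accelerated=True, a Nesterov/FISTA momentum step is added,
    mirroring the kind of acceleration flag the package describes.
    """
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0                    # extrapolation point and momentum scalar
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)            # gradient of the smooth part at y
        x_new = soft_threshold(y - step * grad, step * lam)
        if accelerated:                     # FISTA momentum update
            t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
            y = x_new + ((t - 1) / t_new) * (x_new - x)
            t = t_new
        else:
            y = x_new
        x = x_new
    return x

# Small synthetic problem: sparse ground truth, noiseless observations.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
x_true = np.zeros(10)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = proximal_gradient(A, b, lam=0.1, accelerated=True)
```

On this easy problem both variants recover `x_true` closely; acceleration mainly pays off on ill-conditioned or large problems, where it improves the convergence rate from O(1/k) to O(1/k^2).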

Getting started

Package details

License: GPL 3
Package repository: View on GitHub
Installation

Install the latest version of this package by entering the following in R:
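The install command itself did not survive extraction. The conventional pattern for installing an R package straight from GitHub uses the `remotes` package; this is an assumption based on the repository name above, not the original command.

```r
# Conventional GitHub install via the remotes package (assumed; the
# original command is missing from this page).
install.packages("remotes")
remotes::install_github("tathagatabasu/nonsmoothOptim")
```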
tathagatabasu/nonsmoothOptim documentation built on Nov. 5, 2019, 10 a.m.