SGD: Stochastic gradient descent

View source: R/SGD.R

Stochastic gradient descent

Description

Runs stochastic gradient descent using an unbiased score estimator.

Usage

SGD(
  model,
  theta_initial,
  observations,
  nparticles,
  resampling_threshold = 1,
  coupled2_resampling,
  coupled4_resampling,
  k = 0,
  m = 1,
  minimum_level,
  maximum_level,
  level_distribution,
  learning_rate = 0.001,
  stopping_threshold = 1e-04,
  max_iterations = 1e+06,
  mcmc_iter = 0
)

Arguments

model

a list representing a hidden Markov model, e.g. hmm_ornstein_uhlenbeck

theta_initial

an initial vector of parameters

observations

a matrix of observations of size nobservations x ydimension

nparticles

number of particles

resampling_threshold

ESS proportion below which resampling is triggered; the default value of 1 resamples at every observation time

coupled2_resampling

a 2-way coupled resampling scheme, such as coupled2_maximal_coupled_residuals

coupled4_resampling

a 4-way coupled resampling scheme, such as coupled4_maximal_coupled_residuals

k

iteration at which to start averaging (defaults to 0)

m

iteration at which to stop averaging (defaults to 1)

minimum_level

coarsest discretization level

maximum_level

finest discretization level

level_distribution

a list containing mass_function and tail_function specifying the distribution over levels, e.g. as returned by compute_level_distribution

learning_rate

step size of the SGD algorithm

stopping_threshold

tolerance for the stopping criterion that terminates the iterations

max_iterations

maximum number of SGD iterations

mcmc_iter

if mcmc_iter == 0, use the unbiased score estimator; if mcmc_iter > 0, use an MCMC score estimate at maximum_level with mcmc_iter iterations

Value

a list with objects such as: theta, the parameters at the last SGD iteration; trajectory, the parameters across the SGD iterations.
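
Examples

A minimal sketch of a call to SGD follows, assuming an Ornstein-Uhlenbeck hidden Markov model. The constructor arguments of hmm_ornstein_uhlenbeck, the signature of compute_level_distribution, the parameter dimension and the simulated observations are illustrative assumptions, not taken from the package; check them against the package source before running.

## Not run:
library(UnbiasedScore)

# hidden Markov model (constructor arguments assumed)
model <- hmm_ornstein_uhlenbeck()

# observations: matrix of size nobservations x ydimension (simulated here for illustration)
observations <- matrix(rnorm(100), nrow = 100, ncol = 1)

# distribution over discretization levels (signature of compute_level_distribution assumed)
minimum_level <- 3
maximum_level <- 10
level_distribution <- compute_level_distribution(minimum_level, maximum_level)

# run SGD with the unbiased score estimator (mcmc_iter = 0)
output <- SGD(model,
              theta_initial = rep(0, 3),   # parameter dimension assumed
              observations = observations,
              nparticles = 128,
              resampling_threshold = 1,
              coupled2_resampling = coupled2_maximal_coupled_residuals,
              coupled4_resampling = coupled4_maximal_coupled_residuals,
              k = 0, m = 1,
              minimum_level = minimum_level,
              maximum_level = maximum_level,
              level_distribution = level_distribution,
              learning_rate = 1e-3,
              stopping_threshold = 1e-4,
              max_iterations = 1e4,
              mcmc_iter = 0)

output$theta       # parameters at the last SGD iteration
output$trajectory  # parameters across the SGD iterations
## End(Not run)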
