run_mcmc  R Documentation 
Description

Adaptive Markov chain Monte Carlo simulation for state space models using the Robust Adaptive Metropolis (RAM) algorithm by Vihola (2012). Several different MCMC sampling schemes are implemented; see the parameter arguments, the package vignette, Vihola, Helske, and Franks (2020), and Helske and Vihola (2021) for details.
Usage

run_mcmc(model, ...)

## S3 method for class 'lineargaussian'
run_mcmc(
  model,
  iter,
  output_type = "full",
  burnin = floor(iter/2),
  thin = 1,
  gamma = 2/3,
  target_acceptance = 0.234,
  S,
  end_adaptive_phase = FALSE,
  threads = 1,
  seed = sample(.Machine$integer.max, size = 1),
  verbose,
  ...
)

## S3 method for class 'nongaussian'
run_mcmc(
  model,
  iter,
  particles,
  output_type = "full",
  mcmc_type = "is2",
  sampling_method = "psi",
  burnin = floor(iter/2),
  thin = 1,
  gamma = 2/3,
  target_acceptance = 0.234,
  S,
  end_adaptive_phase = FALSE,
  local_approx = TRUE,
  threads = 1,
  seed = sample(.Machine$integer.max, size = 1),
  max_iter = 100,
  conv_tol = 1e-8,
  verbose,
  ...
)

## S3 method for class 'ssm_nlg'
run_mcmc(
  model,
  iter,
  particles,
  output_type = "full",
  mcmc_type = "is2",
  sampling_method = "bsf",
  burnin = floor(iter/2),
  thin = 1,
  gamma = 2/3,
  target_acceptance = 0.234,
  S,
  end_adaptive_phase = FALSE,
  threads = 1,
  seed = sample(.Machine$integer.max, size = 1),
  max_iter = 100,
  conv_tol = 1e-8,
  iekf_iter = 0,
  verbose,
  ...
)

## S3 method for class 'ssm_sde'
run_mcmc(
  model,
  iter,
  particles,
  output_type = "full",
  mcmc_type = "is2",
  L_c,
  L_f,
  burnin = floor(iter/2),
  thin = 1,
  gamma = 2/3,
  target_acceptance = 0.234,
  S,
  end_adaptive_phase = FALSE,
  threads = 1,
  seed = sample(.Machine$integer.max, size = 1),
  verbose,
  ...
)
Arguments

model

Model of class lineargaussian, nongaussian, ssm_nlg, or ssm_sde.
... 
Ignored. 
iter

A positive integer defining the total number of MCMC iterations.
Suitable value depends on the model, data, and the choice of specific
algorithms (mcmc_type and sampling_method).
output_type

Either "full" (default, returns posterior samples of the states alongside
the hyperparameters theta), "theta" (for the marginal posterior of theta),
or "summary" (returns the mean and variance estimates of the states and
posterior samples of theta).
burnin

A positive integer defining the length of the burn-in period which is
disregarded from the results. Defaults to floor(iter / 2). Note that the
MCMC algorithms of bssm adapt the proposal distribution during the burn-in
period.
thin

A positive integer defining the thinning rate. All the MCMC algorithms in
bssm use a jump chain representation, where only the accepted values and
their multiplicities are stored, and thinning is applied to this jump
chain. Defaults to 1.
gamma

Tuning parameter for the adaptation of the RAM algorithm. Must be between
0 and 1. Defaults to 2/3.
target_acceptance 
Target acceptance rate for MCMC. Defaults to 0.234. Must be between 0 and 1. 
S

Matrix defining the initial value for the lower triangular matrix of the
RAM algorithm, so that the covariance matrix of the Gaussian proposal
distribution is SS'. Note that for some parameters (currently the standard
deviation, dispersion, and autoregressive parameters of the BSM and AR(1)
models) the sampling is done in unconstrained parameter space, i.e.
internal_theta = log(theta) (and logit(rho) for the AR coefficient).
end_adaptive_phase

Logical; if TRUE, the proposal matrix S is held fixed after the burn-in
period. Defaults to FALSE.
threads

Number of threads for state simulation. Positive integer (default is 1).
Note that parallel computing is only used in the post-correction phase of
IS-MCMC and when sampling the states in the case of (approximate) Gaussian
models.
seed 
Seed for the C++ RNG (positive integer). 
verbose

If TRUE, prints a progress bar to the console. If missing, defined by
rlang::is_interactive.
particles

A positive integer defining the number of state samples per MCMC iteration
for models other than linear-Gaussian models. Ignored if mcmc_type is
"approx" or "ekf".
mcmc_type

What type of MCMC algorithm should be used for models other than
linear-Gaussian models? Possible choices are "pm" for pseudo-marginal MCMC,
"da" for a delayed acceptance version of MCMC, "approx" for approximate
inference based on the Gaussian approximation of the model, "ekf" for
approximate inference using the extended Kalman filter (for ssm_nlg
models), or one of the three importance sampling type weighting schemes:
"is3" for simple importance sampling (a weight is computed for each MCMC
iteration independently), "is2" for jump chain importance sampling type
weighting (default), or "is1" for importance sampling type weighting where
the number of particles used for weight computation is proportional to the
length of the jump chain block.
sampling_method

Method for state sampling for models other than linear-Gaussian models.
If "psi", the psi-auxiliary particle filter is used (default for
nongaussian models). If "bsf", the bootstrap particle filter is used
(default for ssm_nlg models). If "spdk", non-sequential importance sampling
based on the Gaussian approximation is used (only for nongaussian models).
local_approx

If TRUE (default), the Gaussian approximation needed for some of the
methods is performed at each iteration. If FALSE, the approximation is
updated only once at the start of the MCMC run, using the initial model.
max_iter

Maximum number of iterations used in the Gaussian approximation, as a
positive integer. Defaults to 100 (although typically only a few iterations
are needed).
conv_tol

Positive tolerance parameter used in the Gaussian approximation. Defaults
to 1e-8.
iekf_iter

Nonnegative integer. The default zero corresponds to the normal EKF,
whereas iekf_iter > 0 corresponds to the iterated EKF with iekf_iter
iterations. Used only for models of class ssm_nlg.
L_c, L_f

For ssm_sde models, positive integers defining the discretization levels
for the first and second stages (the number of discretization points is
defined as 2^L). For pseudo-marginal methods ("pm"), the maximum of these
is used.
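As a plain-R sketch of the mechanics described above (not part of the bssm API; all numeric values here are made up for illustration): the RAM proposal covariance is SS', the adaptation weight decays polynomially with the iteration number (controlled by gamma), and positive parameters are sampled on the log scale internally.

```r
# Lower triangular S (made-up values): the proposal covariance is S %*% t(S)
S <- matrix(c(0.1, 0.05, 0, 0.2), 2, 2)
proposal_cov <- S %*% t(S)
stopifnot(isSymmetric(proposal_cov))

# The adaptation weight decays roughly as n^(-gamma) (gamma = 2/3 by
# default), so the proposal changes less and less as iterations accumulate;
# the constant 2 below is arbitrary and only for illustration
gamma <- 2/3
eta <- pmin(1, 2 * c(10, 100, 1000)^(-gamma))
stopifnot(all(diff(eta) < 0))  # monotonically decreasing

# A standard deviation parameter is sampled as log(theta) internally,
# e.g. sigma = 2 corresponds to
internal_theta <- log(2)
stopifnot(all.equal(exp(internal_theta), 2))
```

In practice S is adapted automatically during the burn-in period, so supplying it explicitly is mainly useful when continuing a previous run (see the details and examples below).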
Details

For linear-Gaussian models, the option output_type = "summary" does not
simulate states directly but computes the posterior means and variances of
the states using fast Kalman smoothing. This is slightly faster, more
memory efficient, and more accurate than calculations based on the
simulation smoother. In other cases, the means and covariances are computed
using the full output of the particle filter instead of subsampling one
trajectory per iteration as in the case of output_type = "full". The states
are sampled up to the time point n + 1, where n is the length of the input
time series, i.e. the last values are one-step-ahead predictions (for
predicting further, see ?predict.mcmc_output).
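As a toy illustration of the weighted summaries used with IS-type output (plain R with made-up numbers; in actual use the weights come from the stored mcmc_output object rather than being chosen by hand):

```r
# Posterior mean of a state as a weighted average over stored samples
value <- c(1.0, 1.5, 0.5)      # hypothetical state samples
weight <- c(0.2, 0.5, 0.3)     # hypothetical importance weights
post_mean <- sum(weight * value) / sum(weight)
stopifnot(all.equal(post_mean, 1.1))
```

The examples below compute such summaries with diagis::weighted_mean and diagis::weighted_quantile instead of doing the arithmetic manually.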
Initial values for the sampling are taken from the model object
(model$theta). If you want to continue from a previous run, you can
reconstruct your original model by plugging in the previously obtained
parameters to model$theta, providing the S matrix for the RAM algorithm,
and setting burnin = 0 (see the examples). Note, however, that this is not
identical to running all the iterations at once, due to the RNG
"discontinuity" and because, even with burnin = 0, bssm includes "theta_0",
i.e. the initial theta, in the final chain.
Value

An object of class mcmc_output.
References

Vihola M (2012). Robust adaptive Metropolis algorithm with coerced
acceptance rate. Statistics and Computing, 22(5), 997-1008.
https://doi.org/10.1007/s11222-011-9269-5

Vihola M, Helske J, Franks J (2020). Importance sampling type estimators
based on approximate marginal Markov chain Monte Carlo. Scandinavian
Journal of Statistics, 1-38. https://doi.org/10.1111/sjos.12492

Helske J, Vihola M (2021). bssm: Bayesian Inference of Non-linear and
Non-Gaussian State Space Models in R. The R Journal, 13(2), 578-589.
https://doi.org/10.32614/RJ-2021-103
Examples

model <- ar1_lg(LakeHuron, rho = uniform(0.5, -1, 1),
  sigma = halfnormal(1, 10), mu = normal(500, 500, 500),
  sd_y = halfnormal(1, 10))

mcmc_results <- run_mcmc(model, iter = 2e4)
summary(mcmc_results, return_se = TRUE)

sumr <- summary(mcmc_results, variable = "states")
library("ggplot2")
ggplot(sumr, aes(time, Mean)) +
  geom_ribbon(aes(ymin = `2.5%`, ymax = `97.5%`), alpha = 0.25) +
  geom_line() + theme_bw() +
  geom_point(data = data.frame(Mean = LakeHuron, time = time(LakeHuron)),
    col = 2)

# Continue from the previous run
model$theta[] <- mcmc_results$theta[nrow(mcmc_results$theta), ]
run_more <- run_mcmc(model, S = mcmc_results$S, iter = 1000, burnin = 0)

set.seed(1)
n <- 50
slope <- cumsum(c(0, rnorm(n - 1, sd = 0.001)))
level <- cumsum(slope + c(0, rnorm(n - 1, sd = 0.2)))
y <- rpois(n, exp(level))
poisson_model <- bsm_ng(y,
  sd_level = halfnormal(0.01, 1),
  sd_slope = halfnormal(0.01, 0.1),
  P1 = diag(c(10, 0.1)), distribution = "poisson")

# Note small number of iterations for CRAN checks
mcmc_out <- run_mcmc(poisson_model, iter = 1000, particles = 10,
  mcmc_type = "da")
summary(mcmc_out, what = "theta", return_se = TRUE)

set.seed(123)
n <- 50
sd_level <- 0.1
drift <- 0.01
beta <- -0.9
phi <- 5

level <- cumsum(c(5, drift + rnorm(n - 1, sd = sd_level)))
x <- 3 + (1:n) * drift + sin(1:n + runif(n, -1, 1))
y <- rnbinom(n, size = phi, mu = exp(beta * x + level))

model <- bsm_ng(y, xreg = x,
  beta = normal(0, 0, 10),
  phi = halfnormal(1, 10),
  sd_level = halfnormal(0.1, 1),
  sd_slope = halfnormal(0.01, 0.1),
  a1 = c(0, 0), P1 = diag(c(10, 0.1)^2),
  distribution = "negative binomial")

# run IS-MCMC
# Note small number of iterations for CRAN checks
fit <- run_mcmc(model, iter = 4000,
  particles = 10, mcmc_type = "is2", seed = 1)

# extract states
d_states <- as.data.frame(fit, variable = "states", time = 1:n)

library("dplyr")
library("ggplot2")

# compute summary statistics
level_sumr <- d_states %>%
  filter(variable == "level") %>%
  group_by(time) %>%
  summarise(mean = diagis::weighted_mean(value, weight),
    lwr = diagis::weighted_quantile(value, weight, 0.025),
    upr = diagis::weighted_quantile(value, weight, 0.975))

# visualize
level_sumr %>% ggplot(aes(x = time, y = mean)) +
  geom_line() +
  geom_line(aes(y = lwr), linetype = "dashed", na.rm = TRUE) +
  geom_line(aes(y = upr), linetype = "dashed", na.rm = TRUE) +
  theme_bw() +
  theme(legend.title = element_blank()) +
  xlab("Time") + ylab("Level")

# theta
d_theta <- as.data.frame(fit, variable = "theta")
ggplot(d_theta, aes(x = value)) +
  geom_density(aes(weight = weight), adjust = 2, fill = "#92f0a8") +
  facet_wrap(~ variable, scales = "free") +
  theme_bw()

# Bivariate Poisson model:

set.seed(1)
x <- cumsum(c(3, rnorm(19, sd = 0.5)))
y <- cbind(
  rpois(20, exp(x)),
  rpois(20, exp(x)))

prior_fn <- function(theta) {
  # half-normal prior using transformation
  dnorm(exp(theta), 0, 1, log = TRUE) + theta # plus jacobian term
}

update_fn <- function(theta) {
  list(R = array(exp(theta), c(1, 1, 1)))
}

model <- ssm_mng(y = y, Z = matrix(1, 2, 1), T = 1,
  R = 0.1, P1 = 1, distribution = "poisson",
  init_theta = log(0.1),
  prior_fn = prior_fn, update_fn = update_fn)

# Note small number of iterations for CRAN checks
out <- run_mcmc(model, iter = 4000, mcmc_type = "approx")

sumr <- as.data.frame(out, variable = "states") %>%
  group_by(time) %>% mutate(value = exp(value)) %>%
  summarise(mean = mean(value),
    ymin = quantile(value, 0.05), ymax = quantile(value, 0.95))
ggplot(sumr, aes(time, mean)) +
  geom_ribbon(aes(ymin = ymin, ymax = ymax), alpha = 0.25) +
  geom_line() +
  geom_line(data = data.frame(mean = y[, 1], time = 1:20),
    colour = "tomato") +
  geom_line(data = data.frame(mean = y[, 2], time = 1:20),
    colour = "tomato") +
  theme_bw()