View source: R/post_correction.R
post_correct — R Documentation
Description

The function post_correct updates previously obtained approximate MCMC output with post-correction weights, leading to an asymptotically exact weighted posterior. It returns updated MCMC output where the components weights, posterior, alpha, alphahat, and Vt are updated (depending on the original output type).
Usage

post_correct(
  model,
  mcmc_output,
  particles,
  threads = 1L,
  is_type = "is2",
  seed = sample(.Machine$integer.max, size = 1)
)
Arguments

model: Model of class nongaussian or ssm_nlg.

mcmc_output: An output from run_mcmc, typically based on approximate MCMC (mcmc_type = "approx"; see Examples).

particles: Number of particles for the ψ-APF (positive integer). Suitable values depend on the model and the data, but often a relatively small value, say less than 50, is enough. See also suggest_N.

threads: Number of parallel threads (positive integer, default is 1).

is_type: Type of IS-correction. Possible choices are "is3" for simple importance sampling (the weight is computed for each MCMC iteration independently), "is2" for jump chain importance sampling type weighting (default), and "is1" for importance sampling type weighting where the number of particles used for the weight computation is proportional to the length of the jump chain block. A small comparison sketch is given after this argument list.

seed: Seed for the C++ RNG (positive integer).
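The following is a minimal sketch, not part of the original documentation, showing how the three is_type variants are called. It assumes the model and out_approx objects constructed in the Examples below, and fixes the seed so that the runs are comparable.

# Hypothetical comparison of the IS-correction variants; assumes `model` and
# `out_approx` from the Examples section already exist in the workspace.
out_is1 <- post_correct(model, out_approx, particles = 30, is_type = "is1", seed = 1)
out_is2 <- post_correct(model, out_approx, particles = 30, is_type = "is2", seed = 1)
out_is3 <- post_correct(model, out_approx, particles = 30, is_type = "is3", seed = 1)

# Compare the weighted posterior summaries of the hyperparameters
summary(out_is1, return_se = TRUE)
summary(out_is2, return_se = TRUE)
summary(out_is3, return_se = TRUE)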
Value

The original object of class mcmc_output with updated weights, log-posterior values, and state samples or summaries (depending on mcmc_output$mcmc_type).
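As an illustration of how the updated weights enter posterior summaries, here is a minimal sketch (not from the original documentation) computing weighted posterior means of the hyperparameters from the corrected output out_is2 of the Examples. It assumes that as.data.frame returns value and weight columns for variable = "theta", analogously to the state example below.

# Weighted posterior means of theta; relies on the assumed value/weight columns
library("dplyr")
as.data.frame(out_is2, variable = "theta") %>%
  group_by(variable) %>%
  summarise(mean = weighted.mean(value, weight))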
References

Doucet, A., Pitt, M. K., Deligiannidis, G., and Kohn, R. (2015). Efficient implementation of Markov chain Monte Carlo when using an unbiased likelihood estimator. Biometrika, 102(2), 295-313. https://doi.org/10.1093/biomet/asu075

Vihola, M., Helske, J., and Franks, J. (2020). Importance sampling type estimators based on approximate marginal Markov chain Monte Carlo. Scandinavian Journal of Statistics, 1-38. https://doi.org/10.1111/sjos.12492
Examples

set.seed(1)
n <- 300
x1 <- sin((2 * pi / 12) * 1:n)
x2 <- cos((2 * pi / 12) * 1:n)
alpha <- numeric(n)
alpha[1] <- 0
rho <- 0.7
sigma <- 2
mu <- 1
for(i in 2:n) {
  alpha[i] <- rnorm(1, mu * (1 - rho) + rho * alpha[i - 1], sigma)
}
u <- rpois(n, 50)
y <- rbinom(n, size = u, plogis(0.5 * x1 + x2 + alpha))

ts.plot(y / u)

model <- ar1_ng(y, distribution = "binomial",
  rho = uniform(0.5, -1, 1), sigma = gamma_prior(1, 2, 0.001),
  mu = normal(0, 0, 10),
  xreg = cbind(x1, x2), beta = normal(c(0, 0), 0, 5),
  u = u)

out_approx <- run_mcmc(model, mcmc_type = "approx",
  local_approx = FALSE, iter = 50000)

out_is2 <- post_correct(model, out_approx, particles = 30, threads = 2)
out_is2$time

summary(out_approx, return_se = TRUE)
summary(out_is2, return_se = TRUE)

# latent state
library("dplyr")
library("ggplot2")
state_approx <- as.data.frame(out_approx, variable = "states") %>%
  group_by(time) %>%
  summarise(mean = mean(value))

state_exact <- as.data.frame(out_is2, variable = "states") %>%
  group_by(time) %>%
  summarise(mean = weighted.mean(value, weight))

dplyr::bind_rows(approx = state_approx, exact = state_exact,
  .id = "method") %>%
  filter(time > 200) %>%
  ggplot(aes(time, mean, colour = method)) +
  geom_line() +
  theme_bw()

# posterior means
p_approx <- predict(out_approx, model, type = "mean",
  nsim = 1000, future = FALSE) %>%
  group_by(time) %>%
  summarise(mean = mean(value))
p_exact <- predict(out_is2, model, type = "mean",
  nsim = 1000, future = FALSE) %>%
  group_by(time) %>%
  summarise(mean = mean(value))

dplyr::bind_rows(approx = p_approx, exact = p_exact, .id = "method") %>%
  filter(time > 200) %>%
  ggplot(aes(time, mean, colour = method)) +
  geom_line() +
  theme_bw()
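As a follow-up, not part of the original examples, one simple diagnostic for the post-correction is the variability of the importance weights. The sketch below assumes that the corrected output stores the jump chain counts in out_is2$counts and the IS weights in out_is2$weights (these component names are an assumption, not verified here); with these, the effective sample size of the weighted draws can be approximated by the usual formula (sum of w)^2 / (sum of w^2).

# Crude IS diagnostic; $counts and $weights are assumed component names
w <- out_is2$counts * out_is2$weights
ess <- sum(w)^2 / sum(w^2)
ess / length(w)  # proportion of the nominal sample size retained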