Gauss_sparse_means

Description

Compute Gibbs samples from the posterior distribution of the inclusion indicators for the sparse normal means model. The inclusion probability is assigned a Beta(a_pi, b_pi) prior and is learned as well.
Usage

Gauss_sparse_means(
  y,
  psi = NULL,
  a_pi = 1,
  b_pi = 1,
  nsave = 1000,
  nburn = 1000,
  nskip = 0,
  verbose = TRUE
)
Arguments

y: vector of observed data

psi: prior variance for the slab component; if NULL, assume a Unif(0, n) prior

a_pi: prior shape1 parameter for the inclusion probability; default is 1 for uniform

b_pi: prior shape2 parameter for the inclusion probability; default is 1 for uniform

nsave: number of MCMC iterations to save

nburn: number of MCMC iterations to discard

nskip: number of MCMC iterations to skip between saving iterations, i.e., save every (nskip + 1)th draw

verbose: logical; if TRUE, print the time remaining
Details

We assume a sparse normal means model of the form y_i = theta_i + epsilon_i with a spike-and-slab prior on each theta_i.

There are two options for the prior variance psi. First, it can be specified directly. Second, it can be assigned a Uniform(0, n) prior and sampled within the MCMC conditional on the sampled regression coefficients.
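To make the sampling scheme concrete, here is a minimal, self-contained sketch of one Gibbs sampler for this model. It is an illustration under simplifying assumptions, not the package source: sigma and psi are held fixed, burn-in and thinning are omitted, and the optional Unif(0, n) update for psi is not shown.

sparse_means_gibbs = function(y, psi = 1, sigma = 1,
                              a_pi = 1, b_pi = 1, nsave = 1000) {
  n = length(y)
  pi_incl = 0.5                       # initialize the inclusion probability
  post_gamma = matrix(0, nsave, n)
  post_theta = matrix(0, nsave, n)
  post_pi = numeric(nsave)

  for (s in 1:nsave) {
    # (1) Inclusion indicators, with theta_i marginalized out:
    # odds(gamma_i = 1) = pi/(1 - pi) * N(y_i; 0, sigma^2 + psi)/N(y_i; 0, sigma^2)
    log_odds = log(pi_incl) - log(1 - pi_incl) +
      dnorm(y, 0, sqrt(sigma^2 + psi), log = TRUE) -
      dnorm(y, 0, sigma, log = TRUE)
    gamma = rbinom(n, size = 1, prob = plogis(log_odds))

    # (2) Coefficients: zero in the spike; conjugate normal draw in the slab
    shrink = psi / (psi + sigma^2)
    theta = ifelse(gamma == 1,
                   rnorm(n, mean = shrink * y, sd = sqrt(shrink * sigma^2)),
                   0)

    # (3) Inclusion probability: conjugate Beta update given the indicators
    pi_incl = rbeta(1, a_pi + sum(gamma), b_pi + n - sum(gamma))

    post_gamma[s, ] = gamma
    post_theta[s, ] = theta
    post_pi[s] = pi_incl
  }
  list(post_gamma = post_gamma, post_theta = post_theta, post_pi = post_pi)
}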
Value

A list with the following elements:

post_gamma: nsave x n samples from the posterior distribution of the inclusion indicators

post_pi: nsave samples from the posterior distribution of the inclusion probability

post_psi: nsave samples from the posterior distribution of the prior variance psi

post_theta: nsave samples from the posterior distribution of the regression coefficients

sigma_hat: estimate of the latent data standard deviation
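Posterior point estimates and credible intervals follow directly from these draws; a small illustration (assuming a fitted object fit, as in the examples below):

theta_hat = colMeans(fit$post_theta)   # posterior means of the coefficients
theta_ci = apply(fit$post_theta, 2, quantile, probs = c(0.025, 0.975))   # 95% intervals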
Examples

# Simulate some data:
y = c(rnorm(n = 100, mean = 0),
rnorm(n = 100, mean = 2))
# Fit the model:
fit = Gauss_sparse_means(y, nsave = 100, nburn = 100) # for a quick example
names(fit)
# Posterior inclusion probabilities:
pip = colMeans(fit$post_gamma)
plot(pip, y)
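# Flag likely signals with the median probability model, i.e., PIP > 0.5
# (a common convention, not a package default):
selected = which(pip > 0.5)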
# Check the MCMC efficiency:
getEffSize(fit$post_theta) # coefficients
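# Assuming getEffSize wraps coda's effective sample size calculation, the
# same diagnostic is also available directly, e.g.:
# coda::effectiveSize(fit$post_pi)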