View source: R/samesource_cpp.R
This is the numerator of the Bayes factor: it assumes that all observations come from the same source. Implemented in C.
Usage:

marginalLikelihood(
  X,
  n.iter,
  B.inv,
  W.inv,
  U,
  nw,
  mu,
  burn.in,
  output.mcmc = FALSE,
  verbose = FALSE,
  Gibbs_only = FALSE
)
Arguments:

X: the observation matrix (n x p; n = observations, p = variables)
n.iter: number of MCMC iterations, excluding burn-in
B.inv: prior inverse of the between-source covariance matrix
W.inv: initialization for the prior inverse of the within-source covariance matrix
U: covariance matrix for the mean (p x p)
nw: degrees of freedom
mu: prior mean (p x 1)
burn.in: number of MCMC burn-in iterations
output.mcmc: if TRUE, output the entire chain as a coda object; otherwise return just the log-ml value
verbose: if TRUE, be verbose
Gibbs_only: if TRUE, only return the Gibbs posterior samples (implies output.mcmc = TRUE)
Details:

See diwishart_inverse for the parametrization of the Inverted Wishart. See marginalLikelihood_internal for further documentation.
Value:

The log-marginal likelihood value, or, if output.mcmc = TRUE, a list:

value: the log-ml value
mcmc: a coda object with the posterior samples
Normal-Inverse-Wishart model:

Described in Bozza et al. (2008).

Observation level:

  X_ij ~ N_p(theta_i, W_i)    (i = source, j = item within source i)

Group level:

  theta_i ~ N_p(mu, B)
  W_i ~ IW_p(n_w, U)

Hyperparameters: B, U, n_w, mu.

Posterior samples of theta and W^(-1) can be generated with a Gibbs sampler.
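As a concrete illustration of that Gibbs sampler, here is a minimal single-source sketch in plain R. This is not the package's C implementation: `gibbs_sketch` is an illustrative name, and for simplicity it treats `nw` as the degrees of freedom of the standard inverse-Wishart parametrization (sampled via `stats::rWishart`), whereas the package uses the Press parametrization, which shifts the degrees of freedom.

```r
library(MASS)  # for mvrnorm

# Minimal single-source Gibbs sketch for the model above (illustrative only):
#   X_j ~ N_p(theta, W),  theta ~ N_p(mu, B),  W ~ IW_p(nw, U)
# Assumption: standard inverse-Wishart parametrization, i.e.
# W ~ IW(nw, U)  <=>  W^{-1} ~ Wishart(nw, U^{-1}).
gibbs_sketch <- function(X, mu, B.inv, U, nw, n.iter = 1000) {
  n <- nrow(X); p <- ncol(X)
  xbar <- colMeans(X)
  W.inv <- diag(p)                          # initialization, as in the package API
  theta_s <- matrix(NA_real_, n.iter, p)
  for (s in seq_len(n.iter)) {
    # theta | W, X: conjugate multivariate-normal update
    Sigma <- solve(B.inv + n * W.inv)
    m     <- as.vector(Sigma %*% (B.inv %*% mu + n * W.inv %*% xbar))
    theta <- MASS::mvrnorm(1, m, Sigma)
    # W | theta, X: inverse-Wishart update; draw W^{-1} as a Wishart sample
    R <- sweep(X, 2, theta)                 # residuals X_j - theta
    S <- U + crossprod(R)                   # posterior scale matrix
    W.inv <- rWishart(1, df = nw + n, Sigma = solve(S))[, , 1]
    theta_s[s, ] <- theta
  }
  theta_s
}
```

With a weak prior on theta, the posterior draws concentrate around the sample mean, which makes the sketch easy to sanity-check on simulated data.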
Inverted Wishart parametrization (Press):

Uses the parametrization of Press (2012):

  X ~ IW(v, S)

where S is a p x p scale matrix and v > 2p (the degrees of freedom). Then:

  E[X] = S / (v - 2(p + 1))
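The Press form maps onto the more common inverse-Wishart parametrization (density exponent -(nu + p + 1)/2) by setting nu = v - p - 1, so a draw can be obtained as the inverse of a Wishart(v - p - 1, S^{-1}) sample. A quick Monte Carlo sketch of the mean formula, under that mapping assumption and using only `stats::rWishart` (not the package's diwishart_inverse):

```r
# Monte Carlo check of E[X] = S / (v - 2(p + 1)) for X ~ IW(v, S), Press
# parametrization. Assumption: Press's IW(v, S) equals the common
# IW(nu, S) with nu = v - p - 1, so X^{-1} ~ Wishart(v - p - 1, S^{-1}).
set.seed(42)
p <- 2
v <- 12                       # must satisfy v > 2p
S <- diag(c(2, 3))
n_samp <- 20000
Winvs <- rWishart(n_samp, df = v - p - 1, Sigma = solve(S))
Xbar <- matrix(0, p, p)
for (k in seq_len(n_samp)) Xbar <- Xbar + solve(Winvs[, , k])
Xbar <- Xbar / n_samp         # Monte Carlo estimate of E[X]
Xbar                          # approx S / (v - 2 * (p + 1)) = diag(c(0.333, 0.5))
```

Here v - 2(p + 1) = 12 - 6 = 6, so the empirical mean should be close to S / 6.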
See Also:

Other core functions: bayessource-package, get_minimum_nw_IW(), make_priors_and_init(), marginalLikelihood_internal(), mcmc_postproc(), samesource_C(), two.level.multivariate.calculate.UC()
Examples:

# Use the iris data
df_X <- iris[, c(1, 3)]
X <- as.matrix(df_X)
p <- ncol(X)

# Observations
plot(X)
# True species
plot(X, col = iris$Species)

# Priors
B.inv <- diag(p)
W.inv <- diag(p)
U <- diag(p)
mu <- colMeans(X)
# d.o.f.
nw <- get_minimum_nw_IW(p)

# Computational parameters
n_iter <- 10000
n_burn_in <- 1000

# Compute the marginal likelihood
marginalLikelihood(
  X = X,
  n.iter = n_iter,
  B.inv = B.inv,
  W.inv = W.inv,
  U = U,
  nw = nw,
  burn.in = n_burn_in,
  mu = mu
)

# Diagnostics ------------

# Retain the full Gibbs output
list_mcmc <- marginalLikelihood(
  X = X,
  n.iter = 10000,
  B.inv = B.inv,
  W.inv = W.inv,
  U = U,
  nw = get_minimum_nw_IW(p),
  burn.in = 1000,
  mu = mu,
  output.mcmc = TRUE
)

# The log-ML value
list_mcmc$value

# The full Gibbs chain output
library(coda)
head(list_mcmc$mcmc, 20)

# Diagnostics plots by coda
plot(list_mcmc$mcmc)
coda::autocorr.plot(list_mcmc$mcmc)