Description

This function performs posterior simulation (by MCMC) and spatial prediction in the Poisson spatial model (with link function from the Box-Cox class).
Usage

pois.krige.bayes(geodata, coords = geodata$coords, data = geodata$data,
                 units.m = "default", locations = "no", borders,
                 model, prior, mcmc.input, output)
Arguments

geodata: a list containing the elements coords and data described below.

coords: an n x 2 matrix, each row containing the Euclidean coordinates of one of the n data locations. By default it takes the element coords of the geodata argument.

data: a vector with the data values. By default it takes the element data of the geodata argument.

units.m: an n-dimensional vector of observation times for the data. By default (units.m = "default") it takes geodata$units.m if this element exists, and otherwise a vector of 1's.

locations: an N x 2 matrix or data frame, or a list with the 2-D coordinates of the N prediction locations.

borders: optional. If a two-column matrix defining a polygon is provided, prediction is performed only at locations inside this polygon.

model: a list defining the components of the model. It can take an output from model.glm.control.

prior: specification of priors for the model parameters. It can take an output from prior.glm.control.

mcmc.input: input parameters for the MCMC algorithm. It can take an output from mcmc.control.

output: parameters for controlling the output. It can take an output from output.glm.control.
Details

pois.krige.bayes is a function for Bayesian geostatistical inference in the Poisson spatial model. It can also be used for an analysis with fixed values of the parameters, although the function pois.krige may be preferable in that case.

The Bayesian algorithm uses a discretized version of the prior distribution for the parameter phi. This means that the prior for phi is always a proper prior.
For simulating from the posterior distribution of S given y, we use a Langevin-Hastings type algorithm: a Metropolis-Hastings algorithm whose proposal distribution uses gradient information from the posterior. The algorithm is described below. For brevity, we present only the MCMC algorithm for the case where beta follows a uniform prior and the link function is the canonical log link.
When beta follows a uniform prior and the prior for sigma^2 is a scaled inverse-chi^2 distribution, the marginalised improper density of S is
f_I(s) propto |D^T*V^{-1}*D|^{-1/2}*|V|^{-1/2}*{n.sigma*S2.sigma + s^T*(V^{-1}-V^{-1}*D*(D^T*V^{-1}*D)^{-1}*D^T*V^{-1})*s }^{-(n-p+n.sigma)/2},
where V is the correlation matrix of S. The uniform prior for sigma^2 corresponds to S2.sigma=0 and n.sigma=-2, and the reciprocal prior for sigma^2 corresponds to S2.sigma=0 and n.sigma=0.
We use the reparametrisation S = Q*Gamma, where Q is the Cholesky factorisation of V so that V=QQ^T. Posterior simulations of S are obtained by transforming MCMC simulations from the conditional distribution of Gamma given Y=y.
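The reparametrisation above can be sketched in a few lines of R; this is an illustrative example (the small correlation matrix V built here is hypothetical), showing only the relation S = Q*gamma with V = QQ^T:

```r
## Sketch of the reparametrisation S = Q %*% gamma (illustrative, not geoRglm code).
## V here is a hypothetical 3 x 3 exponential correlation matrix.
d <- as.matrix(dist(c(0, 0.5, 1.2)))   # pairwise distances of 3 locations
phi <- 0.1
V <- exp(-d / phi)                     # exponential correlation function
Q <- t(chol(V))                        # lower-triangular factor, so V = Q %*% t(Q)
stopifnot(max(abs(Q %*% t(Q) - V)) < 1e-10)
gamma <- rnorm(3)                      # uncorrelated coordinates
S <- Q %*% gamma                       # correlated field values: Var(S) = V
```

Note that R's chol returns the upper-triangular factor, hence the transpose.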
The log posterior density of Gamma given Y=y is
log f(gamma|y) = const(y) - 1/2 * gamma^T*(I_n - Q^{-1}*D*(D^T*V^{-1}*D)^{-1}*D^T*(Q^{-1})^T)*gamma + ∑_i {y_i*s_i - exp(s_i)},
where (s_1,...,s_n)^T = Q*gamma.
For the truncated Langevin-Hastings update we use a truncated form of the gradient (truncating by H_i) of the log target density,

nabla(gamma)^{trunc} = - (I_n - Q^{-1}*D*(D^T*V^{-1}*D)^{-1}*D^T*(Q^{-1})^T)*gamma + Q^T*{y_i - min{exp(s_i), H_i}}_{i=1}^n.
The proposal gamma' follows a multivariate normal distribution with mean vector xi(gamma) = gamma + (h/2)*nabla(gamma)^{trunc} and covariance matrix h*I_n, where h > 0 is a user-specified "proposal variance" (called S.scale; see mcmc.control).
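One complete truncated Langevin-Hastings update, as described above, can be sketched as follows. This is an illustrative re-implementation, not the internal geoRglm code: log.target stands for the log posterior density of gamma given y, and all function and argument names are hypothetical.

```r
## Truncated gradient of the log target density (see the formula above).
trunc.grad <- function(gamma, Q, V, D, y, H) {
  s <- drop(Q %*% gamma)
  Vi <- solve(V); Qi <- solve(Q)
  A <- Qi %*% D %*% solve(t(D) %*% Vi %*% D) %*% t(D) %*% t(Qi)
  drop(-(diag(length(y)) - A) %*% gamma + t(Q) %*% (y - pmin(exp(s), H)))
}

## One Langevin-Hastings (MALA-type) update of gamma with proposal
## N(xi(gamma), h * I); the Hastings ratio uses both proposal densities.
lh.update <- function(gamma, Q, V, D, y, H, h, log.target) {
  xi <- function(g) g + (h / 2) * trunc.grad(g, Q, V, D, y, H)
  prop <- xi(gamma) + sqrt(h) * rnorm(length(gamma))
  log.acc <- log.target(prop) - log.target(gamma) +
    sum(dnorm(gamma, xi(prop), sqrt(h), log = TRUE)) -  # q(gamma' -> gamma)
    sum(dnorm(prop, xi(gamma), sqrt(h), log = TRUE))    # q(gamma -> gamma')
  if (log(runif(1)) < log.acc) prop else gamma
}
```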
When phi.prior is not "fixed", we update the parameter phi by a random walk Metropolis step. Here mcmc.input$phi.scale (see mcmc.control) is the proposal variance, which needs to be sufficiently large so that the algorithm can easily move between the discrete values in prior$phi.discrete (see prior.glm.control).
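A simplified sketch of such a discrete random-walk Metropolis step is given below. This is not the internal geoRglm update: the function name and log.post are hypothetical, and an equidistant phi.discrete grid is assumed. It also illustrates why phi.scale must be large relative to the grid spacing, since otherwise the rounded step is almost always zero.

```r
## One random-walk Metropolis update for phi on a discrete, equidistant support
## (illustrative sketch; log.post is the discretized log posterior of phi).
phi.update <- function(phi, phi.discrete, phi.scale, log.post) {
  delta <- diff(phi.discrete)[1]                    # grid spacing
  ## symmetric proposal: a normal step rounded to a whole number of grid cells
  step <- delta * round(rnorm(1, sd = sqrt(phi.scale)) / delta)
  prop <- phi + step
  if (!isTRUE(any(abs(phi.discrete - prop) < 1e-12))) return(phi)  # off support
  if (log(runif(1)) < log.post(prop) - log.post(phi)) prop else phi
}
```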
CONTROL FUNCTIONS

The function call includes auxiliary control functions which allow the user to specify and/or change the specification of 1) model components (using model.glm.control), 2) prior distributions (using prior.glm.control), 3) options for the MCMC algorithm (using mcmc.control), and 4) options for the output (using output.glm.control). Default values are available in most cases. The arguments for the control functions are described in their respective help files.
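As a rough illustration, the four control functions can be called directly to build the corresponding arguments. The argument values shown here are placeholders, not recommendations, and geoRglm must be installed:

```r
## Illustrative calls to the four control functions (values are placeholders).
library(geoRglm)
mod <- model.glm.control()                                # model components, defaults
pri <- prior.glm.control(phi.prior = "fixed", phi = 0.1)  # fixed correlation parameter
mc  <- mcmc.control(S.scale = 0.05)                       # S.scale must be supplied
out <- output.glm.control()                               # output options, defaults
## These objects can then be passed on, e.g.
## pois.krige.bayes(p50, model = mod, prior = pri, mcmc.input = mc, output = out)
```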
In the prediction part of the function we want to predict g_lambda^{-1}(S^*) at locations of interest, where g_lambda^{-1} is the inverse Box-Cox transformation. For the prediction part of the algorithm, we use the median of the predictive distribution as the predictor and 1/4 of the length of the 95 percent predictive interval as a measure of the prediction uncertainty. Below we describe the procedure for calculating these quantities.
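Given a sample from the predictive distribution at one location, the two summaries just described can be computed as below (a minimal sketch; the function name is hypothetical and the actual computation in geoRglm works on the multivariate t predictive distributions rather than raw samples):

```r
## Summarise a predictive sample by the median (the predictor) and 1/4 of
## the length of the central 95% predictive interval (the uncertainty).
pred.summary <- function(samples) {
  q <- quantile(samples, c(0.025, 0.5, 0.975))
  c(predictor = unname(q[2]), uncertainty = unname(q[3] - q[1]) / 4)
}
```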
First we perform a Bayesian Gaussian prediction with the given priors on beta and sigma^2 for each of the simulated S-"datasets" from the posterior distribution (and, in case phi is not fixed, for each sampled phi value). This Gaussian prediction is done by calling an internal function which is an extension of krige.bayes allowing for more than one "data set".
For calculating the probability below a threshold for the predictive distribution given the data, we first calculate this probability for each of the simulated S-“datasets”. This is done using the fact that the predictive distribution for each of the simulated S-“datasets” is a multivariate t-distribution. Afterwards the probability below a threshold is calculated by taking the empirical mean of these conditional probabilities.
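The averaging step can be sketched as follows, assuming hypothetical vectors mu, sc and df holding the location, scale and degrees of freedom of the conditional t-distribution for each simulated S-"dataset":

```r
## Probability below a threshold: average the conditional t-probabilities
## over the simulated S-"datasets" (mu, sc, df are per-simulation values).
prob.below <- function(threshold, mu, sc, df) {
  mean(pt((threshold - mu) / sc, df = df))
}
```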
Now the median and the 2.5 percent and 97.5 percent quantiles can be calculated by an iterative procedure: first a guess of the value is made, and then this guess is checked by calculating the probability of the predictive distribution being less than this value. If the guess is too low it is adjusted upwards, and vice versa.
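One simple way to implement such an iterative search is bisection on the predictive CDF; the sketch below (with a hypothetical function name, and F standing for any monotone function returning P(X <= x)) follows exactly the adjust-up/adjust-down logic described above:

```r
## Find the p-quantile of a distribution given only its CDF F, by bisection:
## if the guess is too low the lower bound moves up, and vice versa.
cdf.quantile <- function(F, p, lower, upper, tol = 1e-8) {
  while (upper - lower > tol) {
    mid <- (lower + upper) / 2
    if (F(mid) < p) lower <- mid else upper <- mid
  }
  (lower + upper) / 2
}
## e.g. the median of a standard normal: cdf.quantile(pnorm, 0.5, -10, 10)
```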
Value

A list with the following components:

posterior: a list with results for the posterior distribution of the model parameters and the random effects at the data locations.

predictive: a list with results for the predictive distribution at the prediction locations (if provided).

model: the model components used, as defined by model.glm.control.

prior: the priors used for the model parameters.

mcmc.input: the input parameters used for the MCMC algorithm.

.Random.seed: the system random seed before running the function. Allows reproduction of results: if .Random.seed is restored to this value before the function is run again, it will produce exactly the same results.

call: the function call.
Author(s)

Ole F. Christensen OleF.Christensen@agrsci.dk,
Paulo J. Ribeiro Jr. Paulo.Ribeiro@est.ufpr.br.

References

Further information about geoRglm can be found at: http://gbi.agrsci.dk/~ofch/geoRglm.
See Also

pois.krige for prediction with fixed parameters in the Poisson normal model, binom.krige.bayes for Bayesian prediction in the binomial-normal model, and krige.bayes for Bayesian prediction in the Gaussian spatial model.
Examples

data(p50)
if(!exists(".Random.seed", envir=.GlobalEnv, inherits = FALSE))
set.seed(1234)
## Not run:
## MCMC with fixed phi
prior.5 <- prior.glm.control(phi.prior = "fixed", phi = 0.1)
mcmc.5 <- mcmc.control(S.scale = 0.01, thin = 1)
test.5 <- pois.krige.bayes(p50, prior = prior.5, mcmc.input = mcmc.5)
par(mfrow=c(1,2))
hist(test.5)
## Now choose S.scale (Acc-rate=0.60 is preferable).
mcmc.5.new <- mcmc.control(S.scale = 0.08, thin = 100)
test.5.new <- pois.krige.bayes(p50,
locations = t(cbind(c(2.5,3.5),c(-6,3.5),c(2.5,-3.5),c(-6,-3.5))),
prior = prior.5, mcmc.input = mcmc.5.new,
output = list(threshold = 10, quantile = c(0.49999,0.99)))
image(test.5.new)
persp(test.5.new)
## MCMC with random phi.
## Note here that we can start with the S.scale from above.
mcmc.6 <- mcmc.control(S.scale = 0.08, n.iter = 2000, thin = 100,
phi.scale = 0.01)
prior.6 <- prior.glm.control(phi.discrete = seq(0.02, 1, 0.02))
test.6 <- pois.krige.bayes(p50, prior = prior.6, mcmc.input = mcmc.6)
## Acc-rate=0.60 , acc-rate-phi = 0.25-0.30 are preferable
mcmc.6.new <- mcmc.control(S.scale=0.08, n.iter = 400000, thin = 200,
burn.in = 5000, phi.scale = 0.12, phi.start = 0.5)
prior.6 <- prior.glm.control(phi.prior = "uniform",
phi.discrete = seq(0.02, 1, 0.02))
test.6.new <- pois.krige.bayes(p50,
locations = t(cbind(c(2.5,3.5), c(-60,-37))),
prior = prior.6, mcmc.input = mcmc.6.new)
par(mfrow=c(3,1))
hist(test.6.new)
## End(Not run)