Description
This help page contains functions for calculating the Monte Carlo likelihoods and variances for generalised linear models (Binomial and Poisson) with latent CAR variables.
mcl.prep.glm simulates the Monte Carlo samples from the importance
sampling distribution to be used in the Monte Carlo likelihood
functions. The samples are drawn from an MCMC chain generated by
Metropolis-Hastings algorithms; see postZ for details.
mcl.glm calculates the Monte Carlo log-likelihood ratio of a given
parameter value relative to the importance sampler value.
mcl.profile.glm
calculates the Monte Carlo profile
log-likelihood ratio of rho and sigma.
vmle.glm
calculates the variance at the Monte Carlo MLE.
get.beta.glm finds the linear coefficients of the GLM for given CAR
parameters by solving the Monte Carlo gradient equations with nleqslv.
Usage

mcl.prep.glm(data, family, psi, Z.start = NULL, mcmc.control = list(),
             pilot.run = TRUE, pilot.plot = FALSE, plot.diag = FALSE)
mcl.glm(pars, mcdata, family, Evar = FALSE)
mcl.profile.glm(pars, beta0, mcdata, family, Evar = FALSE)
vmle.glm(MLE, mcdata, family)
get.beta.glm(beta0, rho.sig, mcdata, family)
Arguments

data: a list object given by CAR.simGLM.

family: a character string, either "binom" for Binomial variables or "poisson" for Poisson variables.

psi: the value of the importance sampler parameters.

Z.start: initial value for simulating the MCMC algorithm.

mcmc.control: a list of parameters that control the MCMC algorithm; see the Details of postZ.

pilot.run: if TRUE, a pilot run is done for tuning the MCMC parameters.

pilot.plot: if TRUE, the MCMC diagnostic plots are plotted for the pilot run.

plot.diag: if TRUE, the MCMC diagnostic plots are plotted for the simulated chain.

pars: the parameter value at which the likelihood function is evaluated.

mcdata: a list of data generated by mcl.prep.glm.

Evar: if TRUE, return the estimated variance of the Monte Carlo likelihood.

MLE: a list object containing the Monte Carlo MLE; as in the example below, it should include the components estimate and hessian.

beta0: starting point for finding the beta.

rho.sig: values of rho and sigma for the CAR covariance matrix.
Details

The Monte Carlo samples can be simulated by different MCMC algorithms
described in postZ. The log determinant is calculated directly in the
evaluation of the Monte Carlo likelihood, and the Monte Carlo
approximation is only used to approximate the integral over the latent
variables.
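To make that concrete, here is a minimal, self-contained sketch (not package code): the log-likelihood ratio between a parameter value and the importance sampler value is approximated by the log of the average importance weight over the simulated latent samples. The functions h.theta and h.psi below are hypothetical stand-ins for the joint log-densities used internally.

## Minimal sketch of the Monte Carlo likelihood-ratio approximation
set.seed(1)
Zy <- matrix(rnorm(100 * 50), nrow = 100, ncol = 50)          # 50 simulated latent samples
h.theta <- function(z) sum(dnorm(z, mean = 0.1, log = TRUE))  # hypothetical joint log-density at theta
h.psi   <- function(z) sum(dnorm(z, mean = 0,   log = TRUE))  # hypothetical joint log-density at psi
log.w <- apply(Zy, 2, h.theta) - apply(Zy, 2, h.psi)          # log importance weights
log(mean(exp(log.w)))   # Monte Carlo estimate of the log-likelihood ratio for the latent integral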
Given the CAR covariance matrix, the betas are found by solving the
non-linear system obtained by setting the gradient of the Monte Carlo
likelihood with respect to beta equal to zero. Because of the Monte
Carlo error, nleqslv is allowed a maximum of only two iterations to
find a local solution.
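The following toy example (an illustration with an assumed objective, not the package's internal code) shows the same pattern: solving a score-type non-linear system with nleqslv while capping the number of iterations at two.

## Toy illustration: solve a non-linear "score" system with nleqslv,
## capping the number of iterations at two as described above.
library(nleqslv)
score <- function(beta) {
  ## gradient of a simple quadratic objective with root at c(1, 2)
  c(2 * (beta[1] - 1), 2 * (beta[2] - 2))
}
nleqslv(x = c(0, 0), fn = score, control = list(maxit = 2))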
Value

mcl.prep.glm returns a list object that contains all the output of
CAR.simGLM plus the following elements: Zy, a matrix of the Monte
Carlo samples; lHZy.psi, a list of statistics pre-calculated for the
importance distribution; and mcmc.pars, a list of the MCMC tuning
parameters used in the simulation.
mcl.glm and mcl.profile.glm return a numeric value of the Monte Carlo
likelihood when Evar = FALSE; if Evar = TRUE, they return an array of
the Monte Carlo likelihood and the corresponding estimated variance.
vmle.glm
returns the estimated covariance matrix of the Monte
Carlo MLE.
get.beta.glm returns an object as returned by nleqslv.
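As a small usage note (a sketch assuming the objects created in the Examples below), the components described above can be inspected directly; the nleqslv object returned by get.beta.glm stores the solution vector in its x component.

## Assuming mc.data2 was created as in the Examples below:
dim(mc.data2$Zy)           # the matrix of Monte Carlo samples
names(mc.data2$mcmc.pars)  # the MCMC tuning parameters used
## For get.beta.glm, the nleqslv result stores the solution for beta in component $x.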
Author(s)

Zhe Sha <zhesha1006@gmail.com>
See Also

postZ, mcl.dCAR, OptimMCL, rsmMCL
Examples

## Simulate some data to work with
set.seed(33)
n.torus <- 10
nb <- 30
rho <- 0.2
sigma <- 1.5
beta <- c(1, 1)
pars.true <- c(rho, sigma, beta)
X0 <- cbind(rep(1, n.torus^2), sample(log(1:n.torus^2)/5))
mydata2 <- CAR.simGLM(method = "binom", n = c(n.torus, n.torus), pars = pars.true,
Xs = as.matrix(X0), n.trial = nb)
## Prepare the Monte Carlo samples
## Find a suitable initial value for the importance sampler parameter, e.g:
library(spatialreg)
library(spdep)
data.glm <- data.frame(y=mydata2$y, mydata2$covX[,-1])
fit.glm <- glm(cbind(y, nb - mydata2$y) ~ ., data = data.glm, family = binomial)
logitp <- log((mydata2$y+0.5)/(mydata2$n.trial - mydata2$y + 0.5))
data.splm <- data.frame(y=logitp, mydata2$covX[,-1])
listW <- mat2listw(mydata2$W)
fit.splm <- spautolm(y~., data = data.splm, listw=listW, family = "CAR")
psi.binom <- c(fit.splm$lambda, fit.splm$fit$s2, coef(fit.glm))
## Set the parameters for the MCMC algorithm
mc.pars <- list(N.YZ = 1e3, N.Zy = 1e3, Scale = 1.65/(n.torus^(2/6)), thin = 5,
                burns = 5e2, method = "mala", scale.fixed = TRUE)
mc.data2 <- mcl.prep.glm(data = mydata2, family = "binom",
psi = psi.binom, mcmc.control = mc.pars,
pilot.plot = TRUE, plot.diag = TRUE)
## Calculate the Monte Carlo likelihoods
pars.t <- c(rho, sigma, beta)
mcl.glm(pars.t, family = "binom", mcdata = mc.data2, Evar = TRUE)
## Do a direct optimization of the Monte Carlo likelihood function to find the MLE
## Not run:
library(maxLik)
A <- matrix(0, nrow = 3, ncol = length(pars.t))
A[,1] <- c(1, -1, 0)
A[,2] <- c(0, 0, 1)
B <- c(0.24, 0.24, -0.1)
mle.bin <- maxBFGS(fn = mcl.glm, start=as.numeric(psi.binom), print.level=1,
constraints=list(ineqA=A, ineqB=B),
mcdata = mc.data2, family = "binom")
## End(Not run)
## Assume we have obtained mle.bin
mle.bin <- list(estimate = c(0.2271854 , 1.6117040, -0.3870856, 2.6525503),
hessian = matrix(c( -1981.34398, -71.992190, -79.910745, -63.451022,
-71.99219, -13.985701, 2.965628, 2.216893,
-79.91074, 2.965628, -251.320742, -176.993087,
-63.45102, 2.216893, -176.993087, -132.575284), 4,4))
## Calculate the variance of the MC-MLE
v.mle.bin <- vmle.glm(mle.bin, mcdata = mc.data2, family = "binom")
## Find the Monte Carlo MLE of beta, given the value of rho and sigma
get.beta.glm(beta0 = psi.binom[-c(1,2)], rho.sig = psi.binom[1:2],
mcdata = mc.data2, family = c("binom"))
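## A hedged sketch (not from the original examples): evaluating the Monte
## Carlo profile log-likelihood of rho and sigma with mcl.profile.glm,
## assuming pars takes c(rho, sigma) and beta0 the regression coefficients.
## Not run:
mcl.profile.glm(pars = psi.binom[1:2], beta0 = psi.binom[-c(1, 2)],
                mcdata = mc.data2, family = "binom", Evar = TRUE)
## End(Not run)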