jags.parfit: Parallel computing with JAGS

View source: R/jags.parfit.R

jags.parfit {dclone} R Documentation

Parallel computing with JAGS

Description

Does the same job as jags.fit, but the parallel chains are run on separate workers, so computation time for long MCMC runs can be reduced (down to roughly 1/n.chains of the sequential run time).

Usage

jags.parfit(cl, data, params, model, inits = NULL, n.chains = 3, ...)

Arguments

cl

A cluster object created by makeCluster, or an integer giving the number of workers for forking type parallelism (see the Examples); see also parDosa and evalParallelArgument.

data

A named list or environment containing the data. If an environment, data is coerced into a list.

params

Character vector of parameters to be sampled.

model

Character string (name of the model file), a function containing the model, or a custommodel object (see Examples).

inits

Specification of initial values in the form of a list or a function; it can be missing. Initial values can include RNG seed information, see the Initialization section at jags.model. If this is a function and cl is a 'snow' type cluster, the function must be self-contained, i.e. it must not reference R objects outside of the function; otherwise those objects have to be exported to the workers with clusterExport before calling jags.parfit (a short sketch follows the Arguments list). Forking type parallelism does not require such attention.

n.chains

Number of chains to generate, must be higher than 1. Ideally, this is equal to the number of parallel workers in the cluster.

...

Other arguments passed to jags.fit.
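
A minimal sketch of the two inits cases described above, assuming a 'snow' cluster cl has already been created (the object names here are hypothetical, not part of the package):

## self-contained inits function: nothing to export
inits1 <- function() list(beta = matrix(rnorm(2), 1, 2))
## inits function referencing an outside object: export that object first
sd_init <- 0.1
inits2 <- function() list(beta = matrix(rnorm(2, sd = sd_init), 1, 2))
clusterExport(cl, "sd_init")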

Details

Chains are run on parallel workers, and the results are combined at the end.

No update method is available for parallel mcmc.list objects. See parUpdate and the related parallel functions (parJagsModel, parCodaSamples) for that purpose.
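
A minimal sketch of that stepwise workflow (argument names follow the parJagsModel, parUpdate and parCodaSamples help pages; dat and glm.model are placeholders as in the Examples):

## initialize the model under the name "fit" on each worker
parJagsModel(cl, name = "fit", file = glm.model, data = dat, n.chains = 3)
## burn-in updates on the workers
parUpdate(cl, "fit", n.iter = 1000)
## draw posterior samples and combine them into an mcmc.list
smp <- parCodaSamples(cl, "fit", variable.names = "beta", n.iter = 5000)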

Additionally loaded JAGS modules (e.g. "glm", "lecuyer") need to be loaded on the workers as well when a 'snow' type cluster is used as the cl argument. See Examples.

The use of the "lecuyer" module is recommended when running more than 4 chains. See Examples and parallel.inits.

Value

An mcmc.list object.

Author(s)

Peter Solymos

See Also

Sequential version: jags.fit

Functions for stepwise modeling with JAGS: parJagsModel, parUpdate, parCodaSamples

Examples

## Not run: 
if (require(rjags)) {
set.seed(1234)
n <- 20
x <- runif(n, -1, 1)
X <- model.matrix(~x)
beta <- c(2, -1)
mu <- crossprod(t(X), beta)
Y <- rpois(n, exp(mu))
glm.model <- function() {
    for (i in 1:n) {
        Y[i] ~ dpois(lambda[i])
        log(lambda[i]) <- inprod(X[i,], beta[1,])
    }
    for (j in 1:np) {
        beta[1,j] ~ dnorm(0, 0.001)
    }
}
dat <- list(Y=Y, X=X, n=n, np=ncol(X))
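## load the glm module for the sequential fit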
load.module("glm")
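## sequential fit for comparison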
m <- jags.fit(dat, "beta", glm.model)
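## set up a 'snow' (PSOCK) cluster with 3 workers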
cl <- makePSOCKcluster(3)
## load dclone and the glm module on the workers
tmp <- clusterEvalQ(cl, library(dclone))
parLoadModule(cl, "glm")
pm <- jags.parfit(cl, dat, "beta", glm.model)
## chains are not identical -- this is good
pm[1:2,]
summary(pm)
## examples on how to use initial values
## fixed initial values
inits <- list(list(beta=matrix(c(0,1),1,2)),
    list(beta=matrix(c(1,0),1,2)),
    list(beta=matrix(c(0,0),1,2)))
pm2 <- jags.parfit(cl, dat, "beta", glm.model, inits)
## random numbers generated prior to jags.parfit
inits <- list(list(beta=matrix(rnorm(2),1,2)),
    list(beta=matrix(rnorm(2),1,2)),
    list(beta=matrix(rnorm(2),1,2)))
pm3 <- jags.parfit(cl, dat, "beta", glm.model, inits)
## self contained function
inits <- function() list(beta=matrix(rnorm(2),1,2))
pm4 <- jags.parfit(cl, dat, "beta", glm.model, inits)
## function pointing to the global environment
fun <- function() list(beta=matrix(rnorm(2),1,2))
inits <- function() fun()
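## fun must be exported so the workers can evaluate inits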
clusterExport(cl, "fun")
## using the L'Ecuyer module with 6 chains
load.module("lecuyer")
parLoadModule(cl,"lecuyer")
pm5 <- jags.parfit(cl, dat, "beta", glm.model, inits,
    n.chains=6)
nchain(pm5)
unload.module("lecuyer")
parUnloadModule(cl,"lecuyer")
stopCluster(cl)
## multicore type forking
if (.Platform$OS.type != "windows") {
pm6 <- jags.parfit(3, dat, "beta", glm.model)
}
}

## End(Not run)
