Gibbs sampler for linear models and the Cox proportional hazards model under product non-local priors and Zellner's prior. Both sampling conditional on a single model and Bayesian model averaging are implemented (see Details).

If `x` and `y` are not specified, samples from non-local priors/posteriors with density proportional to d(theta) N(theta; m, V) are produced, where d(theta) is the non-local penalty term.


`y`: Vector with observed responses.

`x`: Design matrix with all potential predictors.

`m`: Mean for the Normal kernel.

`V`: Covariance for the Normal kernel.

`msfit`: Object of class `msfit`, e.g. as returned by `modelSelection`.

`priorCoef`: Prior distribution for the coefficients. Must be an object of class `msPriorSpec`.

`priorVar`: Prior on the residual variance. Must be an object of class `msPriorSpec`.

`niter`: Number of MCMC iterations.

`burnin`: Number of burn-in MCMC iterations.

`thinning`: MCMC thinning factor, i.e. only 1 out of each `thinning` iterations is kept.

`pp`: When …

The algorithm is implemented for product MOM (pMOM), product iMOM (piMOM) and product eMOM (peMOM) priors. The algorithm combines an orthogonalization that provides low serial correlation with a latent truncation representation that allows fast sampling.
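The three supported penalties are specified through prior constructor functions; a minimal sketch (assuming the `mombf` constructors `momprior`, `imomprior` and `emomprior`; the `tau` values are purely illustrative, not recommendations):

```r
library(mombf)  # assumed to provide the non-local prior constructors

# Product non-local priors supported by the sampler
pmom  <- momprior(tau = 0.348)   # product MOM (pMOM)
pimom <- imomprior(tau = 0.133)  # product iMOM (piMOM)
pemom <- emomprior(tau = 0.348)  # product eMOM (peMOM)
```

Any of these objects can then be passed as the `priorCoef` argument.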

When `y` and `x` are specified, sampling is from the linear regression posterior. When argument `msfit` is left missing, posterior sampling is for the full model regressing `y` on all covariates in `x`. When `msfit` is specified, each model is drawn with probability given by `postProb(msfit)`. In this case, a Bayesian model averaging estimate of the regression coefficients can be obtained by applying `colMeans` to the `rnlp` output matrix.
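The model-averaging path can be sketched as follows (a hedged example: the data, `tau` and iteration counts are arbitrary, and the call signatures assume the `mombf` functions `modelSelection` and `rnlp` described here):

```r
library(mombf)
set.seed(1)
x <- matrix(rnorm(100 * 3), ncol = 3)
y <- x[, 1] + rnorm(100)
# Model selection yields posterior model probabilities, postProb(fit)
fit <- modelSelection(y = y, x = x, priorCoef = momprior(tau = 0.348))
# rnlp draws coefficients under each model with probability postProb(fit)
th <- rnlp(msfit = fit, priorCoef = momprior(tau = 0.348), niter = 1000)
colMeans(th)  # Bayesian model averaging estimate of the coefficients
```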

When `y` and `x` are left missing, sampling is from a density proportional to d(theta) N(theta; m, V), where d(theta) is the non-local penalty (e.g. d(theta) = prod(theta^(2r)) for the pMOM prior).
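For instance, a draw from the pMOM penalty times a bivariate Normal kernel could be sketched as follows (the values of `m`, `V` and `tau` are arbitrary):

```r
library(mombf)
# Sample from a density proportional to d(theta) N(theta; m, V),
# with d(theta) the pMOM penalty
th <- rnlp(m = c(0, 0), V = diag(2),
           priorCoef = momprior(tau = 0.348), niter = 2000)
colMeans(th)
```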

The function returns a matrix with posterior samples.

David Rossell

D. Rossell and D. Telesca. Non-local priors for high-dimensional estimation, 2014. http://arxiv.org/pdf/1402.5107v2.pdf

`modelSelection` to perform model selection and compute posterior model probabilities. For more details on prior specification see `msPriorSpec-class`.

```r
#Load required packages
library(mombf)    # provides imomprior, igprior, rnlp
library(mvtnorm)  # provides rmvnorm
#Generate data
set.seed(2)
n <- 10^3; tau <- 0.133; x <- rmvnorm(n,sigma=matrix(c(2,1,1,2),nrow=2))
thtrue <- c(0.5,1); phitrue <- 1
y <- thtrue[1]*x[,1] + thtrue[2]*x[,2] + rnorm(n,sd=sqrt(phitrue))
#Specify prior parameters
priorCoef <- imomprior(tau=1)
priorVar <- igprior(alpha=.01,lambda=.01)
th <- rnlp(y=y, x=x, niter=100, priorCoef=priorCoef, priorVar=priorVar)
colMeans(th)
acf(th[,1])[1]
```
