groupBridge: Bayesian Group Bridge Regression


View source: R/groupBridge.R

Description

The Group Bayesian Bridge model of Mallick & Yi (2018). Bridge regression lets you choose different Lp norms for the shape of the prior through the shape parameter kappa of the power exponential distribution (also known as the generalized Gaussian). Norms of 1 and 2 give the Laplace and Gaussian distributions, respectively (corresponding to the LASSO and ridge regression). Norms smaller than 1 are very difficult to estimate directly, but have very tall modes at zero and very long, Cauchy-like tails. Values greater than 2 become increasingly platykurtic, with the uniform distribution arising as kappa approaches infinity.
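For reference, the power exponential (generalized Gaussian) kernel underlying the prior has the standard form

p(\beta \mid \lambda, \kappa) \propto \exp\left(-\lambda \, |\beta|^{\kappa}\right),

so kappa = 1 recovers the Laplace (double exponential) kernel and kappa = 2 the Gaussian kernel.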

Using kappa = 1 yields a Bayesian Group LASSO. The key difference from the standard Bayesian Group LASSO is that each group here receives its own lambda, faithful to the Mallick & Yi (2018) model. Hence, using this Bayesian Group Bridge with kappa = 1 can provide a more adaptive Bayesian Group LASSO.

JAGS has no built-in power exponential distribution, so the prior is parameterized as a uniform-gamma mixture, just as in Mallick & Yi (2018). The parameterization is given below. For generalized linear models, plug-in pseudo-variances are used.

Model Specification:
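A rough sketch of the uniform-gamma (scale mixture of uniforms) representation, following Mallick & Yi (2018); here g(j) denotes the group of coefficient j, and the Gamma(a, b) hyperprior on each \lambda_g is an assumed placeholder rather than the exact prior used by groupBridge:

\beta_j \mid u_j, \lambda_{g(j)} \sim \mathrm{Uniform}\left( -\left(u_j / \lambda_{g(j)}\right)^{1/\kappa},\ \left(u_j / \lambda_{g(j)}\right)^{1/\kappa} \right)

u_j \sim \mathrm{Gamma}\left(1 + 1/\kappa,\ 1\right)

\lambda_g \sim \mathrm{Gamma}(a,\ b)

Integrating out u_j yields a marginal prior proportional to \exp(-\lambda_{g(j)} |\beta_j|^{\kappa}), i.e. a bridge (power exponential) prior with a group-specific rate.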


Plugin Pseudo-Variances:
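A commonly used plug-in choice for GLM pseudo-variances (stated here as an illustrative assumption, in the spirit of Piironen & Vehtari's pseudo-variance approximations; the exact values computed by groupBridge may differ) sets, with \hat{\mu} = \bar{y}:

\hat{\sigma}^2 = \hat{\mu}^{-1}(1 - \hat{\mu})^{-1} \quad \text{(binomial)}, \qquad \hat{\sigma}^2 = \hat{\mu}^{-1} \quad \text{(poisson)}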

Usage

groupBridge(X, y, idx, family = "gaussian", kappa = 1.4,
  log_lik = FALSE, iter = 10000, warmup = 1000, adapt = 2000,
  chains = 4, thin = 1, method = "parallel", cl = makeCluster(2),
  ...)

Arguments

X

the model matrix. Construct this manually with model.matrix()[,-1]

y

the outcome variable

idx

the group labels. Should be a vector of length equal to ncol(X) (i.e., ncol(model.matrix()[,-1])) giving the group assignment of each covariate. Please ensure that group numbering starts at 1, not 0.

family

one of "gaussian", "binomial", or "poisson".

kappa

the Lp norm to use. Defaults to 1.4.

log_lik

Should the log likelihood be monitored? The default is FALSE.

iter

How many post-warmup samples? Defaults to 10000.

warmup

How many warmup samples? Defaults to 1000.

adapt

How many adaptation steps? Defaults to 2000.

chains

How many chains? Defaults to 4.

thin

Thinning interval. Defaults to 1.

method

Defaults to "parallel". For an alternative parallel option, choose "rjparallel" or. Otherwise, "rjags" (single core run).

cl

Use parallel::makeCluster(# clusters) to specify clusters for the parallel methods. Defaults to two cores.

...

Other arguments to run.jags.

Value

a runjags object

References


Kyung, M., Gill, J., Ghosh, M., & Casella, G. (2010). Penalized regression, standard errors, and Bayesian lassos. Bayesian Analysis, 5(2), 369–411.

Mallick, H., & Yi, N. (2018). Bayesian bridge regression. Journal of Applied Statistics, 45(6), 988–1008. doi:10.1080/02664763.2017.1324565

Mallick, H., & Yi, N. (2014). A New Bayesian Lasso. Statistics and Its Interface, 7(4), 571–582. doi:10.4310/SII.2014.v7.n4.a12

Examples

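A minimal sketch of a call on simulated data (the data, group structure, and sampler settings below are illustrative assumptions, not part of the package documentation):

library(Bayezilla)

set.seed(1)
X <- matrix(rnorm(100 * 6), nrow = 100, ncol = 6)
y <- X[, 1] - 0.5 * X[, 2] + rnorm(100)
idx <- c(1, 1, 2, 2, 3, 3)  # group assignment for each column of X, numbered from 1

fit <- groupBridge(X, y, idx, family = "gaussian", kappa = 1,
                   iter = 4000, warmup = 1000, chains = 2,
                   method = "parallel", cl = parallel::makeCluster(2))
summary(fit)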
