
This function implements Bayesian lasso quantile regression using a likelihood function based on the asymmetric Laplace distribution (Alhamzawi, 2016). The asymmetric Laplace error distribution is written as a scale mixture of normal distributions, as in Reed and Yu (2009). The function fits Bayesian lasso linear quantile regression models by assigning scale mixture of normal (SMN) priors to the parameters and independent exponential priors to their variances, which gives an alternative Bayesian analysis of the lasso problem reported in Li et al. (2010). A Gibbs sampling algorithm for Bayesian lasso quantile regression is constructed by sampling each parameter from its full conditional distribution.

```
BLqr(x, y, tau = 0.5, runs = 11000, burn = 1000, thin = 1)
```

| Argument | Description |
| --- | --- |
| `x` | Matrix of predictors. |
| `y` | Vector of the dependent variable. |
| `tau` | The quantile of interest; must be between 0 and 1. |
| `runs` | Length of the desired Gibbs sampler output. |
| `burn` | Number of Gibbs sampler iterations discarded before output is saved. |
| `thin` | Thinning parameter for the MCMC draws. |
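A minimal usage sketch (illustrative only, not from the original page): it assumes the `Brq` package, which exports `BLqr()`, is installed, and fits the model at the median on simulated data with a sparse true coefficient vector.

```r
# Assumption: the Brq package (CRAN) provides BLqr().
library(Brq)

set.seed(1)
n <- 100; p <- 8
x <- matrix(rnorm(n * p), n, p)
beta <- c(3, 1.5, 0, 0, 2, 0, 0, 0)  # sparse true coefficients
y <- x %*% beta + rnorm(n)

# Median regression (tau = 0.5) with the default MCMC settings.
fit <- BLqr(x, y, tau = 0.5, runs = 11000, burn = 1000, thin = 1)
str(fit)  # inspect the returned posterior draws
```

The exact structure of the returned object depends on the package version; `str(fit)` is a safe way to inspect the posterior output before summarizing it.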

Rahim Alhamzawi

[1] Alhamzawi, R. (2016). Bayesian variable selection in quantile regression using asymmetric Laplace distribution. Working paper.

[2] Reed, C. and Yu, K. (2009). A partially collapsed Gibbs sampler for Bayesian quantile regression. *Technical Report*, Department of Mathematical Sciences, Brunel University. URL: http://bura.brunel.ac.uk/bitstream/2438/3593/1/fulltext.pdf.

[3] Li, Q., Xi, R. and Lin, N. (2010). Bayesian regularized quantile regression. *Bayesian Analysis*, 5(3): 533-556.

