MCMCoprobit  R Documentation 
This function generates a sample from the posterior distribution of an ordered probit regression model using the data augmentation approach of Albert and Chib (1993), with cutpoints sampled according to Cowles (1996) or Albert and Chib (2001). The user supplies data and priors, and a sample from the posterior distribution is returned as an mcmc object, which can be subsequently analyzed with functions provided in the coda package.
MCMCoprobit(
  formula,
  data = parent.frame(),
  burnin = 1000,
  mcmc = 10000,
  thin = 1,
  tune = NA,
  tdf = 1,
  verbose = 0,
  seed = NA,
  beta.start = NA,
  b0 = 0,
  B0 = 0,
  a0 = 0,
  A0 = 0,
  mcmc.method = c("Cowles", "AC"),
  ...
)
formula 
Model formula. 
data 
Data frame. 
burnin 
The number of burn-in iterations for the sampler. 
mcmc 
The number of MCMC iterations for the sampler. 
thin 
The thinning interval used in the simulation. The number of Gibbs iterations must be divisible by this value. 
tune 
The tuning parameter for the Metropolis-Hastings step. Default of NA corresponds to a choice of 0.05 divided by the number of categories in the response variable. 
tdf 
Degrees of freedom for the multivariate-t proposal distribution, used when mcmc.method is set to "AC". 
verbose 
A switch which determines whether or not the progress of the sampler is printed to the screen. If verbose is greater than 0, output is printed to the screen every verbose-th iteration. 
seed 
The seed for the random number generator. If NA, the Mersenne
Twister generator is used with default seed 12345; if an integer is passed
it is used to seed the Mersenne twister. The user can also pass a list of
length two to use the L'Ecuyer random number generator, which is suitable
for parallel computation. The first element of the list is the L'Ecuyer
seed, which is a vector of length six or NA (if NA a default seed of
rep(12345, 6) is used). The second element of the list is a positive substream number.
beta.start 
The starting value for the β vector. This can either be a scalar or a column vector with dimension equal to the number of betas. If this takes a scalar value, then that value will serve as the starting value for all of the betas. The default value of NA will use rescaled estimates from an ordered logit model. 
b0 
The prior mean of β. This can either be a scalar or a column vector with dimension equal to the number of betas. If this takes a scalar value, then that value will serve as the prior mean for all of the betas. 
B0 
The prior precision of β. This can either be a scalar or a square matrix with dimensions equal to the number of betas. If this takes a scalar value, then that value times an identity matrix serves as the prior precision of β. Default value of 0 is equivalent to an improper uniform prior on β. 
a0 
The prior mean of γ. This can either be a scalar or a column vector with dimension equal to the number of cutpoints. If this takes a scalar value, then that value will serve as the prior mean for all of the cutpoints. 
A0 
The prior precision of γ. This can either be a scalar or a square matrix with dimensions equal to the number of cutpoints. If this takes a scalar value, then that value times an identity matrix serves as the prior precision of γ. Default value of 0 is equivalent to an improper uniform prior on γ. 
mcmc.method 
Can be set to either "Cowles" (default) or "AC" to perform posterior sampling of cutpoints based on Cowles (1996) or Albert and Chib (2001) respectively. 
... 
Further arguments to be passed. 
MCMCoprobit
simulates from the posterior distribution of an ordered
probit regression model using data augmentation. The simulation proper is
done in compiled C++ code to maximize efficiency. Please consult the coda
documentation for a comprehensive list of functions that can be used to
analyze the posterior sample.
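As an illustration of the data augmentation idea (a plain-R sketch for the simpler binary probit case under a flat prior on β, not the MCMCpack internals, which run in compiled C++ and handle the ordered case with an additional cutpoint update):

```r
## Gibbs sampler via Albert-Chib (1993) data augmentation, binary probit,
## flat prior on beta. Illustration only.
set.seed(1)
n <- 200
X <- cbind(1, rnorm(n))                 # design matrix with intercept
beta_true <- c(-0.5, 1)
y <- as.integer(X %*% beta_true + rnorm(n) > 0)

rtruncnorm1 <- function(mu, lower, upper) {
  ## inverse-CDF draw from N(mu, 1) truncated to (lower, upper)
  u <- runif(length(mu), pnorm(lower - mu), pnorm(upper - mu))
  mu + qnorm(u)
}

XtXinv <- solve(crossprod(X))
beta <- c(0, 0)
for (iter in 1:500) {
  mu <- drop(X %*% beta)
  ## augmentation step: draw latent z consistent with the observed y
  z <- ifelse(y == 1,
              rtruncnorm1(mu, 0, Inf),    # z > 0 when y = 1
              rtruncnorm1(mu, -Inf, 0))   # z <= 0 when y = 0
  ## conditional draw of beta given z: N(bhat, (X'X)^-1)
  bhat <- drop(XtXinv %*% crossprod(X, z))
  beta <- bhat + drop(t(chol(XtXinv)) %*% rnorm(2))
}
```

After the loop, `beta` is a draw from (approximately) the posterior and should sit near `beta_true`.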
The observed variable y_i is ordinal with a total of C categories, with distribution governed by a latent variable:
z_i = x_i'β + ε_i
The errors are assumed to be from a standard Normal distribution. The probability of observing each outcome is governed by this latent variable and C - 1 estimable cutpoints, which are denoted γ_c. The probability that individual i is in category c is computed by: 
π_{ic} = Φ(γ_c - x_i'β) - Φ(γ_{c-1} - x_i'β)
These probabilities are used to form the multinomial distribution that defines the likelihoods.
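As a toy check of the formula above (the values are hypothetical, not MCMCpack internals), the category probabilities for one observation can be computed with pnorm, padding the cutpoints with γ_0 = -Inf and γ_C = +Inf so the boundary categories are covered:

```r
## C = 3 categories, cutpoints gamma_1 = 0 and gamma_2 = 1, padded at the ends
gamma <- c(-Inf, 0, 1, Inf)
xb <- 0.4                                # linear predictor x_i' beta
pi_ic <- pnorm(gamma[-1] - xb) - pnorm(gamma[-length(gamma)] - xb)
sum(pi_ic)                               # the C probabilities sum to 1
```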
MCMCoprobit
provides two ways to sample the cutpoints. Cowles (1996)
proposes a sampling scheme that groups sampling of a latent variable with
cutpoints. In this case, for identification the first element
γ_1 is normalized to zero. Albert and Chib (2001) show
that we can sample cutpoints indirectly without constraints by transforming
cutpoints into real-valued parameters (α).
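One common version of this transformation (a sketch with toy values, not necessarily the exact parameterization used internally) maps the ordered cutpoints to unconstrained real values via log-differences; the inverse map restores the ordering automatically:

```r
gamma <- c(0, 0.8, 1.7)                # ordered cutpoints, gamma_1 fixed at 0
alpha <- log(diff(gamma))              # real-valued, no ordering constraint
gamma_back <- c(0, cumsum(exp(alpha))) # inverse map recovers the cutpoints
```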
An mcmc object that contains the posterior sample. This object can be summarized by functions provided by the coda package.
Albert, J. H. and S. Chib. 1993. "Bayesian Analysis of Binary and Polychotomous Response Data." Journal of the American Statistical Association. 88: 669-679.
M. K. Cowles. 1996. "Accelerating Monte Carlo Markov Chain Convergence for Cumulative-link Generalized Linear Models." Statistics and Computing. 6: 101-110.
Andrew D. Martin, Kevin M. Quinn, and Jong Hee Park. 2011. "MCMCpack: Markov Chain Monte Carlo in R." Journal of Statistical Software. 42(9): 1-21. doi: 10.18637/jss.v042.i09.
Valen E. Johnson and James H. Albert. 1999. Ordinal Data Modeling. Springer: New York.
Albert, James and Siddhartha Chib. 2001. "Sequential Ordinal Modeling with Applications to Survival Data." Biometrics. 57: 829-836.
Daniel Pemstein, Kevin M. Quinn, and Andrew D. Martin. 2007. Scythe Statistical Library 1.0. http://scythe.lsa.umich.edu.
Martyn Plummer, Nicky Best, Kate Cowles, and Karen Vines. 2006. "Output Analysis and Diagnostics for MCMC (CODA)." R News. 6(1): 7-11. https://CRAN.R-project.org/doc/Rnews/Rnews_2006-1.pdf.
plot.mcmc, summary.mcmc
## Not run:
x1 <- rnorm(100)
x2 <- rnorm(100)
z <- 1.0 + x1 * 0.1 - x2 * 0.5 + rnorm(100)
y <- z
y[z < 0] <- 0
y[z >= 0 & z < 1] <- 1
y[z >= 1 & z < 1.5] <- 2
y[z >= 1.5] <- 3

out1 <- MCMCoprobit(y ~ x1 + x2, tune = 0.3)
out2 <- MCMCoprobit(y ~ x1 + x2, tune = 0.3, tdf = 3, verbose = 1000,
                    mcmc.method = "AC")
summary(out1)
summary(out2)
plot(out1)
plot(out2)

## End(Not run)