Description Usage Arguments Details Value Author(s) References Examples
View source: R/gibbsPLMIX_with_norm.R
Perform Gibbs sampling simulation for a Bayesian mixture of Plackett-Luce models fitted to partial orderings. Unlike the gibbsPLMIX function, it includes a normalization step for the support parameters; this is expected to be equivalent to skipping the normalization step, in which case the support parameter space is unidentified up to a proportionality constant.
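As a sketch of the idea (illustrative code, not the package implementation): normalizing each component's support parameters to sum to one leaves the Plackett-Luce choice probabilities unchanged, since those probabilities depend on the support parameters only through ratios.

```r
set.seed(1)
G <- 2; K <- 4
P <- matrix(rexp(G * K), nrow = G, ncol = K)  # hypothetical positive support parameters
P_norm <- P / rowSums(P)                      # normalization step: each row now sums to 1
# First-choice probabilities are invariant to the rescaling:
stopifnot(all.equal(P / rowSums(P), P_norm / rowSums(P_norm)))
stopifnot(all.equal(rowSums(P_norm), rep(1, G)))
```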
gibbsPLMIX_with_norm(pi_inv, K, G, init, n_iter, n_burn, hyper, centered_start)
pi_inv: An object of class
K: Number of possible items.
G: Number of mixture components.
init: List of named objects with initialization values:
n_iter: Total number of MCMC iterations.
n_burn: Number of initial burn-in draws removed from the returned MCMC sample.
hyper: List of named objects with hyperparameter values for the conjugate prior specification:
centered_start: Logical: whether the support parameters and weights of the random start should be centered around the observed relative frequency with which each item is ranked top. Default is
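To illustrate the quantity behind the centered start (an illustrative computation, not the package code): the relative frequency with which each item is ranked top can be read off the first column of the orderings, assuming each row lists item labels from best to worst.

```r
# Toy partial orderings: each row lists item labels in rank order (assumption).
pi_inv <- matrix(c(1, 2, 3,
                   2, 1, 3,
                   1, 3, 2,
                   3, 1, 2), nrow = 4, byrow = TRUE)
K <- 3
freq_top <- tabulate(pi_inv[, 1], nbins = K) / nrow(pi_inv)
freq_top  # 0.50 0.25 0.25
```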
The size L of the final MCMC sample is equal to n_iter - n_burn.
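For example, with illustrative (assumed) settings:

```r
n_iter <- 1000   # total MCMC iterations (illustrative value)
n_burn <- 500    # burn-in draws discarded (illustrative value)
L <- n_iter - n_burn
L  # 500 retained draws
```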
A list of S3 class gsPLMIX with named elements:
- Numeric LxG matrix with the MCMC samples of the mixture weights.
- Numeric Lx(G*K) matrix with the MCMC samples of the component-specific support parameters.
- Numeric vector of the L posterior log-likelihood values.
- Numeric vector of the L posterior deviance values (-2 times the posterior log-likelihood).
- Numeric vector of the L objective function values, that is, the kernel of the log-posterior distribution.
- The matched call.
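The deviance entry is a deterministic transform of the log-likelihood entry; a minimal sketch with toy values:

```r
log_lik <- c(-120.3, -118.7, -119.5)  # toy posterior log-likelihood values
deviance <- -2 * log_lik              # posterior deviance, as described above
```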
Cristina Mollica and Luca Tardella
Mollica, C. and Tardella, L. (2017). Bayesian Plackett-Luce mixture models for partially ranked data. Psychometrika, 82(2), pages 442–458, ISSN: 0033-3123, DOI: 10.1007/s11336-016-9530-0.
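The original example code did not survive extraction; the following is a hedged sketch of a typical call, with illustrative data and settings (the dataset name and argument values are assumptions, not taken from the original examples).

```r
## Not run:
# library(PLMIX)                      # assumed package context
# data(d_apa)                         # illustrative partial-ordering dataset
# GIBBS <- gibbsPLMIX_with_norm(pi_inv = d_apa, K = 5, G = 3,
#                               n_iter = 1000, n_burn = 500)
# str(GIBBS)
## End(Not run)
```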