These functions provide the density and random number generation for the Wishart distribution with the Cholesky parameterization.
dwishartc(U, nu, S, log=FALSE)
rwishartc(nu, S)
U: This is the upper-triangular k x k matrix for the Cholesky factor U of precision matrix Omega.
nu: This is the scalar degrees of freedom nu.
S: This is the symmetric, positive-semidefinite, k x k scale matrix S.
log: Logical. If log=TRUE, then the logarithm of the density is returned.
Application: Continuous Multivariate
Density: p(Omega) = (2^(nu*k/2) * pi^(k(k-1)/4) * prod[i=1..k] Gamma((nu+1-i)/2))^(-1) * |S|^(-nu/2) * |Omega|^((nu-k-1)/2) * exp(-(1/2) * tr(S^(-1) Omega))
Inventor: John Wishart (1928)
Notation 1: Omega ~ W[nu](S)
Notation 2: p(Omega) = W[nu](Omega | S)
Parameter 1: degrees of freedom nu >= k
Parameter 2: symmetric, positive-semidefinite k x k scale matrix S
Mean: E(Omega) = nu*S
Variance: var(Omega[i,j]) = nu*(S[i,j]^2 + S[i,i]*S[j,j])
Mode: mode(Omega) = (nu-k-1)*S, for nu >= k + 1
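The density above can be checked numerically. The following Python sketch (illustrative only, not part of the package) evaluates the log-density term by term from the formula; scipy.stats.wishart uses the same parameterization, with the multivariate gamma function collecting the Gamma product and the pi term.

```python
import numpy as np
from scipy.special import multigammaln
from scipy.stats import wishart

def dwishart_log(Omega, nu, S):
    """Wishart log-density of precision matrix Omega with nu degrees of
    freedom and scale matrix S, computed directly from the formula."""
    k = S.shape[0]
    _, logdet_S = np.linalg.slogdet(S)
    _, logdet_Omega = np.linalg.slogdet(Omega)
    # log normalizing constant: 2^(nu*k/2) * Gamma_k(nu/2) * |S|^(nu/2),
    # where Gamma_k is the multivariate gamma function
    log_norm = (nu * k / 2) * np.log(2) + multigammaln(nu / 2, k) \
        + (nu / 2) * logdet_S
    return ((nu - k - 1) / 2) * logdet_Omega \
        - 0.5 * np.trace(np.linalg.solve(S, Omega)) - log_norm

S = np.array([[1.0, 0.1], [0.1, 1.0]])
Omega = np.array([[2.0, -0.3], [-0.3, 4.0]])
print(dwishart_log(Omega, 3, S))             # manual evaluation
print(wishart(df=3, scale=S).logpdf(Omega))  # scipy agrees
```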
The Wishart distribution is a generalization to multiple dimensions of the chi-square distribution, or, in the case of non-integer degrees of freedom, of the gamma distribution. However, the Wishart distribution is not called the multivariate chi-squared distribution because the marginal distribution of the off-diagonal elements is not chi-squared.
The Wishart is the conjugate prior distribution for the precision matrix Omega, the inverse of which (covariance matrix Sigma) is used in a multivariate normal distribution. In this parameterization, Omega has been decomposed to the upper-triangular Cholesky factor U, as per chol.
The integral is finite when nu >= k, where nu is the scalar degrees of freedom parameter and k is the dimension of scale matrix S. The density is finite when nu >= k + 1, which is recommended.
The degrees of freedom, nu, is equivalent to specifying a prior sample size, indicating the confidence in S, where S is a prior guess at the order of covariance matrix Sigma. A flat prior distribution is obtained as nu -> 0.
In practice, U is fully unconstrained for proposals when its diagonal is log-transformed. The diagonal is exponentiated after a proposal and before other calculations. The Cholesky parameterization is faster than the traditional parameterization.
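The log-diagonal transformation described above can be sketched as follows. This is an illustrative Python fragment, not part of the package, and the helper name is hypothetical: an unconstrained real vector is mapped to an upper-triangular factor whose diagonal is exponentiated, so any proposal yields a valid Cholesky factor.

```python
import numpy as np

def unconstrained_to_chol(v, k):
    """Map an unconstrained real vector of length k*(k+1)/2 to an
    upper-triangular Cholesky factor U with a positive diagonal
    (hypothetical helper illustrating the log-diagonal transformation)."""
    U = np.zeros((k, k))
    U[np.triu_indices(k)] = v    # fill the upper triangle row by row
    d = np.diag_indices(k)
    U[d] = np.exp(U[d])          # exponentiate the log-diagonal entries
    return U

v = np.array([0.0, 0.5, -1.2, 0.3, 0.7, 0.1])  # any real values are valid
U = unconstrained_to_chol(v, 3)
Omega = U.T @ U  # implied precision matrix, positive definite by construction
```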
The Wishart prior lacks flexibility, having only one parameter, nu, to control the variability for all k(k + 1)/2 elements. Popular choices for the scale matrix S include an identity matrix or sample covariance matrix. When the model sample size is small, the specification of the scale matrix can be influential.
One of many alternatives is to use hierarchical priors, in which the main diagonal of the (identity) scale matrix and the degrees of freedom are treated as unknowns (Bouriga and Feron, 2011; Daniels and Kass, 1999). A hierarchical Wishart prior provides shrinkage toward diagonality. Another alternative is to abandon the Wishart distribution altogether for the more flexible method of Barnard et al. (2000) or the horseshoe distribution (dhs) for sparse covariance matrices.
dwishartc gives the density and rwishartc generates random deviates.
Barnard, J., McCulloch, R., and Meng, X. (2000). "Modeling Covariance Matrices in Terms of Standard Deviations and Correlations, with Application to Shrinkage". Statistica Sinica, 10, p. 1281–1311.
Bouriga, M. and Feron, O. (2011). "Estimation of Covariance Matrices Based on Hierarchical Inverse-Wishart Priors". URL: http://www.citebase.org/abstract?id=oai:arXiv.org:1106.3203.
Daniels, M., and Kass, R. (1999). "Nonconjugate Bayesian Estimation of Covariance Matrices and its use in Hierarchical Models". Journal of the American Statistical Association, 94(448), p. 1254–1263.
Wishart, J. (1928). "The Generalised Product Moment Distribution in Samples from a Normal Multivariate Population". Biometrika, 20A(1-2), p. 32–52.
chol, dchisq, dgamma, dhs, dinvwishart, dinvwishartc, dmvnp, dmvnpc, and Prec2Cov.
library(LaplacesDemon)
Omega <- matrix(c(2,-0.3,-0.3,4), 2, 2)
U <- chol(Omega)
x <- dwishartc(U, 3, matrix(c(1,0.1,0.1,1), 2, 2))
x <- rwishartc(3, matrix(c(1,0.1,0.1,1), 2, 2))