dist.Multivariate.Normal.Precision.Cholesky: Multivariate Normal Distribution: Precision-Cholesky Parameterization


Description

These functions provide the density and random number generation for the multivariate normal distribution, given the precision-Cholesky parameterization.

Usage

dmvnpc(x, mu, U, log=FALSE) 
rmvnpc(n=1, mu, U)

Arguments

x

This is data or parameters in the form of a vector of length k or a matrix with k columns.

n

This is the number of random draws.

mu

This is the mean vector mu of length k, or a matrix with k columns.

U

This is the k x k upper-triangular Cholesky factor U of the precision matrix Omega, as returned by the chol function.

log

Logical. If log=TRUE, then the logarithm of the density is returned.

Details

The multivariate normal distribution, or multivariate Gaussian distribution, is a multidimensional extension of the one-dimensional or univariate normal (or Gaussian) distribution. It is usually parameterized with mean and a covariance matrix, or in Bayesian inference, with mean and a precision matrix, where the precision matrix is the matrix inverse of the covariance matrix. These functions provide the precision-Cholesky parameterization for convenience and familiarity. It is easier to calculate a multivariate normal density with the precision parameterization, because a matrix inversion can be avoided. The precision matrix is replaced with an upper-triangular k x k matrix that is Cholesky factor U, as per the chol function for Cholesky decomposition.
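As a minimal illustration (base R only, not part of these functions) of the relationship just described: chol returns the upper-triangular factor U with t(U) %*% U equal to Omega, and the determinant term needed by the density comes directly from diag(U), with no matrix inversion:

Omega <- matrix(c(2, 0.5, 0.5, 1), 2, 2)  # an example precision matrix
U <- chol(Omega)                          # upper-triangular Cholesky factor
all.equal(crossprod(U), Omega)            # TRUE: t(U) %*% U recovers Omega
2 * sum(log(diag(U)))                     # log|Omega| from the factor alone
log(det(Omega))                           # agrees with the line above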

A random vector is considered to be multivariate normally distributed if every linear combination of its components has a univariate normal distribution. This distribution has a mean parameter vector mu of length k and a k x k precision matrix Omega, which must be positive-definite.
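For reference, the density in the precision parameterization (with the Cholesky factor supplying the determinant) is

f(x \mid \mu, \Omega) = (2\pi)^{-k/2}\,|\Omega|^{1/2}\exp\!\left(-\tfrac{1}{2}(x-\mu)^{\top}\Omega\,(x-\mu)\right), \qquad |\Omega|^{1/2} = \prod_{i=1}^{k} U_{ii}.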

In practice, U is fully unconstrained for proposals when its diagonal is log-transformed. The diagonal is exponentiated after a proposal and before other calculations. Overall, the Cholesky parameterization is faster than the traditional precision parameterization. Compared with dmvnp, dmvnpc must additionally matrix-multiply the Cholesky factor back to the precision matrix, but it does not have to check for or correct the precision matrix to positive-definiteness, and that check is the slower operation. Compared with rmvnp, rmvnpc is faster because the Cholesky decomposition has already been performed. A sketch of the log-diagonal transformation follows this paragraph.
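A minimal sketch of that log-diagonal transformation is given below; the helper names U.to.par and par.to.U are illustrative only and are not part of LaplacesDemon:

U.to.par <- function(U) {            # illustrative helper, not in the package
     diag(U) <- log(diag(U))         # log-transform the diagonal
     U[upper.tri(U, diag=TRUE)]      # unconstrained proposal vector
}
par.to.U <- function(par, k) {       # illustrative helper, not in the package
     U <- matrix(0, k, k)
     U[upper.tri(U, diag=TRUE)] <- par
     diag(U) <- exp(diag(U))         # exponentiate back to a positive diagonal
     U
}
par <- U.to.par(chol(diag(3)))       # propose freely on this vector
U <- par.to.U(par, 3)                # reconstruct U before dmvnpc or rmvnpc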

For models where the dependent variable, Y, is specified to be distributed multivariate normal given the model, the Mardia test (see plot.demonoid.ppc, plot.laplace.ppc, or plot.pmc.ppc) may be used to test the residuals.

Value

dmvnpc gives the density and rmvnpc generates random deviates.

Author(s)

Statisticat, LLC. software@bayesian-inference.com

See Also

chol, dmvn, dmvnc, dmvnp, dnorm, dnormp, dnormv, dwishartc, plot.demonoid.ppc, plot.laplace.ppc, and plot.pmc.ppc.

Examples

library(LaplacesDemon)
Omega <- diag(3)                             # example 3 x 3 precision matrix
U <- chol(Omega)                             # upper-triangular Cholesky factor of Omega
x <- dmvnpc(c(1,2,3), c(0,1,2), U)           # density at a single point
X <- rmvnpc(1000, c(0,1,2), U)               # 1000 random draws
joint.density.plot(X[,1], X[,2], color=TRUE) # joint density of the first two margins
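As an additional, hedged check (using dmvnp from the same package, which takes the precision matrix Omega directly), the two parameterizations should agree on the same inputs:

dmvnpc(c(1,2,3), c(0,1,2), U, log=TRUE)
dmvnp(c(1,2,3), c(0,1,2), Omega, log=TRUE)   # should match the line above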
