Description
The SIR function performs Sampling Importance Resampling, also called Sequential Importance Resampling, and uses a multivariate normal proposal density.
Usage

SIR(Model, Data, mu, Sigma, n=1000, CPUs=1, Type="PSOCK")
Arguments

Model
This is a model specification function. For more information, see LaplacesDemon.

Data
This is a list of data. For more information, see LaplacesDemon.

mu
This is a mean vector, mu, for a multivariate normal distribution, and is usually the vector of posterior means from an object returned by the IterativeQuadrature, LaplaceApproximation, or VariationalBayes function.

Sigma
This is a covariance matrix, Sigma, for a multivariate normal distribution, and is usually the covariance matrix from an object returned by the IterativeQuadrature, LaplaceApproximation, or VariationalBayes function.

n
This is the number of samples to be drawn from the posterior distribution.

CPUs
This argument accepts an integer that specifies the number of central processing units (CPUs) of the multicore computer or computer cluster. This argument defaults to CPUs=1, in which case parallel processing does not occur.

Type
This argument specifies the type of parallel processing to perform, accepting either Type="PSOCK" or Type="MPI".
Details

Sampling Importance Resampling (SIR) was introduced in Gordon et al. (1993), and is the original particle filtering algorithm (this family of algorithms is also known as Sequential Monte Carlo). A distribution is approximated with importance weights, which are approximations to the relative posterior densities of the particles, and the sum of the weights is one. In this terminology, each sample in the distribution is a "particle". SIR is a sequential or recursive form of importance sampling. As in importance sampling, the expectation of a function can be approximated as a weighted average. The optimal proposal distribution is the target distribution.
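To make this concrete, the following is a minimal, self-contained R sketch of the weight-and-resample step, using a univariate gamma target and a normal proposal for simplicity (a hypothetical illustration, not the internals of the SIR function):

set.seed(1)
n <- 1000
log.target <- function(x) dgamma(x, shape=2, rate=1, log=TRUE)  # target density
mu <- 2; sigma <- 2                                  # proposal parameters
theta <- rnorm(n, mu, sigma)                         # 1. draw particles from the proposal
lw <- log.target(theta) - dnorm(theta, mu, sigma, log=TRUE)  # 2. log importance weights
w <- exp(lw - max(lw)); w <- w / sum(w)              #    normalize so the weights sum to one
post <- sample(theta, size=n, replace=TRUE, prob=w)  # 3. resample particles by weight

The resampled vector post approximates draws from the gamma target, and a weighted average such as sum(w * theta) approximates the expectation of the target distribution.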
In the LaplacesDemon package, the main use of the SIR function is to produce posterior samples for iterative quadrature, Laplace Approximation, or Variational Bayes, and SIR is called behind-the-scenes by the IterativeQuadrature, LaplaceApproximation, or VariationalBayes function.
Iterative quadrature estimates the posterior mean and the associated covariance matrix. Assuming normality, this output characterizes the marginal posterior distributions. However, it is often useful to have posterior samples, in which case the SIR function is used to draw samples. The number of samples, n, should increase with the number and intercorrelations of the parameters. Otherwise, multimodal posterior distributions may occur.
Laplace Approximation estimates the posterior mode and the associated covariance matrix. Assuming normality, this output characterizes the marginal posterior distributions. However, it is often useful to have posterior samples, in which case the SIR function is used to draw samples. The number of samples, n, should increase with the number and intercorrelations of the parameters. Otherwise, multimodal posterior distributions may occur.
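For example, the following hedged sketch fits a simple linear regression with LaplaceApproximation and then calls SIR to draw posterior samples; the model specification follows the general form documented for LaplacesDemon, while the simulated data, the priors, and the use of the Summary1 and Covar components of the fit are illustrative assumptions:

library(LaplacesDemon)
set.seed(42)
N <- 50; J <- 2
X <- cbind(1, rnorm(N))                              # design matrix with intercept
y <- as.vector(X %*% c(1, 2) + rnorm(N))             # simulated responses
MyData <- list(J=J, X=X, y=y, mon.names="LP",
               parm.names=c("beta[1]", "beta[2]", "log.sigma"))
Model <- function(parm, Data) {
  beta <- parm[1:Data$J]
  sigma <- exp(parm[Data$J + 1])
  mu <- as.vector(Data$X %*% beta)
  LL <- sum(dnorm(Data$y, mu, sigma, log=TRUE))      # log-likelihood
  LP <- LL + sum(dnorm(beta, 0, 100, log=TRUE)) +    # vague priors on beta
    dnorm(parm[Data$J + 1], 0, 10, log=TRUE)         # prior on log(sigma)
  list(LP=LP, Dev=-2*LL, Monitor=LP,
       yhat=rnorm(length(mu), mu, sigma), parm=parm)
}
Fit <- LaplaceApproximation(Model, parm=rep(0, J + 1), Data=MyData)
post <- SIR(Model, Data=MyData, mu=Fit$Summary1[,1], # first column holds the modes
            Sigma=Fit$Covar, n=1000)

Each row of post is one posterior sample, and each column corresponds to a parameter in parm.names.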
Variational Bayes estimates both the posterior mean and variance. Assuming normality, this output characterizes the marginal posterior distributions. However, it is often useful to have posterior samples, in which case the SIR function is used to draw samples. The number of samples, n, should increase with the number and intercorrelations of the parameters. Otherwise, multimodal posterior distributions may occur.
SIR is also commonly used when considering a mild change in a prior distribution. For example, suppose a model was updated in LaplacesDemon, and it had a least-informative prior distribution, but the statistician would like to estimate the impact of changing to a weakly-informative prior distribution. The change is made in the model specification function, and the posterior means and covariance are supplied to the SIR function. The returned samples are estimates of the posterior, given the different prior distribution. This is akin to sensitivity analysis (see the SensitivityAnalysis function).
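Continuing the hypothetical regression sketch above, only the prior in the model specification function changes (here, a tighter normal prior on beta stands in for a weakly-informative choice), and the mu and Sigma from the original fit are reused:

Model2 <- function(parm, Data) {
  beta <- parm[1:Data$J]
  sigma <- exp(parm[Data$J + 1])
  mu <- as.vector(Data$X %*% beta)
  LL <- sum(dnorm(Data$y, mu, sigma, log=TRUE))
  LP <- LL + sum(dnorm(beta, 0, 5, log=TRUE)) +      # weakly-informative prior on beta
    dnorm(parm[Data$J + 1], 0, 10, log=TRUE)
  list(LP=LP, Dev=-2*LL, Monitor=LP,
       yhat=rnorm(length(mu), mu, sigma), parm=parm)
}
post2 <- SIR(Model2, Data=MyData, mu=Fit$Summary1[,1], Sigma=Fit$Covar, n=1000)

Comparing post2 with post indicates how sensitive the posterior is to the change of prior.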
In other contexts (for which this function is not designed), SIR is used with dynamic linear models (DLMs) and state-space models (SSMs) for state filtering.
Parallel processing may be performed when the user specifies CPUs to be greater than one, implying that the specified number of CPUs exists and is available. Parallelization may be performed on a multicore computer or a computer cluster. Either a Simple Network of Workstations (SNOW) or Message Passing Interface (MPI) is used. With small data sets and few samples, parallel processing may be slower, due to computer network communication. With larger data sets and more samples, the user should experience a faster run-time.
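For example, the earlier hypothetical call could be parallelized as follows (this assumes four CPUs are available; Type="MPI" would additionally require a working MPI installation):

post <- SIR(Model, Data=MyData, mu=Fit$Summary1[,1], Sigma=Fit$Covar,
            n=10000, CPUs=4, Type="PSOCK")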
This function was adapted from the sir function in the LearnBayes package.
Value

The SIR function returns a matrix of samples drawn from the posterior distribution.
Author(s)

Statisticat, LLC. software@bayesian-inference.com
References

Gordon, N.J., Salmond, D.J., and Smith, A.F.M. (1993). "Novel Approach to Nonlinear/Non-Gaussian Bayesian State Estimation". IEE Proceedings F on Radar and Signal Processing, 140(2), pp. 107–113.
See Also

dmvn, IterativeQuadrature, LaplaceApproximation, LaplacesDemon, SensitivityAnalysis, and VariationalBayes.