
View source: R/post_equi_mcmc.R

Given the output of `equi_mcmc`, this function will calculate the Bayes rule under multiway Stein's loss.

```
get_equi_bayes(psi_inv, sigma, burnin = NULL)
```

`psi_inv`: A list of arrays containing the posterior draws of the component precision matrices, as returned by `equi_mcmc`.

`sigma`: A vector of posterior draws of the total variation parameter, as returned by `equi_mcmc`.

`burnin`: A numeric between 0 and 1, the proportion of the posterior samples to discard as burn-in. The default is 0.25.

Multiway Stein's loss is a generalization of Stein's loss to more than two dimensions. The Bayes rule under this loss is simply represented in terms of the posterior moments of the component precision matrices. These moments can be approximated by using the output of `equi_mcmc`. When using the invariant prior that is used in `equi_mcmc`, the resulting Bayes rule is the uniformly minimum risk equivariant estimator.

More details on multiway Stein's loss and the Bayes rules under it can be found in Gerard and Hoff (2015).
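As a reference point for the multiway generalization discussed above, the ordinary (two-matrix) Stein's loss can be written down directly. The sketch below implements the standard formula in base R; it is an illustration of the loss, not code from the tensr package.

```r
## Stein's loss for a covariance estimate Sig_hat against the truth Sig:
## L(Sig_hat, Sig) = tr(Sig_hat %*% Sig^{-1}) - log det(Sig_hat %*% Sig^{-1}) - p
stein_loss <- function(Sig_hat, Sig) {
  M <- Sig_hat %*% solve(Sig)
  log_det <- as.numeric(determinant(M, logarithm = TRUE)$modulus)
  sum(diag(M)) - log_det - nrow(Sig)
}

stein_loss(diag(2), diag(2))      # zero when the estimate equals the truth
stein_loss(2 * diag(2), diag(2))  # positive for any other estimate
```

The loss is zero exactly when `Sig_hat` equals `Sig`, and multiway Stein's loss sums terms of this form over the component covariance matrices.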

`Sig_hat`: A list of the Bayes rules of the component covariance matrices under multiway Stein's loss.

`B`: A list of the lower-triangular Cholesky square roots of the Bayes rules of the component covariance matrices under multiway Stein's loss. We have that `Sig_hat[[i]]` is equal to `B[[i]] %*% t(B[[i]])`.

`b`: A numeric. This is the Bayes rule of the total variation parameter. This is the 'standard deviation' version; that is, `b^2` would be used to calculate the overall covariance matrix.
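The relationships among these return values can be illustrated with toy matrices in base R (these are stand-ins, not tensr output; the Kronecker ordering for the overall covariance is an assumption):

```r
## Toy component covariance matrices standing in for Sig_hat.
Sig_hat <- list(
  matrix(c(2, 0.5, 0.5, 1), 2, 2),
  diag(3)
)

## chol() returns the upper-triangular factor, so transpose to get the
## lower-triangular square root B[[i]] with Sig_hat[[i]] = B[[i]] %*% t(B[[i]]).
B <- lapply(Sig_hat, function(S) t(chol(S)))
all.equal(Sig_hat[[1]], B[[1]] %*% t(B[[1]]))  # TRUE

## b^2 scales the Kronecker product of the component covariances
## to form the overall covariance matrix.
b <- 1.3
Sig_total <- b^2 * kronecker(Sig_hat[[2]], Sig_hat[[1]])
```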

David Gerard.

Gerard, D., & Hoff, P. (2015). Equivariant minimax
dominators of the MLE in the array normal model.
*Journal of Multivariate Analysis*, 137, 32-49.
https://doi.org/10.1016/j.jmva.2015.01.020
http://arxiv.org/pdf/1408.0424.pdf

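The original examples did not survive extraction. A minimal stand-in sketch of the documented call follows, using simulated draws in place of `equi_mcmc` output; the array layout of `psi_inv` (dimension, dimension, iteration) is an assumption, and the `tensr` call is guarded so the snippet runs without the package installed.

```r
set.seed(1)
n_iter <- 50
p <- c(2, 3)  # dimensions of the two hypothetical component matrices

## Simulated positive-definite "posterior draws" of the component
## precision matrices, one p[k] x p[k] x n_iter array per component.
psi_inv <- lapply(p, function(pk) {
  draws <- array(0, dim = c(pk, pk, n_iter))
  for (j in seq_len(n_iter)) {
    A <- matrix(rnorm(pk * pk), pk, pk)
    draws[, , j] <- crossprod(A) + diag(pk)
  }
  draws
})

## Simulated draws of the total variation parameter.
sigma <- abs(rnorm(n_iter)) + 1

## The documented call, discarding the first 25% of draws as burn-in.
if (requireNamespace("tensr", quietly = TRUE)) {
  out <- tensr::get_equi_bayes(psi_inv = psi_inv, sigma = sigma, burnin = 0.25)
  str(out$Sig_hat)
}
```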

dcgerard/tensr documentation built on Aug. 16, 2018, 9:56 a.m.
