Description
Compute the posterior variance for the horseshoe prior in the normal means problem (i.e. linear regression with the design matrix equal to the identity matrix), for a fixed value of tau, without using MCMC. Details of the computation are given in Carvalho et al. (2010) and van der Pas et al. (2014).

Usage

HS.post.var(y, tau, Sigma2)

Arguments

y       The data. An n*1 vector.

tau     Value for tau. Tau should be greater than 1/450.

Sigma2  The variance of the data.

Details

The normal means model is

    y_i = β_i + ε_i,    ε_i ~ N(0, σ^2),

and the horseshoe prior is

    β_i ~ N(0, σ^2 λ_i^2 τ^2),
    λ_i ~ Half-Cauchy(0, 1).

If τ and σ^2 are known, the posterior variance can be computed without using MCMC.
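
The sketch below shows one way such a no-MCMC evaluation could be carried out; it is not the package's internal code. It uses the conditional-normal representation of the horseshoe posterior given the local scale (Carvalho et al., 2010) together with the law of total variance, evaluating the remaining one-dimensional integrals over λ numerically. The function name hs.post.var.sketch is hypothetical.

#Minimal sketch, not the package implementation:
#beta_i | y_i, lambda_i ~ N((1 - kappa_i) y_i, σ^2 (1 - kappa_i)),
#with kappa_i = 1 / (1 + lambda_i^2 τ^2)
hs.post.var.sketch <- function(y, tau, Sigma2) {
  sapply(y, function(yi) {
    #unnormalized posterior density of lambda given y_i:
    #N(y_i; 0, Sigma2 * (1 + lambda^2 tau^2)) likelihood times
    #the half-Cauchy(0, 1) prior density 2 / (pi * (1 + lambda^2))
    w <- function(lambda) {
      dnorm(yi, mean = 0, sd = sqrt(Sigma2 * (1 + lambda^2 * tau^2))) *
        2 / (pi * (1 + lambda^2))
    }
    #shrinkage weight 1 - kappa = lambda^2 tau^2 / (1 + lambda^2 tau^2)
    s <- function(lambda) lambda^2 * tau^2 / (1 + lambda^2 * tau^2)
    Z  <- integrate(w, 0, Inf)$value
    m1 <- integrate(function(l) s(l) * w(l), 0, Inf)$value / Z
    m2 <- integrate(function(l) s(l)^2 * w(l), 0, Inf)$value / Z
    #law of total variance: E[Var(beta | y, lambda)] + Var(E[beta | y, lambda])
    Sigma2 * m1 + yi^2 * (m2 - m1^2)
  })
}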

Value

The posterior variance for each of the data points.

References

Carvalho, C. M., Polson, N. G., and Scott, J. G. (2010), The horseshoe estimator for sparse signals. Biometrika 97(2), 465–480.

van der Pas, S. L., Kleijn, B. J. K., and van der Vaart, A. W. (2014), The horseshoe estimator: Posterior concentration around nearly black vectors. Electronic Journal of Statistics 8(2), 2585–2618.

See Also

HS.post.mean to compute the posterior mean. See HS.normal.means for an implementation that does use MCMC and returns credible intervals as well as the posterior mean (and other quantities). See horseshoe for linear regression.
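
As a hedged illustration of combining these analytic quantities (assuming HS.post.mean takes the same arguments in the same order as HS.post.var), the posterior mean and variance can be turned into a rough plus-or-minus two posterior-standard-deviation band without running MCMC:

library(horseshoe)
y.example <- c(rnorm(80), rnorm(20, mean = 8))  #illustrative data: 80 noise means, 20 signals
pm <- HS.post.mean(y.example, tau = 0.05, Sigma2 = 1)
pv <- HS.post.var(y.example, tau = 0.05, Sigma2 = 1)
band <- cbind(lower = pm - 2 * sqrt(pv), upper = pm + 2 * sqrt(pv))
head(band)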

Examples

#Plot the posterior variance for a range of deterministic values
y <- seq(-8, 8, 0.05)
plot(y, HS.post.var(y, tau = 0.05, Sigma2 = 1))
#Example with 20 signals, rest is noise
#Posterior variance for the signals is plotted in blue
#Posterior variance for the noise is plotted in black
truth <- c(rep(0, 80), rep(8, 20))
data <- truth + rnorm(100)
tau.example <- HS.MMLE(data, 1)
plot(data, HS.post.var(data, tau.example, 1),
     col = c(rep("black", 80), rep("blue", 20)))