This package carries out Empirical Bayes thresholding using the methods developed by I. M. Johnstone and B. W. Silverman. The basic problem is to estimate a mean vector given a vector of observations of the mean vector plus white noise, taking advantage of possible sparsity in the mean vector. Within a Bayesian formulation, the elements of the mean vector are modelled as having, independently, a distribution that is a mixture of an atom of probability at zero and a suitable heavy-tailed distribution. The mixing parameter can be estimated by a marginal maximum likelihood approach. This leads to an adaptive thresholding approach on the original data. Extensions of the basic method, in particular to wavelet thresholding, are also implemented within the package.
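To make the estimation step concrete, here is a minimal Python sketch (not the package's R code) of the marginal maximum likelihood idea for the quasi-Cauchy prior. The function names `beta_cauchy`, `weight_from_x`, and `prob_nonzero` are illustrative; the package's `ebayesthresh` additionally converts the estimated weight into a posterior median threshold, whereas this sketch stops at a simpler posterior-probability rule.

```python
import math

def beta_cauchy(x):
    # beta(x) = g(x)/phi(x) - 1, where g is the marginal density of an
    # observation under the quasi-Cauchy prior and phi is the standard
    # normal density; this simplifies to (exp(x^2/2) - 1)/x^2 - 1,
    # with limiting value -1/2 at x = 0.
    x2 = x * x
    if x2 == 0.0:
        return -0.5
    if x2 > 700.0:  # exp(x2/2) would overflow; beta is effectively infinite
        return math.inf
    return (math.exp(x2 / 2.0) - 1.0) / x2 - 1.0

def weight_from_x(xs, iters=60):
    # Marginal maximum likelihood for the mixing weight w: the score
    # S(w) = sum_i beta(x_i) / (1 + w * beta(x_i)) is decreasing in w,
    # so the root of S(w) = 0 on (0, 1) can be found by bisection.
    betas = [beta_cauchy(x) for x in xs]

    def score(w):
        return sum((1.0 / w) if math.isinf(b) else b / (1.0 + w * b)
                   for b in betas)

    lo, hi = 1e-8, 1.0 - 1e-8
    if score(hi) >= 0.0:    # nothing looks like pure noise
        return hi
    if score(lo) <= 0.0:    # data look like pure noise
        return lo
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if score(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def prob_nonzero(x, w):
    # Posterior probability that the corresponding mean is nonzero.
    # (The package instead computes the posterior median, which yields
    # a genuine thresholding rule.)
    b = beta_cauchy(x)
    if math.isinf(b):
        return 1.0
    return w * (1.0 + b) / (1.0 + w * b)
```

On a mostly-small sample with a few large observations, the estimated weight lands strictly inside (0, 1), large observations get posterior probability near 1 of being signal, and small ones near their prior weight.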
| Field | Value |
| --- | --- |
| Author | Bernard W. Silverman |
| Date of publication | 2012-10-29 08:57:00 |
| Maintainer | Ludger Evers <email@example.com> |
| License | GPL (>= 2) |
- `beta.cauchy`: Function beta for the quasi-Cauchy prior
- `beta.laplace`: Function beta for the Laplace prior
- `ebayesthresh`: Empirical Bayes thresholding on a sequence
- `ebayesthresh.wavelet`: Empirical Bayes thresholding on the levels of a wavelet...
- `isotone`: Weighted least squares monotone regression
- `postmean`: Posterior mean estimator
- `postmed`: Posterior median estimator
- `tfromw`: Find threshold from mixing weight
- `tfromx`: Find threshold from data
- `threshld`: Threshold data with hard or soft thresholding
- `vecbinsolv`: Solve systems of nonlinear equations based on a monotonic...
- `wandafromx`: Find weight and scale factor from data if Laplace prior is...
- `wfromt`: Mixing weight from posterior median threshold
- `wfromx`: Find Empirical Bayes weight from data
- `wmonfromx`: Find monotone Empirical Bayes weights from data
- `zetafromx`: Estimation of a parameter in the prior weight sequence in the...
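The `threshld` entry is the simplest piece of the machinery: once a threshold t has been chosen, it is applied deterministically. As a Python illustration (the hypothetical name `threshold` is not the package's API), hard and soft thresholding differ only in whether surviving values are shrunk toward zero:

```python
def threshold(x, t, hard=True):
    # Values with |x| <= t are set to zero.  Hard thresholding keeps the
    # survivors unchanged; soft thresholding shrinks them toward zero by t.
    if abs(x) <= t:
        return 0.0
    if hard:
        return x
    return x - t if x > 0.0 else x + t
```

For example, with t = 1, the value 3 is kept as 3 under hard thresholding but shrunk to 2 under soft thresholding, while 0.5 is zeroed either way.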