Description

Simulate the increasing random walk associated with a real-valued continuous random variable.
Usage

IRW(dimension, lsf, N, q, Nevent, X, y, K, burnin, sigma, last.return,
    use.potential, plot, plot.lsf, print_plot, output_dir, plot.lab)
Arguments

dimension: dimension of the input space.
lsf: limit state function.
N: number of particles.
q: level up to which the random walk is to be generated.
Nevent: the number of desired events.
X: to start with some given particles.
y: the values of the lsf on X.
K: kernel transition for conditional generations.
burnin: burn-in parameter.
sigma: radius parameter for K.
last.return: if the last event should be returned.
use.potential: to use a 'potential' matrix to select a starting point not directly related to the sample to be moved with the MH algorithm.
plot: if TRUE, produce a plot of the generated samples.
plot.lsf: a boolean indicating if the lsf should be added to the plot.
print_plot: if TRUE, print the updated plot after each iteration. This might be slow; use with a small N. Otherwise only the final plot is printed.
output_dir: if plots are to be saved in pdf in a given directory. This name will be pasted with '_IRW.pdf'. Together with print_plot = TRUE, this produces a pdf with the plot at each iteration.
plot.lab: the x and y labels for the plot.
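As a rough sketch of how these arguments fit together, a minimal call could look as follows; the limit state function and all numerical values below are invented for illustration, and the remaining arguments are left at their defaults:

# Hypothetical 2-dimensional limit state function: X is a matrix whose columns
# are the points to evaluate, so one value is returned per column
lsf_toy <- function(X) sqrt(colSums(as.matrix(X)^2))

# Increasing random walk with N = 100 particles up to the level q = 4
res <- IRW(dimension = 2, lsf = lsf_toy, N = 100, q = 4)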
Details

This function generates the increasing random walk associated with a continuous real-valued random variable of the form Y = lsf(X), where X is a vector-valued random variable. This random walk can be associated with a Poisson process with parameter N, and hence the number of iterations before a given threshold q is directly related to P[ lsf(X) > q ]. It is the core tool of algorithms such as nested sampling, the Last Particle Algorithm or the Tootsie Pop Algorithm.

Basically, for N = 1 it generates a sample Y = lsf(X) and iteratively regenerates it strictly greater than the current value: Y_{n+1} ~ μ^Y( · | Y > Y_n ). This regeneration step is done with a Metropolis-Hastings algorithm, which is why it is useful to generate several chains together (N > 1).

The algorithm stops when it has simulated the required number of events Nevent or when it has reached the sought threshold q.
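To make the regeneration mechanism concrete, here is a small standalone sketch, not the package's implementation (which relies on a Metropolis-Hastings kernel): for a one-dimensional standard Gaussian Y the conditional law Y | Y > Y_n can be sampled exactly with the inverse cdf, and the number of events generated before crossing q is then a Poisson variable with mean -log P[ Y > q ]:

# Toy increasing random walk for N = 1 with Y ~ N(0,1); exact conditional
# sampling stands in for the Metropolis-Hastings step used by IRW
set.seed(1)
q <- 3
y <- rnorm(1)                                    # initial sample Y ~ N(0,1)
events <- numeric(0)
while (y < q) {
  events <- c(events, y)                         # record the current event
  y <- qnorm(runif(1, min = pnorm(y), max = 1))  # draw from Y | Y > y
}
length(events)   # ~ Poisson with mean -log(pnorm(-q)), about 6.6 for q = 3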
Value

An object of class list containing the following data:

L: the events of the random walk.
M: the total number of iterations.
Ncall: the total number of calls to the lsf.
X: a matrix containing the final particles.
y: the values of the lsf on the final particles.
q: the threshold considered when generating the random walk.
Nevent: the target number of events when generating the random walk.
Nwmoves: the number of rejected transitions, i.e. when the proposed point was not strictly greater/lower than the current state.
acceptance: a vector containing the acceptance rate for each use of the MH algorithm.
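As an illustration of how these outputs are typically used (this post-processing is not done by IRW itself), the Moving Particles / Last Particle estimator discussed in the references turns the number of events M into an estimate of P[ lsf(X) > q ]; a sketch, assuming res comes from a run with N particles and a finite threshold q:

N <- 100                         # number of particles used in the IRW call
p_hat <- (1 - 1/N)^res$M         # estimate of P[ lsf(X) > q ]
cv_hat <- sqrt(-log(p_hat)/N)    # approximate coefficient of variation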
Note

The problem is supposed to be defined in the standard space. If not, use UtoX to do so. Furthermore, each time a set of vectors is defined as a matrix, 'nrow' = dimension and 'ncol' = number of vectors, to be consistent with the as.matrix transformation of a vector.

The algorithm calls lsf(X) (where X is a matrix as defined previously) and expects a vector in return. This allows the user to optimise the computation of a batch of points, either by vectorial computation, or by the use of external codes (optimised C or C++ codes for example) and/or parallel computation; see examples in MonteCarlo.
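For instance, a batch-ready limit state function consistent with this convention could look like the following sketch (the criterion itself is made up):

# X: matrix with nrow = dimension and ncol = number of points to evaluate;
# the function returns one value per column
lsf_batch <- function(X) {
  X <- as.matrix(X)                        # a single vector becomes one column
  apply(X, 2, function(x) sum(x^2) - 4)    # arbitrary toy criterion
}

lsf_batch(matrix(rnorm(2 * 5), nrow = 2))  # returns a vector of length 5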
Author(s)

Clement WALTER clementwalter@icloud.com
References

C. Walter: Moving Particles: a parallel optimal Multilevel Splitting method with application in quantiles estimation and meta-model based algorithms. Structural Safety, 55, 10-25.

C. Walter: Point Process-based Monte Carlo estimation. Statistics and Computing, in press, 1-18. arXiv preprint arXiv:1412.6368.

J. Skilling: Nested sampling for general Bayesian computation. Bayesian Analysis, 1(4), 833-859.

M. Huber & S. Schott: Using TPA for Bayesian inference. Bayesian Statistics 9, 9, 257.

A. Guyader, N. Hengartner and E. Matzner-Lober: Simulation and estimation of extreme quantiles and extreme probabilities. Applied Mathematics & Optimization, 64(2), 171-196.
Examples
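An illustrative sketch only; the limit state function and all settings below are invented:

library(mistral)
set.seed(123)

# Invented 2-dimensional limit state function; one value per column of X
lsf_ex <- function(X) apply(as.matrix(X), 2, function(x) sum(x^2))

# Increasing random walk up to the level q = 9 with N = 50 particles,
# plotting the run (plot.lab sets the axis labels)
res <- IRW(dimension = 2, lsf = lsf_ex, N = 50, q = 9,
           plot = TRUE, plot.lab = c("x1", "x2"))

res$M            # total number of iterations
res$Ncall        # total number of calls to lsf_ex
res$acceptance   # acceptance rates of the Metropolis-Hastings moves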