MetaIS: Metamodel based Importance Sampling

View source: R/MetaIS.R

Description

Estimate a failure probability with the MetaIS (metamodel-based Importance Sampling) method.

Usage

MetaIS(
  dimension,
  lsf,
  N = 5e+05,
  N_alpha = 100,
  N_DOE = 10 * dimension,
  N1 = N_DOE * 30,
  Ru = 8,
  Nmin = 30,
  Nmax = 200,
  Ncall_max = 1000,
  precision = 0.05,
  N_seeds = 2 * dimension,
  Niter_seed = Inf,
  N_alphaLOO = 5000,
  K_alphaLOO = 1,
  alpha_int = c(0.1, 10),
  k_margin = 1.96,
  lower.tail = TRUE,
  X = NULL,
  y = NULL,
  failure = 0,
  meta_model = NULL,
  kernel = "matern5_2",
  learn_each_train = TRUE,
  limit_fun_MH = NULL,
  failure_MH = 0,
  sampling_strategy = "MH",
  seeds = NULL,
  seeds_eval = limit_fun_MH(seeds),
  burnin = 20,
  compute.PPP = FALSE,
  plot = FALSE,
  limited_plot = FALSE,
  add = FALSE,
  output_dir = NULL,
  verbose = 0
)

Arguments

dimension

dimension of the input space

lsf

the function defining the failure/safety domain

N

size of the Monte-Carlo population for the P_epsilon estimate

N_alpha

initial size of the Monte-Carlo population for the alpha estimate

N_DOE

size of the initial DOE obtained by clustering of the N1 samples

N1

size of the initial uniform population sampled in a hypersphere of radius Ru

Ru

radius of the hypersphere for the initial sampling

Nmin

minimum number of calls for the construction step

Nmax

maximum number of calls for the construction step

Ncall_max

maximum number of calls for the whole algorithm

precision

desired maximal value of the coefficient of variation (cov)

N_seeds

number of seeds for the MH algorithm used to generate points in the margin (according to MP*gauss)

Niter_seed

maximum number of iterations for the search of a seed for the alphaLOO refinement sampling

N_alphaLOO

number of points to sample at each refinement step

K_alphaLOO

number of clusters at each refinement step

alpha_int

range for alpha used to stop the construction step

k_margin

margin width; the default value means that points are classified with more than 97.5% confidence

lower.tail

specify if one wants to estimate P[lsf(X)<failure] or P[lsf(X)>failure].

X

Coordinates of already known points

y

Value of the LSF on these points

failure

Failure threshold

meta_model

Provide here a kriging metamodel from km, if desired

kernel

Specify the kernel to use for km

learn_each_train

Specify whether kernel parameters are re-estimated each time the model is retrained

limit_fun_MH

Define an area of exclusion with a limit function

failure_MH

Threshold for the limit_fun_MH function

sampling_strategy

Either MH for Metropolis-Hastings or AR for accept-reject

seeds

If some points are already known to be in the appropriate subdomain

seeds_eval

Value of the metamodel on these points

burnin

Burnin parameter for MH

compute.PPP

whether to simulate a Poisson process at each iteration, in order to estimate the conditional expectation and the SUR criteria based on the conditional variance: h (the average probability of misclassification at level failure) and I (the integral of h over the whole interval [failure, infty))

plot

Set to TRUE for a full plot, i.e. one refreshed at each iteration

limited_plot

Set to TRUE for a final plot with final DOE, metamodel and LSF

add

If plots are to be added to a current device

output_dir

If plots are to be saved as jpeg files in a given directory

verbose

Either 0 for almost no output, 1 for a medium amount of output, or 2 for all outputs

Details

MetaIS is an Importance Sampling based probability estimator. It makes use of a kriging surrogate to approximate the optimal density function, replacing the indicator function by its kriging counterpart: the probability of being in the failure domain. In this context, the normalizing constant of this quasi-optimal PDF is called the ‘augmented failure probability’ and the modified probability ‘alpha’.
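
For intuition, the kriging counterpart of the indicator function can be sketched as follows. This is a minimal sketch assuming a fitted DiceKriging model named model and points stored column-wise as everywhere on this page; the helper name pi_failure is made up for illustration:

library(DiceKriging)
pi_failure = function(X, model, failure = 0) {
  pred = predict(model, newdata = t(as.matrix(X)), type = "UK", checkNames = FALSE)
  # probability that the true lsf value falls below the failure threshold
  pnorm((failure - pred$mean) / pred$sd)
}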

After a first uniform Design of Experiments, MetaIS uses an alpha Leave-One-Out criterion combined with a margin sampling strategy to refine the kriging-based metamodel. Samples are generated according to the weighted margin probability with a Metropolis-Hastings algorithm and some are selected by clustering; the N_seeds seeds are obtained from an accept-reject strategy on a standard population.
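
As a rough illustration of the accept-reject step, one plausible reading of the margin (given the k_margin argument) is the set of points whose kriging prediction interval contains the threshold. Everything below except k_margin = 1.96 and failure = 0 is made up for the sketch:

candidates = matrix(rnorm(2*1000), nrow = 2)     # standard population, dimension = 2
pred = predict(model, newdata = t(candidates), type = "UK", checkNames = FALSE)
in_margin = abs(pred$mean - 0) < 1.96 * pred$sd  # |mean - failure| < k_margin * sd
seeds = candidates[, in_margin, drop = FALSE]    # kept as seeds for the MH chains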

Once the criterion is met or the maximum number of calls has been reached, the augmented failure probability is estimated with a crude Monte-Carlo. Then, a new population is generated according to the quasi-optimal instrumental PDF; burnin and thinning are used here and alpha is evaluated. While the coefficient of variation of the alpha estimate is greater than a given threshold and some calls remain available (as defined by Ncall_max), the estimate is refined with additional computations.
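
The crude Monte-Carlo step for the augmented failure probability can then be sketched with the hypothetical pi_failure helper above (N matches the default of the N argument):

N = 5e5; dimension = 2
U = matrix(rnorm(dimension*N), dimension, N)  # standard Gaussian population
pi_U = pi_failure(U, model)                   # kriging probability of failure at each point
P_epsilon = mean(pi_U)                        # augmented failure probability
cov_epsilon = sd(pi_U)/(sqrt(N)*P_epsilon)    # coefficient of variation of this estimate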

The final probability is the product of p_epsilon and alpha, and the final squared coefficient of variation is the sum of the squared coefficients of variation of p_epsilon and alpha.
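
In code, with alpha and cov_alpha standing for the second-stage estimate and its coefficient of variation (a sketch, not actual internals):

p = P_epsilon * alpha                    # final failure probability
cov = sqrt(cov_epsilon^2 + cov_alpha^2)  # final coefficient of variation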

Value

An object of class list containing the failure probability and some more outputs as described below:

p

The estimated failure probability.

cov

The coefficient of variation of the Monte-Carlo probability estimate.

Ncall

The total number of calls to the lsf.

X

The final learning database, ie. all points where lsf has been calculated.

y

The value of the lsf on the learning database.

meta_fun

The metamodel approximation of the lsf. Calling it returns a list containing the value and the standard deviation; see the usage sketch after this list.

meta_model

The final metamodel. An S4 object from DiceKriging. Note that the algorithm recasts the problem as the estimation of P[lsf(X)<failure], so using ‘predict’ with this object will return inverse values if lower.tail==FALSE; in this case, prefer using meta_fun directly, which handles this issue.

points

Points in the failure domain according to the metamodel.

h

The sequence of the estimated relative SUR criteria.

I

The sequence of the estimated integrated SUR criteria.
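
A hedged usage sketch of the returned object (it reuses the kiureghian function defined in the Examples below; the exact element names of the meta_fun output are not guaranteed by this page, hence str()):

res = MetaIS(dimension = 2, lsf = kiureghian)       # runs the full algorithm
pred = res$meta_fun(matrix(rnorm(2*10), nrow = 2))  # points stored column-wise
str(pred)     # a list with the kriging value and its standard deviation
res$p         # estimated failure probability
res$cov       # coefficient of variation of the estimate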

Note

The problem is supposed to be defined in the standard space; if not, use UtoX to do so. Furthermore, each time a set of vectors is defined as a matrix, ‘nrow’ = dimension and ‘ncol’ = number of vectors, to be consistent with the as.matrix transformation of a vector.

The algorithm calls lsf(X) (where X is a matrix as defined previously) and expects a vector in return. This allows the user to optimise the computation of a batch of points, either by vectorised computation or by the use of external codes (optimised C or C++ codes, for example) and/or parallel computation; see the examples in MonteCarlo.
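
For instance, a batch-ready lsf respecting this convention could look like the toy sketch below (the function and its threshold are made up for illustration):

lsf_batch = function(X) {
  X = as.matrix(X)                        # nrow = dimension, ncol = number of points
  apply(X, 2, function(x) sum(x^2) - 4)   # one value per point, returned as a vector
}
lsf_batch(matrix(rnorm(6), nrow = 2))     # returns a vector of length 3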

Author(s)

Clement WALTER clementwalter@icloud.com

See Also

SubsetSimulation MonteCarlo km (in package DiceKriging)

Examples

# A parabolic limit state function; points are stored column-wise
kiureghian = function(x, b=5, kappa=0.5, e=0.1) {
  x = as.matrix(x)
  b - x[2,] - kappa*(x[1,]-e)^2
}

## Not run: 
res = MetaIS(dimension=2,lsf=kiureghian,plot=TRUE)

# Compare with crude Monte-Carlo reference value
N = 500000
dimension = 2
U = matrix(rnorm(dimension*N),dimension,N)
G = kiureghian(U)
P = mean(G<0)
cov = sqrt((1-P)/(N*P))

## End(Not run)

# See the impact of kernel choice with the Waarts function:
waarts = function(u) {
  u = as.matrix(u)
  b1 = 3 + (u[1,]-u[2,])^2/10 - sign(u[1,]+u[2,])*(u[1,]+u[2,])/sqrt(2)
  b2 = sign(u[2,]-u[1,])*(u[1,]-u[2,]) + 7/sqrt(2)
  # return the minimum of the two branches for each point
  pmin(b1, b2)
}

## Not run: 
res = list()
res$matern5_2 = MetaIS(2,waarts,plot=TRUE)
res$matern3_2 = MetaIS(2,waarts,kernel="matern3_2",plot=TRUE)
res$gaussian = MetaIS(2,waarts,kernel="gauss",plot=TRUE)
res$exp = MetaIS(2,waarts,kernel="exp",plot=TRUE)

# Compare with crude Monte-Carlo reference value
N = 500000
dimension = 2
U = matrix(rnorm(dimension*N),dimension,N)
G = waarts(U)
P = mean(G<0)
cov = sqrt((1-P)/(N*P))

## End(Not run)
