AKMCS: Active learning reliability method combining Kriging and Monte Carlo Simulation

View source: R/AKMCS.R

Description

Estimate a failure probability with the AKMCS method.

Usage

AKMCS(
  dimension,
  lsf,
  N = 5e+05,
  N1 = 10 * dimension,
  Nmax = 200,
  Nmin = 2,
  X = NULL,
  y = NULL,
  failure = 0,
  precision = 0.05,
  bayesian = TRUE,
  compute.PPP = FALSE,
  meta_model = NULL,
  kernel = "matern5_2",
  learn_each_train = TRUE,
  crit_min = 2,
  lower.tail = TRUE,
  limit_fun_MH = NULL,
  failure_MH = 0,
  sampling_strategy = "MH",
  first_DOE = "Gaussian",
  seeds = NULL,
  seeds_eval = limit_fun_MH(seeds),
  burnin = 30,
  plot = FALSE,
  limited_plot = FALSE,
  add = FALSE,
  output_dir = NULL,
  verbose = 0
)

Arguments

dimension

dimension of the input space.

lsf

the function defining the failure/safety domain.

N

Monte-Carlo population size.

N1

size of the first DOE.

Nmax

maximum number of calls to the LSF.

Nmin

minimum number of calls during enrichment step.

X

coordinates of already known points.

y

value of the LSF on these points.

failure

failure threshold.

precision

maximum desired coefficient of variation (cov) of the Monte-Carlo estimate; see the sketch after this argument list.

bayesian

if TRUE, estimate the conditional expectation E_X[ P(meta(X) < failure) ].

compute.PPP

if TRUE, simulate a Poisson process at each iteration to estimate the conditional expectation and the SUR criteria based on the conditional variance: h (the average probability of misclassification at the failure threshold) and I (the integral of h over the interval [failure, infinity)).

meta_model

provide here a kriging metamodel from km if wanted.

kernel

specify the kernel to use for km.

learn_each_train

specify if kernel parameters are re-estimated at each train.

crit_min

minimum value of the criteria to be used for refinement.

lower.tail

as for the pxxxx distribution functions: TRUE to estimate P(lsf(X) < failure), FALSE to estimate P(lsf(X) > failure).

limit_fun_MH

define an area of exclusion with a limit function.

failure_MH

the threshold for the limit_fun_MH function.

sampling_strategy

either MH for Metropolis-Hastings or AR for accept-reject.

first_DOE

either Gaussian or Uniform, specifying the population on which clustering is done. Set to "No" to skip the initial DoE (for use together with a first DoE given in X, for instance).

seeds

if some points are already known to be in the appropriate subdomain.

seeds_eval

value of the metamodel on these points.

burnin

burnin parameter for MH.

plot

set to TRUE for a full plot, i.e. refreshed at each iteration.

limited_plot

set to TRUE for a final plot with final DOE, metamodel and LSF.

add

if plots are to be added to a current device.

output_dir

if plots are to be saved in jpeg in a given directory.

verbose

either 0 for almost no output, 1 for medium-sized output, or 2 for all outputs.
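
As an illustration of how precision constrains the Monte-Carlo population size, here is a minimal sketch (the standard crude Monte-Carlo relation, not package code): the coefficient of variation of the estimator is cov = sqrt((1 - p) / (N * p)), hence the population size needed for a target cov at an expected probability p.

# Hypothetical helper, not part of mistral: population size needed so that a
# crude Monte-Carlo estimate of a probability p reaches a target coefficient
# of variation, from cov = sqrt((1 - p) / (N * p)).
required_N <- function(p, cov) ceiling((1 - p) / (cov^2 * p))
required_N(p = 1e-3, cov = 0.05)  # 399600 samples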

Details

The AKMCS strategy is based on an initial Monte-Carlo population which is classified with a kriging-based metamodel; this means that no sampling is done during the refinement steps. The algorithm tries to classify this Monte-Carlo population with a confidence greater than a given value: for instance, the ‘distance’ to the failure threshold should be greater than crit_min standard deviations.

As long as this criterion is not met, the point minimising it is added to the learning database and then evaluated.

Finally, once all points are classified or the maximum number of calls has been reached, crude Monte-Carlo is performed. A final test checks the size of this population against the targeted coefficient of variation; if it is too small, a new population of sufficient size (given the order of magnitude of the estimated probability) is generated and the algorithm is run again. A sketch of the learning criterion is given below.
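
As an illustration, here is a minimal sketch of this learning criterion (the usual AKMCS 'U' criterion; an assumption about the implementation, not the package source), where m and s are the kriging mean and standard deviation on the Monte-Carlo population:

# Sketch only: the usual AKMCS learning criterion U(x) = |m(x) - failure| / s(x).
U_crit <- function(m, s, failure = 0) abs(m - failure) / s
m <- c(-0.5, 0.1, 1.2, -2.0, 0.05)    # dummy kriging means
s <- c(0.30, 0.20, 0.50, 0.10, 0.02)  # dummy kriging standard deviations
u <- U_crit(m, s)
if (min(u) < 2) which.min(u)          # with crit_min = 2, point 2 is evaluated next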

Value

An object of class list containing the failure probability and some more outputs as described below:

p

the estimated failure probability.

cov

the coefficient of variation of the Monte-Carlo probability estimate.

Ncall

the total number of calls to the lsf.

X

the final learning database, i.e. all points where lsf has been calculated.

y

the value of the lsf on the learning database.

h

the sequence of the estimated relative SUR criteria.

I

the sequence of the estimated integrated SUR criteria.

meta_fun

the metamodel approximation of the lsf; calling it returns a list containing the value and the standard deviation (see the usage sketch after this list).

meta_model

the final metamodel, an S4 object from DiceKriging. Note that the algorithm recasts the problem as the estimation of P[lsf(X) < failure], so using ‘predict’ with this object will return values of opposite sign when lower.tail == FALSE; in that case prefer meta_fun, which handles this issue directly.

points

points in the failure domain according to the metamodel.

meta_eval

evaluation of the metamodel on these points.

z_meta

if plot==TRUE, the evaluation of the metamodel on the plot grid.
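
For instance, a hypothetical use of these outputs (field names as documented above; the call is shown as comments since it requires a limit-state function and the mistral package):

# res <- AKMCS(dimension = 2, lsf = kiureghian)
# res$p        # estimated failure probability
# res$cov      # coefficient of variation of the estimate
# res$Ncall    # total number of calls to the lsf
# pred <- res$meta_fun(res$X)  # list with the kriging value and standard deviation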

Note

The problem is supposed to be defined in the standard space; if not, use UtoX to transform it. Furthermore, each time a set of vectors is defined as a matrix, ‘nrow’ = dimension and ‘ncol’ = number of vectors, to be consistent with the as.matrix transformation of a vector.

The algorithm calls lsf(X) (where X is a matrix as defined previously) and expects a vector in return. This allows the user to optimise the computation of a batch of points, either by vectorised computation, or by the use of external codes (optimised C or C++ codes, for example) and/or parallel computation; see the examples in MonteCarlo. A minimal sketch of such an lsf is given below.
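
A minimal sketch of an lsf respecting these conventions (lsf_example is a hypothetical function, not part of the package):

# Hypothetical lsf: takes a dimension x N matrix, returns a vector of length N.
lsf_example <- function(X) {
  X <- as.matrix(X)   # so that a single point given as a vector becomes a column
  4 - colSums(X^2)    # one value per column, i.e. per sample
}
lsf_example(matrix(rnorm(20), nrow = 2))  # 10 values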

Author(s)

Clement WALTER clementwalter@icloud.com

See Also

SubsetSimulation MonteCarlo MetaIS km (in package DiceKriging)

Examples

## Not run: 
res = AKMCS(dimension=2, lsf=kiureghian, plot=TRUE)

#Compare with crude Monte-Carlo reference value
N = 500000
dimension = 2
U = matrix(rnorm(dimension*N),dimension,N)
G = kiureghian(U)
P = mean(G<0)
cov = sqrt((1-P)/(N*P))

## End(Not run)

# See the impact of kernel choice with the serial system function from Waarts:
waarts = function(u) {
  u = as.matrix(u)
  b1 = 3 + (u[1,] - u[2,])^2/10 - sign(u[1,] + u[2,])*(u[1,] + u[2,])/sqrt(2)
  b2 = sign(u[2,] - u[1,])*(u[1,] - u[2,]) + 7/sqrt(2)
  apply(cbind(b1, b2), 1, min)  # return the per-sample minimum of the two branches
}

## Not run: 
res = list()
res$matern5_2 = AKMCS(2, waarts, plot=TRUE)
res$matern3_2 = AKMCS(2, waarts, kernel="matern3_2", plot=TRUE)
res$gaussian  = AKMCS(2, waarts, kernel="gauss", plot=TRUE)
res$exp       = AKMCS(2, waarts, kernel="exp", plot=TRUE)

#Compare with crude Monte-Carlo reference value
N = 500000
dimension = 2
U = matrix(rnorm(dimension*N),dimension,N)
G = waarts(U)
P = mean(G<0)
cov = sqrt((1-P)/(N*P))

## End(Not run)
