Description

sobol implements the Monte Carlo estimation of the Sobol' sensitivity indices. This method allows the estimation of the indices of the variance decomposition, sometimes referred to as functional ANOVA decomposition, up to a given order, at a total cost of (N + 1) * n model evaluations, where N is the number of indices to estimate and n is the size of each sample. This function also allows the estimation of the so-called subset indices, i.e. the first-order indices with respect to single multidimensional inputs (groups of factors).
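When the model cannot be passed as an R function, the indices can be estimated from externally computed responses. The following is a minimal sketch of that decoupled pattern, assuming the sensitivity package's sobol(model = NULL, ...) / tell() interface and the sobol.fun test function used in the example further down; argument defaults here are assumptions and should be checked against the package's Usage section.

library(sensitivity)

# Build the design without attaching a model (decoupled workflow sketch:
# run the model outside of sobol, then pass the responses back).
n <- 1000
X1 <- data.frame(matrix(runif(8 * n), nrow = n))
X2 <- data.frame(matrix(runif(8 * n), nrow = n))
x <- sobol(model = NULL, X1 = X1, X2 = X2, order = 1, nboot = 100)

y <- sobol.fun(x$X)  # evaluate the model on the generated design x$X
tell(x, y)           # estimate the indices from the external responses
print(x)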
Arguments

model: a function, or a model with a predict method, defining the model to analyze.
X1: the first random sample.
X2: the second random sample.
order: either an integer, the maximum order in the ANOVA decomposition (all indices up to this order will be computed), or a list of numeric vectors, the multidimensional compounds of the wanted subset indices (see the sketch after this list).
nboot: the number of bootstrap replicates.
conf: the confidence level for bootstrap confidence intervals.
x: a list of class "sobol" storing the state of the sensitivity study (parameters, data, estimates).
y: a vector of model responses.
return.var: a vector of character strings giving further internal variable names to store in the output object x.
ylim: y-coordinate plotting limits.
...: any other arguments for model which are passed unchanged each time it is called.
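A hedged sketch of the two forms the order argument can take. The list form and the particular grouping below (inputs 1-2 and 3-5) are illustrative assumptions about how the multidimensional compounds are specified, not taken from this page:

library(sensitivity)

n <- 1000
X1 <- data.frame(matrix(runif(8 * n), nrow = n))
X2 <- data.frame(matrix(runif(8 * n), nrow = n))

# Integer order: all first- and second-order indices of the 8 inputs.
x_full <- sobol(model = sobol.fun, X1 = X1, X2 = X2, order = 2)

# List of numeric vectors: subset indices, i.e. the first-order indices
# of the multidimensional inputs (X1, X2) and (X3, X4, X5) -- an
# arbitrary grouping chosen only for illustration.
x_subset <- sobol(model = sobol.fun, X1 = X1, X2 = X2,
                  order = list(c(1, 2), c(3, 4, 5)))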
Value

sobol returns a list of class "sobol", containing all the input arguments detailed before, plus the following components:

call: the matched call.
X: a data.frame containing the design of experiments.
y: a vector of model responses.
V: the estimations of the Variances of the Conditional Expectations (VCE) with respect to one factor or one group of factors.
D: the estimations of the terms of the ANOVA decomposition (not for subset indices).
S: the estimations of the Sobol' sensitivity indices (not for subset indices).
Users can ask for more output variables with the argument return.var (for example, the bootstrap outputs V.boot, D.boot and S.boot).
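As a sketch of how return.var might be used, assuming the decoupled tell() pattern shown above: that the requested variables end up as components of x follows from the return.var description, but the exact structure of the bootstrap objects is not asserted here.

library(sensitivity)

n <- 1000
X1 <- data.frame(matrix(runif(8 * n), nrow = n))
X2 <- data.frame(matrix(runif(8 * n), nrow = n))

# Build the design, run the model externally, then ask tell() to keep
# the bootstrap outputs in the returned object as well.
x <- sobol(model = NULL, X1 = X1, X2 = X2, order = 1, nboot = 100)
y <- sobol.fun(x$X)
tell(x, y, return.var = c("V.boot", "D.boot", "S.boot"))

str(x$S.boot)  # bootstrap output for the Sobol' indices (structure not asserted)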
References

I. M. Sobol, 1993, Sensitivity analysis for non-linear mathematical model, Math. Modelling Comput. Exp., 1, 407-414.
Examples

# Test case : the non-monotonic Sobol g-function
# The method of sobol requires 2 samples
# (there are 8 factors, all following the uniform distribution on [0,1])
n <- 1000
X1 <- data.frame(matrix(runif(8 * n), nrow = n))
X2 <- data.frame(matrix(runif(8 * n), nrow = n))
# sensitivity analysis
x <- sobol(model = sobol.fun, X1 = X1, X2 = X2, order = 2, nboot = 100)
print(x)
#plot(x)
Call:
sobol(model = sobol.fun, X1 = X1, X2 = X2, order = 2, nboot = 100)
Model runs: 37000
Sobol indices
original bias std. error min. c.i. max. c.i.
X1 0.768511063 0.0007268986 0.05012932 0.658158516 0.86439591
X2 0.140229174 -0.0079044206 0.06060970 0.005782988 0.26092960
X3 0.009363961 -0.0094405404 0.06366132 -0.106907999 0.13609935
X4 -0.031649473 -0.0079924660 0.06232238 -0.148386241 0.09611522
X5 -0.013416823 -0.0082005795 0.06295162 -0.125673398 0.12155684
X6 -0.012123780 -0.0083443713 0.06321758 -0.126608148 0.12353326
X7 -0.012292538 -0.0083576475 0.06307308 -0.125944112 0.12266767
X8 -0.013411463 -0.0084403763 0.06304021 -0.126209293 0.12152019
X1*X2 0.063485348 0.0091612164 0.07678676 -0.097505656 0.20465376
X1*X3 0.028045949 0.0080029406 0.06632299 -0.116126141 0.15032207
X1*X4 0.022779490 0.0082781435 0.06341410 -0.108931127 0.13910053
X1*X5 0.012545655 0.0083380562 0.06294747 -0.122207025 0.12662811
X1*X6 0.013827827 0.0082530899 0.06308891 -0.121148462 0.12763385
X1*X7 0.012520805 0.0083131731 0.06306090 -0.122088169 0.12632693
X1*X8 0.013116935 0.0082074013 0.06310566 -0.121588377 0.12730232
X2*X3 0.007401685 0.0080526384 0.06392742 -0.132974200 0.11864049
X2*X4 0.010088694 0.0082868086 0.06301706 -0.127480077 0.12246903
X2*X5 0.013520112 0.0082490902 0.06308507 -0.121451167 0.12751583
X2*X6 0.013813780 0.0083265357 0.06300070 -0.121117693 0.12791120
X2*X7 0.013839285 0.0082877742 0.06300839 -0.120854822 0.12760082
X2*X8 0.013945021 0.0083525654 0.06304440 -0.121303005 0.12751772
X3*X4 0.013349779 0.0084105878 0.06290093 -0.120268758 0.12674229
X3*X5 0.013570454 0.0082863730 0.06305159 -0.121394473 0.12725812
X3*X6 0.013589242 0.0082791046 0.06304565 -0.121377845 0.12727869
X3*X7 0.013623538 0.0082838482 0.06305192 -0.121457490 0.12734158
X3*X8 0.013734527 0.0082831874 0.06305038 -0.121171112 0.12741414
X4*X5 0.013588374 0.0082801069 0.06304150 -0.121361325 0.12724321
X4*X6 0.013558625 0.0082931185 0.06304126 -0.121388935 0.12725220
X4*X7 0.013493024 0.0082863955 0.06304590 -0.121410591 0.12723550
X4*X8 0.013532779 0.0082905033 0.06304030 -0.121385810 0.12719205
X5*X6 0.013542643 0.0082851877 0.06304827 -0.121394454 0.12725147
X5*X7 0.013538355 0.0082840619 0.06304752 -0.121395010 0.12724692
X5*X8 0.013540262 0.0082847543 0.06304791 -0.121389893 0.12725540
X6*X7 0.013529130 0.0082856975 0.06304874 -0.121408076 0.12724878
X6*X8 0.013536114 0.0082841736 0.06304787 -0.121400209 0.12725417
X7*X8 0.013547608 0.0082859475 0.06304702 -0.121387157 0.12725311
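The reported number of model runs is consistent with the cost formula from the Description: with 8 inputs and order = 2 there are 8 first-order and choose(8, 2) = 28 second-order indices, so N = 36 and (N + 1) * n = 37 * 1000 = 37000 evaluations.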