View source: R/6-UserBayesFunctions.R
Description

Plots the sensitivity (derivative) function and calculates the efficiency lower bound (ELB) for a given Bayesian design. Let x be a point in the design space χ. Based on the general equivalence theorem, a design ξ* is optimal if and only if the sensitivity (derivative) function is non-positive for all x in χ and zero whenever x belongs to the support of ξ* (i.e., x equals one of the design points).
For an approximate (continuous) design, when the design space is one- or two-dimensional, the user can verify the optimality of the design visually by inspecting the sensitivity plot. In addition, the proximity of the design to the optimal design can be measured by the ELB without knowing the optimal design itself.
Usage

sensbayes(
formula,
predvars,
parvars,
family = gaussian(),
x,
w,
lx,
ux,
fimfunc = NULL,
prior = list(),
sens.control = list(),
sens.bayes.control = list(),
crt.bayes.control = list(),
plot_3d = c("lattice", "rgl"),
plot_sens = TRUE,
npar = NULL,
calculate_criterion = TRUE,
silent = FALSE,
crtfunc = NULL,
sensfunc = NULL
)
Arguments

formula: A linear or nonlinear model formula; a symbolic description of the model consisting of the predictors and the unknown model parameters.

predvars: A vector of characters. Denotes the predictors in the formula.

parvars: A vector of characters. Denotes the unknown parameters in the formula.

family: A description of the response distribution and the link function to be used in the model. This can be a family function, a call to a family function, or a character string naming the family. Every family function has a link argument that allows specifying the link function to be applied to the response variable. If not specified, default links are used. For details see family.

x: A vector of the design (support) points of the design to be verified.

w: A vector of the corresponding design weights for x.

lx: A vector of lower bounds for the predictors. Should be in the same order as predvars.

ux: A vector of upper bounds for the predictors. Should be in the same order as predvars.

fimfunc: A function that returns the FIM as a matrix; can be given in place of formula.

prior: An object of class cprior; the prior distribution for the unknown parameters, created, for example, by skewnormal or uniform (see 'Examples').

sens.control: Control parameters for calculating the ELB. For details, see sens.control.

sens.bayes.control: A list. Control parameters to verify the general equivalence theorem. For details, see sens.bayes.control.

crt.bayes.control: A list. Control parameters to approximate the integral in the Bayesian criterion at a given design over the parameter space. For details, see crt.bayes.control.

plot_3d: Which package should be used to plot the sensitivity (derivative) function for a two-dimensional design space. Defaults to "lattice".

plot_sens: Plot the sensitivity (derivative) function? Defaults to TRUE.

npar: Number of model parameters. Used when fimfunc is given instead of formula.

calculate_criterion: Calculate the optimality criterion? See 'Details'.

silent: Do not print anything? Defaults to FALSE.

crtfunc: (Optional) a function that specifies an arbitrary criterion. It must have special arguments and output. See 'Details'.

sensfunc: (Optional) a function that specifies the sensitivity function for crtfunc. See 'Details'.
Details

Let Ξ be the space of all approximate designs with k design (support) points x1, x2, ..., xk from the design space χ and corresponding weights w1, ..., wk. Let M(ξ, θ) be the Fisher information matrix (FIM) of a k-point design ξ and let π(θ) be a user-given prior distribution for the vector of unknown parameters θ. A design ξ* is Bayesian D-optimal among all designs on χ if and only if the following inequality holds for all x in χ:

c(x, ξ*) = ∫_Θ tr{M^-1(ξ*, θ) I(x, θ)} π(θ) dθ - p ≤ 0,

with equality at all support points of ξ*. Here, p is the number of model parameters and I(x, θ) is the Fisher information matrix of the single-point design at x. The function c(x, ξ*) is called the sensitivity or derivative function.
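For D-type criteria, the ELB is typically obtained from the maximum of the sensitivity function over the design space; the following is a minimal sketch of this standard relationship (an illustration only, not the package's internal code, and elb_from_sens is a hypothetical helper):

# Hypothetical helper (not part of ICAOD): efficiency lower bound computed from
# the maximum of the sensitivity function c(x, xi) over the design space and the
# number of model parameters p. The bound equals 1 exactly when max_x c(x, xi) = 0,
# i.e. when the design satisfies the equivalence theorem.
elb_from_sens <- function(max_sens, p) {
  p / (p + max_sens)
}
elb_from_sens(max_sens = 0.02, p = 2)  # about 0.99 for a nearly optimal design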
Note

Depending on the complexity of the problem at hand, the CPU time can sometimes be reduced considerably by choosing a less conservative set of values for the tuning parameters tol and maxEval in sens.bayes.control when sens.bayes.control$method = "cubature". Similarly, when sens.bayes.control$method = "quadrature", the CPU time can be reduced by lowering the number of quadrature nodes (the level element). In general, if CPU time matters, the user should find an appropriate speed-accuracy trade-off for her/his own problem. See 'Examples' for more details.

The default values of the tuning parameters in sens.bayes.control are chosen so that accurate sensitivity plots and a high-precision ELB are the primary goals, although the computation may then take a long time. The user should switch to a less conservative set of values via the argument sens.bayes.control whenever the CPU time becomes an issue.
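For illustration, the less conservative settings used in the 'Examples' below can be collected in control lists such as the following (the variable names ctrl_cubature and ctrl_quadrature are only illustrative):

# values taken from the 'Examples' section below
ctrl_cubature <- list(cubature = list(tol = 1e-4, maxEval = 300))
ctrl_quadrature <- list(method = "quadrature", quadrature = list(level = 4))
# either list can then be passed to sensbayes() via the sens.bayes.control argument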
Examples

##################################################################
# Checking the Bayesian D-optimality of a design for the 2Pl model
##################################################################
skew2 <- skewnormal(xi = c(0, 1), Omega = matrix(c(1, -0.17, -0.17, .5), nrow = 2),
alpha = c(-1, 0), lower = c(-3, .1), upper = c(3, 2))
## Not run:
sensbayes(formula = ~1/(1 + exp(-b *(x - a))),
predvars = "x", parvars = c("a", "b"),
family = binomial(),
x = c(-2.50914, -1.16780, -0.36904, 1.29227),
w = c(0.35767, 0.11032, 0.15621, 0.37580),
lx = -3, ux = 3,
prior = skew2)
# took 29 seconds on my system!
## End(Not run)
# That took quite long.
# We re-adjust the tuning parameters in sens.bayes.control to make it faster.
# Note how we drastically reduce maxEval and increase the tolerance.
## Not run:
sensbayes(formula = ~1/(1 + exp(-b *(x - a))),
predvars = "x", parvars = c("a", "b"),
family = binomial(),
x = c(-2.50914, -1.16780, -0.36904, 1.29227),
w = c(0.35767, 0.11032, 0.15621, 0.37580),
lx = -3, ux = 3, prior = skew2,
sens.bayes.control = list(cubature = list(tol = 1e-4, maxEval = 300)))
# took 5 seconds on my system!
## End(Not run)
# Compare it with the following:
sensbayes(formula = ~1/(1 + exp(-b *(x - a))),
predvars = "x", parvars = c("a", "b"),
family = binomial(),
x = c(-2.50914, -1.16780, -0.36904, 1.29227),
w = c(0.35767, 0.11032, 0.15621, 0.37580),
lx = -3, ux = 3, prior = skew2,
sens.bayes.control = list(cubature = list(tol = 1e-4, maxEval = 200)))
# Look at the plot!
# took 3 seconds on my system
########################################################################################
# Checking the Bayesian D-optimality of a design for the 4-parameter sigmoid emax model
########################################################################################
lb <- c(4, 11, 100, 5)
ub <- c(9, 17, 140, 10)
## Not run:
sensbayes(formula = ~ theta1 + (theta2 - theta1)*(x^theta4)/(x^theta4 + theta3^theta4),
predvars = c("x"), parvars = c("theta1", "theta2", "theta3", "theta4"),
x = c(0.78990, 95.66297, 118.42964, 147.55809, 500),
w = c(0.23426, 0.17071, 0.17684, 0.1827, 0.23549),
lx = .001, ux = 500, prior = uniform(lb, ub))
# took 200 seconds on my system
## End(Not run)
# Re-adjust the tuning parameters to have it faster
## Not run:
sensbayes(formula = ~ theta1 + (theta2 - theta1)*(x^theta4)/(x^theta4 + theta3^theta4),
predvars = c("x"), parvars = c("theta1", "theta2", "theta3", "theta4"),
x = c(0.78990, 95.66297, 118.42964, 147.55809, 500),
w = c(0.23426, 0.17071, 0.17684, 0.1827, 0.23549),
lx = .001, ux = 500, prior = uniform(lb, ub),
sens.bayes.control = list(cubature = list(tol = 1e-3, maxEval = 300)))
# took 4 seconds on my system. See how much difference it makes
## End(Not run)
## Not run:
# Now we try it with quadrature. Default is 6 nodes
sensbayes(formula = ~ theta1 + (theta2 - theta1)*(x^theta4)/(x^theta4 + theta3^theta4),
predvars = c("x"), parvars = c("theta1", "theta2", "theta3", "theta4"),
x = c(0.78990, 95.66297, 118.42964, 147.55809, 500),
w = c(0.23426, 0.17071, 0.17684, 0.1827, 0.23549),
sens.bayes.control = list(method = "quadrature"),
lx = .001, ux = 500, prior = uniform(lb, ub))
# 166.519 s
# use fewer nodes to see if we can reduce the CPU time
sensbayes(formula = ~ theta1 + (theta2 - theta1)*(x^theta4)/(x^theta4 + theta3^theta4),
predvars = c("x"), parvars = c("theta1", "theta2", "theta3", "theta4"),
x = c(0.78990, 95.66297, 118.42964, 147.55809, 500),
w = c(0.23426, 0.17071, 0.17684, 0.1827, 0.23549),
sens.bayes.control = list(method = "quadrature",
quadrature = list(level = 3)),
lx = .001, ux = 500, prior = uniform(lb, ub))
# with level = 3 the plot is not accurate enough
# try level = 4: more nodes than 3, but still fewer than the default 6
sensbayes(formula = ~ theta1 + (theta2 - theta1)*(x^theta4)/(x^theta4 + theta3^theta4),
predvars = c("x"), parvars = c("theta1", "theta2", "theta3", "theta4"),
x = c(0.78990, 95.66297, 118.42964, 147.55809, 500),
w = c(0.23426, 0.17071, 0.17684, 0.1827, 0.23549),
sens.bayes.control = list(method = "quadrature",
quadrature = list(level = 4)),
lx = .001, ux = 500, prior = uniform(lb, ub))
## End(Not run)