View source: R/wlr.power.maxcombo.R
Power Calculation for Group Sequential Design Using a Max-Combo Test
Usage

wlr.power.maxcombo(
n = 600,
r = 1,
DCO = c(24, 36),
alpha = c(0.02, 0.03)/2,
h0 = function(t) { log(2)/12 },
S0 = function(t) { exp(-log(2)/12 * t) },
h1 = function(t) { log(2)/12 * 0.7 },
S1 = function(t) { exp(-log(2)/12 * 0.7 * t) },
f.ws = list(IA1 = list(function(s) { 1 }),
            FA = list(function(s) { 1 }, function(s) { s * (1 - s) })),
cuts = NULL,
Lambda = function(t) { (t/18)^1.5 * as.numeric(t <= 18) + as.numeric(t > 18) },
G0 = function(t) { 0 },
G1 = function(t) { 0 },
mu.method = "Schoenfeld",
cov.method = "H0"
)
Arguments

n: Total sample size for the two arms.

r: Randomization ratio of experimental arm : control arm as r:1. When r = 1, allocation is equal. Default r = 1.

DCO: Analysis times, calculated from first subject in.

alpha: Allocated one-sided alpha levels; sum(alpha) is the total type I error. If an alpha spending function a(t) is used at information times c(t1, ..., tK), then alpha1 = a(t1), alpha2 = a(t2) - a(t1), ..., alphaK = a(tK) - a(t_(K-1)), and the total alpha across all analyses is a(tK).

h0: Hazard function of the control arm. h0(t) = log(2)/m0 means T ~ exponential with median m0.

S0: Survival function of the control arm. In general, S0(t) = exp(-integral of h0(u) du from 0 to t), but providing S0(t) directly improves computational efficiency, and the survival function is usually known at the design stage. The density function is f0(t) = h0(t) * S0(t).

h1: Hazard function of the experimental arm. h1(t) = log(2)/m1 means T ~ exponential with median m1.

S1: Survival function of the experimental arm. In general, S1(t) = exp(-integral of h1(u) du from 0 to t), but providing S1(t) directly improves computational efficiency. The density function is f1(t) = h1(t) * S1(t).

f.ws: Self-defined weight function(s) of the survival rate, e.g., f.ws = function(s) 1/max(s, 0.25). When f.ws is specified, the sFH parameters (rho, gamma, tau, s.tau) are ignored.

cuts: A vector of cut points defining the piecewise distributions. If cuts is not specified, or is specified incorrectly, numerical integration may occasionally fail.

Lambda: Cumulative distribution function of enrollment time.

G0: Cumulative distribution function of drop-off for the control arm, e.g., G0 = function(t) 1 - exp(-0.03/12 * t) corresponds to 3 percent drop-off over 12 months of follow-up.

G1: Cumulative distribution function of drop-off for the experimental arm, defined analogously to G0.

mu.method: Method for the mean of the weighted log-rank Z statistic: "Schoenfeld" or "H1".

cov.method: Method for the covariance matrix in the power calculation: "H0", "H1", or "H1.LA", i.e., under the null hypothesis H0, under H1, or under H1 with a local alternative.

b: Rejection boundaries on the normalized Z scale. If b is NULL (the default), the boundaries are calculated from alpha at each analysis time. If b is provided, alpha is ignored.

rho: Parameter rho of the Fleming-Harrington (rho, gamma) weighted log-rank test.

gamma: Parameter gamma of the Fleming-Harrington (rho, gamma) weighted log-rank test. For the standard log-rank test, set rho = gamma = 0.

tau: Cut point for the stabilized FH test sFH(rho, gamma, tau), whose weight function is w(t) = s_tilde^rho * (1 - s_tilde)^gamma, where s_tilde = max(s(t), s.tau), or max(s(t), s(tau)) when s.tau = NULL. tau = Inf reduces to the regular Fleming-Harrington (rho, gamma) test.

s.tau: Survival rate cut S(tau) at t = tau; default 0.5, i.e., cut at the median. s.tau = 0 reduces to the regular Fleming-Harrington (rho, gamma) test.
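The incremental-alpha arithmetic described under the alpha argument can be sketched numerically. A minimal illustration (the spending function a and the information fractions below are toy values, not package defaults):

```r
# Toy one-sided spending function with total alpha a(1) = 0.025 (illustrative)
a <- function(t) 0.025 * t^2
t.info <- c(0.6, 1)                # information fractions at IA1 and FA
alpha <- diff(c(0, a(t.info)))     # alpha1 = a(t1), alpha2 = a(t2) - a(t1)
sum(alpha)                         # equals a(tK) = 0.025, the total type I error
```

The resulting vector can be passed directly as the alpha argument.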
Value

An object containing the data frames below:

n: Total number of subjects for the two arms
DCO: Expected analysis time
targetEvents: Expected number of events
power: Power of the max-combo test at each analysis
overall.power: Overall power of the study
incr.power: Incremental power at each analysis; the incremental powers sum to the overall power
medians: Median survival for each treatment group
b: Expected rejection boundary on the z scale
Expected_HR: Expected hazard ratio
Omega0: Covariance matrix under H0
Omega1: Covariance matrix under the hypothesis requested via cov.method
Examples

# Distributions for both arms
m0 = 12  # median OS for control arm
lambda0 = log(2) / m0
h0 = function(t) { lambda0 }
S0 = function(t) { exp(-lambda0 * t) }
HRd = 0.60  # hazard ratio after the delay
h.D3 = function(t) { lambda0 * as.numeric(t < 3) + HRd * lambda0 * as.numeric(t >= 3) }
c3 = exp(-3 * lambda0 * (1 - HRd))
S.D3 = function(t) { S0(t) * as.numeric(t < 3) + c3 * exp(-HRd * lambda0 * t) * as.numeric(t >= 3) }
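# Illustrative sanity check (not part of the original example): the constant
# c3 makes the delayed-effect survival S.D3 continuous at the delay point t = 3.
stopifnot(abs(S.D3(3) - S0(3)) < 1e-12)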
# Define weight functions for weighted log-rank tests
lr = function(s) { 1 }            # standard log-rank
fh01 = function(s) { 1 - s }      # Fleming-Harrington (0, 1)
fh11 = function(s) { s * (1 - s) }  # Fleming-Harrington (1, 1)
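# Illustrative note: fh01 = 1 - s down-weights early events, while
# fh11 = s * (1 - s) emphasizes mid-study differences, peaking at s = 0.5.
stopifnot(abs(fh11(0.5) - 0.25) < 1e-12)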
# Enrollment over an 18-month ramp-up; no drop-off in either arm
Lambda = function(t) { (t/18)^1.5 * as.numeric(t <= 18) + as.numeric(t > 18) }
G0 = function(t) { 0 }
G1 = function(t) { 0 }
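# Illustrative check: under this enrollment curve, half of the patients are
# enrolled by t = 18 * 0.5^(2/3), about 11.3 months.
stopifnot(abs(Lambda(18 * 0.5^(2/3)) - 0.5) < 1e-12)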
# Schoenfeld method with power based on covariance matrix under H0
wlr.power.maxcombo(DCO = c(24, 36),
                   alpha = c(0.01, 0.04)/2,
                   r = 1, n = 500,
                   h0 = h0, S0 = S0, h1 = h.D3, S1 = S.D3,
                   f.ws = list(IA1 = list(lr), FA = list(fh01)),
                   Lambda = Lambda, G0 = G0, G1 = G1,
                   mu.method = "Schoenfeld", cov.method = "H0")

# Schoenfeld method with power based on covariance matrix under H1
wlr.power.maxcombo(DCO = c(24, 36),
                   alpha = c(0.01, 0.04)/2,
                   r = 1, n = 500,
                   h0 = h0, S0 = S0, h1 = h.D3, S1 = S.D3,
                   f.ws = list(IA1 = list(lr), FA = list(fh01)),
                   Lambda = Lambda, G0 = G0, G1 = G1,
                   mu.method = "Schoenfeld", cov.method = "H1")

# Schoenfeld method with power based on covariance matrix under H1
# in local alternative (simplified)
wlr.power.maxcombo(DCO = c(24, 36),
                   alpha = c(0.01, 0.04)/2,
                   r = 1, n = 500,
                   h0 = h0, S0 = S0, h1 = h.D3, S1 = S.D3,
                   f.ws = list(IA1 = list(lr), FA = list(fh01)),
                   Lambda = Lambda, G0 = G0, G1 = G1,
                   mu.method = "Schoenfeld", cov.method = "H1.LA")

# Mean(Z) under H1 with power based on covariance matrix under H0
wlr.power.maxcombo(DCO = c(24, 36),
                   alpha = c(0.01, 0.04)/2,
                   r = 1, n = 500,
                   h0 = h0, S0 = S0, h1 = h.D3, S1 = S.D3,
                   f.ws = list(IA1 = list(lr), FA = list(fh01)),
                   Lambda = Lambda, G0 = G0, G1 = G1,
                   mu.method = "H1", cov.method = "H0")

# Mean(Z) under H1 with power based on covariance matrix under H1
wlr.power.maxcombo(DCO = c(24, 36),
                   alpha = c(0.01, 0.04)/2,
                   r = 1, n = 500,
                   h0 = h0, S0 = S0, h1 = h.D3, S1 = S.D3,
                   f.ws = list(IA1 = list(lr), FA = list(fh01)),
                   Lambda = Lambda, G0 = G0, G1 = G1,
                   mu.method = "H1", cov.method = "H1")

# Mean(Z) under H1 with power based on covariance matrix under H1
# in local alternative (simplified)
wlr.power.maxcombo(DCO = c(24, 36),
                   alpha = c(0.01, 0.04)/2,
                   r = 1, n = 500,
                   h0 = h0, S0 = S0, h1 = h.D3, S1 = S.D3,
                   f.ws = list(IA1 = list(lr), FA = list(fh01)),
                   Lambda = Lambda, G0 = G0, G1 = G1,
                   mu.method = "H1", cov.method = "H1.LA")

# Max-combo (log-rank, FH11) at FA;
# Mean(Z) under H1 with power based on covariance matrix under H1
# in local alternative (simplified)
wlr.power.maxcombo(DCO = c(24, 36),
                   alpha = c(0.01, 0.04)/2,
                   r = 1, n = 500,
                   h0 = h0, S0 = S0, h1 = h.D3, S1 = S.D3,
                   f.ws = list(IA1 = list(lr), FA = list(lr, fh11)),
                   Lambda = Lambda, G0 = G0, G1 = G1,
                   mu.method = "H1", cov.method = "H1.LA")

# Max-combo (log-rank, FH11) at FA;
# Mean(Z) under H1 with power based on covariance matrix under H1
wlr.power.maxcombo(DCO = c(24, 36),
                   alpha = c(0.01, 0.04)/2,
                   r = 1, n = 500,
                   h0 = h0, S0 = S0, h1 = h.D3, S1 = S.D3,
                   f.ws = list(IA1 = list(lr), FA = list(lr, fh11)),
                   Lambda = Lambda, G0 = G0, G1 = G1,
                   mu.method = "H1", cov.method = "H1")