multiple.tune.onesample: Model selection for one-sample multiple testing


View source: R/multiple.tune.onesample.R

Description

The function multiple.tune.onesample selects the optimal tuning parameters for the multiple testing problem H_0: θ_{r,t}=0 for all 1 ≤ r < t ≤ p.

Usage

multiple.tune.onesample(X, B, verbose = FALSE)

Arguments

X

The n x p data matrix.

B

The set of candidate integer multipliers.

verbose

Whether to print out intermediate iteration steps. Default is FALSE.

Details

The false discovery rate control in the multiple testing problem

H_0: θ_{r,t}=0, 1≤ r < t ≤ p,

is based on approximating the number of false discoveries by {2-2Φ(t)}*p*(p-1)/2. The optimal tuning parameter is therefore selected so that the error of this approximation is as small as possible. The candidate tuning parameters are chosen in a data-driven way, so the user only needs to supply a candidate set of integer multipliers. Details on how the approximation error is defined are available in Xia et al. (2015).
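As a rough illustration of the quantity above (not the package's internal code), the normal-tail estimate of the number of false discoveries among the q = p*(p-1)/2 hypotheses, when rejecting at |statistic| > t, can be written as:

```r
# Estimated number of false discoveries at threshold t for p variables:
# {2 - 2*Phi(t)} * p*(p-1)/2, where Phi is the standard normal CDF.
fd.estimate <- function(t, p) {
  (2 - 2 * pnorm(t)) * p * (p - 1) / 2
}

fd.estimate(2, 50)  # expected false discoveries at t = 2 with p = 50
```

The tuning step compares this theoretical count against the empirically observed number of rejections over a range of thresholds, for each candidate multiplier in B.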

Value

error

The empirical version of the approximation error, evaluated over the range of integer multipliers in B.

b

The optimal integer multiplier.

Author(s)

Jing Ma

References

Xia, Y., Cai, T., & Cai, T. T. (2015). Testing differential networks with applications to the detection of gene-gene interactions. Biometrika, 102(2), 247-266.

See Also

testOneBMN

Examples

library(glmnet)
library(igraph)  # sample_pa, as_adjacency_matrix
library(gdata)   # upperTriangle

set.seed(1)

p = 50    # number of variables
n = 100   # number of observations per replicate
n0 = 1000 # burn-in length
rho_high = 0.5  # signal strength
rho_low = 0.1   # signal strength
ncond = 2       # number of conditions to compare
eps = 8/n       # tolerance for extremely unbalanced binary variables
q = (p*(p - 1))/2

##---(1) Generate the network  
g_sf = sample_pa(p, directed=FALSE)
Amat = as.matrix(as_adjacency_matrix(g_sf, type="both"))

##---(2) Generate the Theta  
weights = matrix(0, p, p)
upperTriangle(weights) = runif(q, rho_low, rho_high) * (2*rbinom(q, 1, 0.5) - 1)
weights = weights + t(weights)
Theta = weights * Amat
dat = BMN.samples(Theta, n, n0, skip=1)
tmp = sapply(1:p, function(i) as.numeric(table(dat[,i]))[1]/n )
while(min(tmp)<eps || 1-max(tmp)<eps){
  dat = BMN.samples(Theta, n, n0, skip=10)
  tmp = sapply(1:p, function(i) as.numeric(table(dat[,i]))[1]/n )
}

tune = multiple.tune.onesample(dat, 1:20, verbose = TRUE)
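The returned list can then be inspected to see which multiplier minimizes the empirical approximation error. A minimal sketch, assuming the `error` component is ordered to match the candidates in `B = 1:20`:

```r
# Plot the empirical approximation error over candidate multipliers
# and mark the selected optimum tune$b
plot(1:20, tune$error, type = "b",
     xlab = "integer multiplier b", ylab = "empirical approximation error")
abline(v = tune$b, lty = 2)  # selected multiplier
```

The selected value `tune$b` can subsequently be used with the one-sample test (see testOneBMN).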

jingmafdu/TestBMN documentation built on Feb. 20, 2022, 5:24 p.m.