vrPEER: Graph-constrained regression with variable-reduction...

Description Usage Arguments Value References Examples

Description

Graph-constrained regression with variable-reduction procedure to handle the non-invertibility of a graph-originated penalty matrix (see: References).

Bootstrap confidence interval computation is available (disabled by default).
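The variable-reduction step addresses the fact that a graph Laplacian always has at least one zero eigenvalue (its rows sum to zero), so it cannot be inverted directly. A minimal standalone sketch of this non-invertibility (illustrative only; it does not use mdpeer):

```r
# Laplacian of a complete graph on 4 nodes: every row sums to zero, so the
# all-ones vector is an eigenvector with eigenvalue 0 and L is singular.
A <- matrix(1, nrow = 4, ncol = 4)
diag(A) <- 0                  # adjacency matrix, no self-loops
L <- diag(rowSums(A)) - A     # unnormalized graph Laplacian
eigen(L)$values               # smallest eigenvalue is 0, so L is not invertible
```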

Usage

vrPEER(Q, y, Z, X = NULL, sv.thr = 1e-05, compute.boot.CI = FALSE,
  boot.R = 1000, boot.conf = 0.95, boot.set.seed = TRUE,
  boot.parallel = "multicore", boot.ncpus = 4, verbose = TRUE)

Arguments

Q

graph-originated penalty matrix (p \times p); typically: a graph Laplacian matrix

y

matrix of response values (n \times 1)

Z

design matrix (n \times p) of variables modeled as random effects (penalized in the regression); assumed to be already standardized

X

design matrix (n \times k) of variables modeled as fixed effects (not penalized in the regression); should contain a column of 1s if an intercept is to be included in the model

sv.thr

threshold value below which singular values of Q are treated as zeros

compute.boot.CI

logical; whether to compute bootstrap confidence intervals for the b regression coefficient estimates

boot.R

number of bootstrap replications used in bootstrap confidence intervals computation

boot.conf

confidence level assumed in bootstrap confidence intervals computation

boot.set.seed

logical; whether to set a seed in the bootstrap confidence interval computation

boot.parallel

value of the parallel argument passed to the boot function in the bootstrap confidence interval computation

boot.ncpus

value of the ncpus argument passed to the boot function in the bootstrap confidence interval computation

verbose

logical; whether to run in verbose mode (print function execution messages)
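To illustrate the role of the sv.thr cutoff (a sketch under the assumption that Q is a graph Laplacian; the matrices below are hypothetical and not produced by mdpeer): singular values of Q falling below the threshold are treated as zeros, which determines the effective rank used by the variable-reduction procedure.

```r
# Laplacian of a complete graph on 4 nodes, used here as the penalty matrix Q.
A <- matrix(1, nrow = 4, ncol = 4)
diag(A) <- 0
Q <- diag(rowSums(A)) - A
sv <- svd(Q)$d        # singular values; one of them is exactly 0
sum(sv > 1e-05)       # effective rank of Q: 3 (one singular value below the threshold)
```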

Value

b.est

vector of b coefficient estimates

beta.est

vector of β coefficient estimates

lambda.Q

λ_Q regularization parameter value

boot.CI

data frame with two columns, lower and upper, containing the lower and upper bounds of the bootstrap confidence intervals for the b regression coefficient estimates

References

Karas, M., Brzyski, D., Dzemidzic, M., Goñi, J., Kareken, D.A., Randolph, T.W., Harezlak, J. (2017). Brain connectivity-informed regularization methods for regression. doi: https://doi.org/10.1101/117945

Examples

library(mdpeer)
library(ggplot2)
set.seed(1234)
n <- 200
p1 <- 10
p2 <- 90
p <- p1 + p2
# Define graph adjacency matrix
A <- matrix(rep(0, p*p), nrow = p, ncol = p)
A[1:p1, 1:p1] <- 1
A[(p1+1):p, (p1+1):p] <- 1
L <- Adj2Lap(A)
# Define Q penalty matrix as the normalized graph Laplacian
Q <- L2L.normalized(L)
# Define Z, X design matrices and outcome y
Z <- matrix(rnorm(n*p), nrow = n, ncol = p)
b.true <- c(rep(1, p1), rep(0, p2))
X <- matrix(rnorm(n*3), nrow = n, ncol = 3)
beta.true <- runif(3)
intercept <- 0
eta <- intercept + Z %*% b.true + X %*% beta.true
R2 <- 0.5
sd.eps <- sqrt(var(eta) * (1 - R2) / R2)
error <- rnorm(n, sd = sd.eps)
y <- eta + error

## Not run: 
# run vrPEER 
vrPEER.out <- vrPEER(Q, y, Z, X)
plt.df <- data.frame(x = 1:p, 
                     y = vrPEER.out$b.est)
ggplot(plt.df, aes(x = x, y = y, group = 1)) + geom_line()

## End(Not run)

## Not run: 
# run vrPEER with 0.95 confidence intervals
vrPEER.out <- vrPEER(Q, y, Z, X, compute.boot.CI = TRUE, boot.R = 500)
plt.df <- data.frame(x = 1:p, 
                     y = vrPEER.out$b.est, 
                     lo = vrPEER.out$boot.CI[,1], 
                     up =  vrPEER.out$boot.CI[,2])
ggplot(plt.df, aes(x = x, y = y, group = 1)) + geom_line() +  
  geom_ribbon(aes(ymin=lo, ymax=up), alpha = 0.3)

## End(Not run)

mdpeer documentation built on May 2, 2019, 3:36 p.m.