desparsify: De-Sparsified Graphical Lasso Estimator


View source: R/desparsify.R

Description


Compute the de-sparsified (sometimes called "de-biased") glasso estimator with the approach described in Equation 7 of Jankova and van de Geer (2015). The basic idea is to undo the L1 regularization in order to compute p-values and confidence intervals (i.e., to make statistical inference).

Usage

desparsify(object, ...)

Arguments

object

An object of class ggmncv.

...

Currently ignored.

Details

According to Jankova and van de Geer (2015), the de-sparsified estimator, \hat{\mathbf{T}}, is defined as

\hat{\mathbf{T}} = 2\hat{\boldsymbol{\Theta}} - \hat{\boldsymbol{\Theta}} \hat{\mathbf{R}} \hat{\boldsymbol{\Theta}},

where \hat{\boldsymbol{\Theta}} denotes the graphical lasso estimator of the precision matrix and \hat{\mathbf{R}} is the sample correlation matrix. Further details can be found in Section 2 ("Main Results") of Jankova and van de Geer (2015).

This approach builds upon earlier work on the de-sparsified lasso estimator (Javanmard and Montanari, 2014; van de Geer et al., 2014; Zhang and Zhang, 2014).
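
As an illustration of the formula above (a minimal sketch, not the internal GGMncv implementation; desparsify_by_hand, Theta, and R are placeholder names), the computation in base R might look like:

# de-sparsified estimator: T = 2 * Theta - Theta %*% R %*% Theta,
# with Theta a graphical lasso estimate of the precision matrix and
# R the sample correlation matrix
desparsify_by_hand <- function(Theta, R) {
  That <- 2 * Theta - Theta %*% R %*% Theta
  # standard precision-to-partial-correlation transform:
  # -T_ij / sqrt(T_ii * T_jj), with a zeroed diagonal
  P <- -cov2cor(That)
  diag(P) <- 0
  list(Theta = That, P = P)
}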

Value

The de-sparsified estimates, including the precision matrix (Theta) and the partial correlation matrix (P; see the Examples).

Note

This assumes (reasonably) Gaussian data, and should not be expected to work for, say, polychoric correlations. Further, all work to date has only considered the graphical lasso estimator, not de-sparsifying nonconvex regularization. Accordingly, it is probably best to set penalty = "lasso" in ggmncv.

This function only provides the de-sparsified estimator, not p-values or confidence intervals (see inference).
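
For intuition only, here is a sketch of how Wald-type inference could be formed from a de-sparsified precision matrix, using the asymptotic variance (Theta_ii * Theta_jj + Theta_ij^2) / n given by Jankova and van de Geer (2015); wald_sketch is a made-up name and this is not how inference is implemented:

# illustrative Wald-type z-scores and two-sided p-values for a
# de-sparsified precision matrix That based on n observations
wald_sketch <- function(That, n) {
  # plug-in standard errors: sqrt((T_ii * T_jj + T_ij^2) / n)
  se <- sqrt((tcrossprod(diag(That)) + That^2) / n)
  z  <- That / se
  list(z = z, p_value = 2 * pnorm(abs(z), lower.tail = FALSE))
}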

References

Jankova, J., & van de Geer, S. (2015). Confidence intervals for high-dimensional inverse covariance estimation. Electronic Journal of Statistics, 9(1), 1205-1229.

Javanmard, A., & Montanari, A. (2014). Confidence intervals and hypothesis testing for high-dimensional regression. Journal of Machine Learning Research, 15(1), 2869-2909.

van de Geer, S., Bühlmann, P., Ritov, Y., & Dezeure, R. (2014). On asymptotically optimal confidence regions and tests for high-dimensional models. The Annals of Statistics, 42(3), 1166-1202.

Zhang, C.-H., & Zhang, S. S. (2014). Confidence intervals for low dimensional parameters in high dimensional linear models. Journal of the Royal Statistical Society: Series B, 76(1), 217-242.

Examples

# load the package and data
library(GGMncv)

Y <- GGMncv::Sachs[, 1:5]

n <- nrow(Y)
p <- ncol(Y)

# fit model
# note: fix lambda, as in the reference
fit <- ggmncv(cor(Y), n = nrow(Y),
              progress = FALSE,
              penalty = "lasso",
              lambda = sqrt(log(p)/n))

# fit model
# note: no regularization
fit_non_reg <- ggmncv(cor(Y), n = nrow(Y),
                      progress = FALSE,
                      penalty = "lasso",
                      lambda = 0)


# remove (some) bias and sparsity
That <- desparsify(fit)

# graphical lasso estimator
fit$P

# de-sparsified estimator
That$P

# mle
fit_non_reg$P
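
# As a rough illustration (using the objects created above), the
# de-sparsified partial correlations would typically be expected to
# sit closer to the MLE than the regularized estimates:

# maximum absolute difference from the MLE
max(abs(fit$P - fit_non_reg$P))
max(abs(That$P - fit_non_reg$P))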
