klcv    R Documentation

Description

Model selection criterion based on the leave-one-out cross-validated Kullback-Leibler divergence.
Usage

klcv(object, X, scale = 1)
Arguments

object    a fitted sglasso object.

X         the matrix used to compute the empirical variance/covariance matrix. Its dimension is N x p.

scale     scalar value used to scale the estimated degrees-of-freedom. See Details below.
Details

The klcv function implements the leave-one-out cross-validated Kullback-Leibler divergence criterion proposed in Vujacic et al. (2015). For l_1-penalized Gaussian graphical models this measure of goodness-of-fit has the form

klcv(\rho) = -\frac{\ell(\hat K(\rho))}{N} + \frac{\mathtt{scale}}{2N}\,\mathrm{gdf}(\hat K(\rho)),

where \hat K(\rho) is the glasso estimate of the concentration matrix, \ell(\hat K(\rho)) is the corresponding value of the log-likelihood function and \mathtt{scale} is a scale factor for the complexity part, i.e. \mathrm{gdf}(\hat K(\rho)), which is defined as

\mathrm{gdf}(\hat K(\rho)) = \frac{1}{N-1}\sum_{k=1}^{N} \mathrm{vec}\{(\hat K(\rho)^{-1} - S_k)\circ 1_\rho\}'\,\mathrm{vec}[\hat K(\rho)\{(S - S_k)\circ 1_\rho\}\hat K(\rho)].

In the previous expressions S is the empirical variance/covariance matrix, S_k = X_k X_k' (with X_k the k-th row of the data matrix X), 1_\rho is the matrix with entries I(\hat k_{ij}(\rho) \ne 0) and \circ denotes the Hadamard product.
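As an illustration of the two formulas above, the following minimal sketch computes the criterion directly for a single concentration-matrix estimate. The function name klcv_by_hand and its interface are hypothetical (this is not the package implementation), and the log-likelihood is evaluated only up to an additive constant:

klcv_by_hand <- function(K, X, scale = 1) {
    N <- nrow(X)
    S <- crossprod(X) / N              # empirical variance/covariance matrix
    ind <- (K != 0) * 1                # indicator matrix 1_rho
    Kinv <- solve(K)
    gdf <- 0
    for (k in 1:N) {
        Sk <- tcrossprod(X[k, ])       # S_k = X_k X_k'
        a <- as.vector((Kinv - Sk) * ind)             # vec{(K^{-1} - S_k) o 1_rho}
        b <- as.vector(K %*% ((S - Sk) * ind) %*% K)  # vec[K {(S - S_k) o 1_rho} K]
        gdf <- gdf + sum(a * b)
    }
    gdf <- gdf / (N - 1)
    # Gaussian log-likelihood, up to an additive constant
    loglik <- (N / 2) * (as.numeric(determinant(K)$modulus) - sum(S * K))
    -loglik / N + scale * gdf / (2 * N)
}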
Value

klcv returns an S3 object of class "klcv", i.e. a named list with the following components:
klcv        the vector with the leave-one-out cross-validated Kullback-Leibler divergence;

rho         the rho-values used to compute the leave-one-out cross-validated Kullback-Leibler divergence;

loglik      a vector with the log-likelihood computed for the sequence of weighted l1-penalized RCON(V, E) models;

gdf         a vector with the generalized degrees-of-freedom;

scale       the scale value used to define the leave-one-out cross-validated Kullback-Leibler divergence;

min.klcv    the minimum value of the leave-one-out cross-validated Kullback-Leibler divergence;

rho.opt     the rho-value corresponding to the minimum of the leave-one-out cross-validated Kullback-Leibler divergence;

rhoid       the index of the rho-value identified by the leave-one-out cross-validated Kullback-Leibler divergence criterion.
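For instance, assuming out.klcv holds the object returned by klcv() (as in the Examples below), the selected penalty value can be read off these components:

out.klcv$rho.opt                  # rho-value minimizing the criterion
out.klcv$min.klcv                 # the attained minimum
out.klcv$klcv[out.klcv$rhoid]     # the same minimum, recovered via the index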
Author(s)

Luigi Augugliaro
Maintainer: Luigi Augugliaro <luigi.augugliaro@unipa.it>
References

Vujacic, I., Abbruzzo, A. and Wit, E. C. (2015). A computationally fast alternative to cross-validation in penalized Gaussian graphical models. J. Stat. Comput. Simul.
See Also

sglasso and loglik functions and the plot.klcv method.
Examples

N <- 100
p <- 5
X <- matrix(rnorm(N * p), N, p)        # simulated N x p data matrix
S <- crossprod(X) / N                  # empirical variance/covariance matrix
# symmetric mask specifying the structure of the RCON(V, E) model
mask <- outer(1:p, 1:p, function(i, j) 0.5^abs(i - j))
mask[1, 5] <- mask[1, 4] <- mask[2, 5] <- NA
mask[5, 1] <- mask[4, 1] <- mask[5, 2] <- NA
out.sglasso_path <- sglasso(S, mask, tol = 1.0e-13)   # fit the weighted l1-penalized path
out.klcv <- klcv(out.sglasso_path, X)                 # compute the klcv criterion
out.klcv
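A natural follow-up is to inspect the criterion along the rho-path with the plot.klcv method listed in See Also:

plot(out.klcv)    # display the leave-one-out cross-validated KL divergence curve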