gclm: l1 penalized loss estimation for GCLM

View source: R/gclm.R

gclm    R Documentation

l1 penalized loss estimation for GCLM

Description

Estimates a sparse continuous time Lyapunov parametrization of a covariance matrix using a lasso (L1) penalty.
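The parametrization can be illustrated without the package: a covariance matrix Sigma is parametrized as the solution of the continuous Lyapunov equation B Sigma + Sigma t(B) + C = 0, with B stable and C diagonal. A minimal base-R sketch (illustrative only, not package code):

```r
## Base-R sketch of the continuous Lyapunov parametrization:
## Sigma solves B %*% Sigma + Sigma %*% t(B) + C = 0,
## with B stable (eigenvalues with negative real part) and C diagonal.
B <- matrix(c(-1, 0.5, 0, -2), 2, 2)  # lower triangular, eigenvalues -1 and -2
C <- diag(2)
## vectorized form: (I %x% B + B %x% I) vec(Sigma) = -vec(C)
K <- kronecker(diag(2), B) + kronecker(B, diag(2))
Sigma <- matrix(solve(K, -as.vector(C)), 2, 2)
max(abs(B %*% Sigma + Sigma %*% t(B) + C))  # residual, numerically zero
```

Since C is symmetric and B is stable, the resulting Sigma is symmetric positive definite, i.e. a valid covariance matrix.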

Usage

gclm(
  Sigma,
  B = -0.5 * diag(ncol(Sigma)),
  C = rep(1, ncol(Sigma)),
  C0 = rep(1, ncol(Sigma)),
  loss = "loglik",
  eps = 0.01,
  alpha = 0.5,
  maxIter = 100,
  lambda = 0,
  lambdac = 0,
  job = 0
)

gclm.path(
  Sigma,
  lambdas = NULL,
  B = -0.5 * diag(ncol(Sigma)),
  C = rep(1, ncol(Sigma)),
  ...
)

Arguments

Sigma

covariance matrix

B

initial B matrix

C

diagonal of initial C matrix

C0

diagonal of penalization matrix

loss

one of "loglik" (default) or "frobenius"

eps

convergence threshold

alpha

line search parameter

maxIter

maximum number of iterations

lambda

penalization coefficient for B

lambdac

penalization coefficient for C

job

integer, one of 0, 1, 10 or 11

lambdas

sequence of lambda values

...

additional arguments passed to gclm

Details

gclm performs proximal gradient descent for the optimization problem

argmin_{B,C} L(\Sigma(B,C)) + \lambda \rho(B) + \lambda_C ||C - C_0||_F^2

subject to B stable and C diagonal, where \rho(B) is the l1 norm of the off-diagonal elements of B.
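The proximal step associated with the off-diagonal l1 penalty is soft-thresholding of the off-diagonal entries of B, with the unpenalized diagonal left unchanged. A minimal base-R sketch (the function name here is illustrative, not a package internal):

```r
## Soft-thresholding proximal operator for rho(B): shrink off-diagonal
## entries towards zero, leave the (unpenalized) diagonal unchanged.
## Illustrative sketch only, not code from the gclm package.
prox_offdiag <- function(B, t) {
  S <- sign(B) * pmax(abs(B) - t, 0)  # elementwise soft-thresholding
  diag(S) <- diag(B)                  # diagonal is not penalized
  S
}
B <- matrix(c(-1, 0.05, 0.3, -2), 2, 2)
res <- prox_offdiag(B, 0.1)  # 0.05 -> 0, 0.3 -> 0.2, diagonal unchanged
```

Entries smaller in magnitude than the threshold are set exactly to zero, which is what produces sparsity in the estimated B.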

gclm.path simply calls gclm iteratively with different lambda values. Warm starts are used, that is, in the i-th call to gclm the B and C matrices are initialized to those obtained in the (i-1)-th call.
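The warm-start pattern can be illustrated with a stand-in one-dimensional proximal gradient solver (purely illustrative; gclm.path itself calls gclm on the full matrix problem):

```r
## Illustration of warm starts along a lambda path, using a stand-in
## one-dimensional lasso solver (not the actual gclm solver).
soft <- function(x, t) sign(x) * pmax(abs(x) - t, 0)
fit_one <- function(y, b0, lambda, iters = 25) {
  b <- b0
  for (k in seq_len(iters)) b <- soft(b - (b - y), lambda)  # grad step + prox
  b
}
y <- 2
lambdas <- c(1, 0.5, 0.1)
b <- 0
path <- numeric(length(lambdas))
for (i in seq_along(lambdas)) {
  b <- fit_one(y, b, lambdas[i])  # initialized at the previous solution
  path[i] <- b
}
path  # 1.0 1.5 1.9
```

Each solve starts from the previous solution, so neighbouring lambda values, whose solutions are close, need few iterations.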

Value

for gclm: a list with the result of the optimization

for gclm.path: a list of the same length as lambdas with the results of the optimization for the different lambda values

Examples

set.seed(1)  # for reproducibility
x <- matrix(rnorm(50 * 20), ncol = 20)
S <- cov(x)

## l1 penalized log-likelihood
res <- gclm(S, eps = 0, lambda = 0.1, lambdac = 0.01)

## l1 penalized log-likelihood with fixed C
res <- gclm(S, eps = 0, lambda = 0.1, lambdac = -1)

## l1 penalized frobenius loss
res <- gclm(S, eps = 0, lambda = 0.1, loss = "frobenius")

gherardovarando/clggm documentation built on April 17, 2023, 10:04 a.m.