DWLasso: Degree weighted lasso


View source: R/DWLasso.R

Description

Infers undirected networks with hubs using a weighted nodewise regression approach. The method has two penalty parameters that control hub sparsity and overall sparsity, respectively.

Usage

DWLasso(X, lambda1 = 0.4, lambda2 = 2, a = 1, tol = 1e-05)

Arguments

X

An input matrix. The columns represent variables and the rows indicate observations.

lambda1

A penalty parameter that controls the degree sparsity of the inferred network.

lambda2

A penalty parameter that controls the overall sparsity of the inferred network.

a

A parameter of the update equation that controls the convergence of the weights.

tol

Convergence tolerance for the iterative weight updates.

Details

This function implements the weighted degree lasso of Sulaimanov et al., using the coordinate descent algorithm implemented in the glmnet package. The method is based on the weighted nodewise regression approach and infers large undirected networks with hubs in an iterative manner in the setting with more variables than samples (p > n). Given p variables, the network is inferred by regressing each variable against the remaining (p-1) variables. The penalty parameter lambda1 controls the degree sparsity of the network, whereas the penalty parameter lambda2 controls the overall sparsity. The method uses the fast Lasso solver glmnet (Friedman et al. (2010)) with default settings.
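
For intuition, here is a minimal sketch of a single unweighted nodewise regression step with glmnet, i.e., regressing one column of a standardized data matrix on the remaining columns and reading the neighborhood off the nonzero coefficients. The degree-based penalty weighting that DWLasso adds on top is omitted, and the function and variable names (nodewise_neighbors, X, j, lam) are purely illustrative.

library(glmnet)

# Illustrative nodewise regression for one node (not the full DWLasso update):
#   X   - standardized n x p data matrix
#   j   - index of the node regressed on the others
#   lam - lasso penalty, playing the role of the overall-sparsity parameter
nodewise_neighbors <- function(X, j, lam) {
  fit  <- glmnet(X[, -j, drop = FALSE], X[, j], lambda = lam)
  beta <- as.matrix(coef(fit, s = lam))[-1, 1]  # drop the intercept
  which(beta != 0)                              # indices refer to columns of X[, -j]
}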

Value

mat

The estimated matrix corresponding to the inferred network. The diagonal elements of the matrix are zero.

weights

The estimated weights used to estimate the network. These weights are computed from the node degrees of the estimated network.

lambda1

The value of the penalty parameter controlling degree sparsity of the inferred network.

lambda2

The value of the penalty parameter controlling the overall sparsity of the inferred network.
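
As a quick usage note, here is a sketch of one way the returned matrix might be inspected for hubs: threshold it to an adjacency matrix, symmetrize the nodewise estimates, and rank nodes by degree. The name res stands in for the list returned by DWLasso, and the zero threshold and "or" symmetrization rule are illustrative choices rather than part of the package.

# res is assumed to be the list returned by DWLasso (see Examples)
A   <- 1 * (res$mat != 0)          # treat nonzero entries as edges (illustrative)
A   <- 1 * ((A + t(A)) > 0)        # symmetrize with an "or" rule
deg <- rowSums(A)                  # node degrees
head(sort(deg, decreasing = TRUE)) # nodes with the largest degrees are hub candidates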

Author(s)

Nurgazy Sulaimanov, Sunil Kumar, Frederic Burdet, Mark Ibberson, Marco Pagni, Heinz Koeppl.

Maintainer: Nurgazy Sulaimanov, nurgazy.sulaimanov@bcs.tu-darmstadt.de

References

1. Nurgazy Sulaimanov, Sunil Kumar, Frederic Burdet, Mark Ibberson, Marco Pagni, Heinz Koeppl. Inferring hub networks using weighted degree Lasso. http://arxiv.org/abs/1710.01912.

2. Jerome Friedman, Trevor Hastie, Robert Tibshirani (2010). Regularization Paths for Generalized Linear Models via Coordinate Descent. Journal of Statistical Software, 33(1), 1-22. URL http://www.jstatsoft.org/v33/i01/.

3. Tan, K. M., London, P., Mohan, K., Lee, S.-I., Fazel, M., and Witten, D. (2014). Learning graphical models with hubs. Journal of Machine Learning Research, 15(1): 3297-3331.

4. Meinshausen, Nicolai, and Peter Bühlmann (2006). High-dimensional graphs and variable selection with the lasso. The Annals of Statistics: 1436-1462.

Examples

library(DWLasso)
library(glmnet)
library(hglasso)


# Generate inverse covariance matrix with 3 hubs
# 20 % of the elements within a hub are zero
# 97 % of the elements that are not within hub nodes are zero
p <- 60 # Number of variables
n <- 40 # Number of samples

hub_number <- 3 # Number of hubs

# Generate the adjacency matrix
Theta <- HubNetwork(p,0.97,hub_number,0.2)$Theta

# Generate a data matrix
out <- rmvnorm(n,rep(0,p),solve(Theta))

# Standardize the data
dat <- scale(out)

# Run DWLasso
out.p <- DWLasso(dat, lambda1 = 0.6, lambda2 = 10)

# print out a summary of the output
summary(out.p)

Example output

Loading required package: Matrix
Loading required package: foreach
Loaded glmnet 2.0-16

Loading required package: glasso
Loading required package: mvtnorm
Loading required package: igraph

Attaching package: 'igraph'

The following objects are masked from 'package:stats':

    decompose, spectrum

The following object is masked from 'package:base':

    union

        Length Class  Mode   
mat     3600   -none- numeric
weights   60   -none- numeric
lambda1    1   -none- numeric
lambda2    1   -none- numeric
