GGMPF: GGM-based heterogeneity analysis.

View source: R/GGMPF.R

GGMPF R Documentation

GGM-based heterogeneity analysis.

Description

The main function of Gaussian graphical model-based heterogeneity analysis via penalized fusion.

Usage

GGMPF(lambda, data, K, initial.selection = "K-means", initialize, average = F,
      asymmetric = T, eps = 5e-2, maxiter = 10, maxiter.AMA = 5,
      local_appro = T, trace = F, penalty = "MCP", theta.fusion = T)

Arguments

lambda

A list containing the sequences of the tuning parameters lambda1, lambda2, and lambda3 (see the sketch after this list).

data

n * p matrix, the design matrix.

K

Int, a selected upper bound of K_0 (the true number of subgroups).

initial.selection

The clustering method used to generate initial values, which can be selected from c("K-means","dbscan").

initialize

User-supplied initial values, which must be provided when initial.selection is not in c("K-means","dbscan").

average

Logical; whether to average the parameters when merging subgroups that are identified as identical. The default is F, in which case the estimated parameters of the subgroup with the largest sample size among those identified as identical are used as the final parameters for the merged subgroup.

asymmetric

Logical; whether the precision matrices are allowed to be asymmetric during estimation. The default is T.

eps

A float value, algorithm termination threshold.

maxiter

Int, maximum number of cycles of the ADMM algorithm.

maxiter.AMA

Int, maximum number of cycles of the AMA algorithm.

local_appro

Logical; whether to use a local approximation when updating the mean parameters. The default is T.

trace

Logical; whether to print the number of identified subgroups during the search over the tuning parameters.

penalty

The type of the penalty, which can be selected from c("MCP", "SCAD", "lasso").

theta.fusion

Logical; whether the fusion penalty term includes the elements of the precision matrices. The default is T.
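
The tuning-parameter list lambda is usually built with genelambda.obo(), as in the Examples below. The following is a minimal sketch with a deliberately coarse grid; the grid sizes and bounds are illustrative assumptions, not recommended defaults.

# Illustrative, coarse tuning-parameter grid (sizes and bounds are assumptions)
lambda <- genelambda.obo(nlambda1 = 3, lambda1_max = 0.5, lambda1_min = 0.1,
                         nlambda2 = 5, lambda2_max = 1.5, lambda2_min = 0.1,
                         nlambda3 = 3, lambda3_max = 3.5, lambda3_min = 0.5)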

Value

A list containing all estimated parameters, the BIC values for all choices of the given tuning parameters, and the selected optimal parameters.

Author(s)

Mingyang Ren, Sanguo Zhang, Qingzhao Zhang, Shuangge Ma. Maintainer: Mingyang Ren <renmingyang17@mails.ucas.ac.cn>.

References

Ren, M., Zhang, S., Zhang, Q., and Ma, S. (2022). Gaussian Graphical Model-based Heterogeneity Analysis via Penalized Fusion. Biometrics.

Examples


######## Example 1: Generate simulation data and apply the method #######
n <- 200              # The sample size of each subgroup
p <- 20               # The dimension of the precision matrix
K0 <- 3               # The true number of subgroups
N <- rep(n,K0)        # The sample sizes of K0 subgroups
K <- 6                # The given upper bound of K0.

################ The true parameters ################
mue <- 1.5
nonnum <- 4
mu01 <- c(rep(mue,nonnum),rep(-mue,nonnum),rep(0,p-2*nonnum))
mu02 <- c(rep(mue,2*nonnum),rep(0,p-2*nonnum))
mu03 <- c(rep(-mue,2*nonnum),rep(0,p-2*nonnum))

# Power law network
set.seed(2)
A.list <- Power.law.network(p,s=5,I2=c(1),I3=c(2))
Theta01 <- A.list$A1
Theta02 <- A.list$A2
Theta03 <- A.list$A3
sigma01 <- solve(Theta01)
sigma02 <- solve(Theta02)
sigma03 <- solve(Theta03)
Mu0.list <- list(mu01,mu02,mu03)
Sigma0.list <- list(sigma01,sigma02,sigma03)
Theta0.list <- list(Theta01,Theta02,Theta03)

################ Generating simulated data ################
whole.data <- generate.data(N,Mu0.list,Theta0.list,Sigma0.list)

################ The implementation and evaluation ################
lambda <- genelambda.obo(nlambda1=5,lambda1_max=0.5,lambda1_min=0.1,
                         nlambda2=15,lambda2_max=1.5,lambda2_min=0.1,
                         nlambda3=10,lambda3_max=3.5,lambda3_min=0.5)
res <- GGMPF(lambda, whole.data$data, K, initial.selection="K-means")
Theta_hat.list <- res$Theta_hat.list
Mu_hat.list <- res$Mu_hat.list
prob.list <- res$prob.list
L.mat.list <- res$L.mat.list
opt_num <- res$Opt_num
opt_Theta_hat <- Theta_hat.list[[opt_num]]
opt_Mu_hat <- Mu_hat.list[[opt_num]]
opt_L.mat <- L.mat.list[[opt_num]]
opt_prob <- prob.list[[opt_num]]
K_hat <- dim(opt_Theta_hat)[3]
K_hat
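
################ A hedged evaluation sketch (not part of the package) ################
# Match each estimated subgroup to the closest true precision matrix by
# Frobenius-norm distance; this matching rule is an illustrative assumption.
frob <- function(A, B) sqrt(sum((A - B)^2))
for (k in seq_len(K_hat)) {
  dists <- sapply(Theta0.list, function(Th0) frob(opt_Theta_hat[, , k], Th0))
  cat("Estimated subgroup", k, "is closest to true subgroup", which.min(dists),
      "with Frobenius error", round(min(dists), 3), "\n")
}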


######## Example 2: Load the built-in simulated data set and analyze it #######
data(example.data)
K <- 6
lambda <- genelambda.obo(nlambda1=5,lambda1_max=0.5,lambda1_min=0.1,
                         nlambda2=15,lambda2_max=1.5,lambda2_min=0.1,
                         nlambda3=10,lambda3_max=3.5,lambda3_min=0.5)
res <- GGMPF(lambda, example.data$data, K, initial.selection="K-means")
Theta_hat.list <- res$Theta_hat.list
opt_num <- res$Opt_num
opt_Theta_hat <- Theta_hat.list[[opt_num]]
K_hat <- dim(opt_Theta_hat)[3]
K_hat
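
# A further illustrative step, assuming an edge corresponds to a nonzero
# off-diagonal entry of the (symmetric) estimated precision matrix:
# convert subgroup 1's precision matrix into an adjacency matrix.
Theta1 <- opt_Theta_hat[, , 1]
adj1 <- (abs(Theta1) > 1e-3) * 1   # 1e-3 is an arbitrary thresholding choice
diag(adj1) <- 0
sum(adj1) / 2                      # number of estimated edges in subgroup 1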


