mvlsfit: Least Squares Estimates for Mean and Covariance in Multivariate Longitudinal Data


View source: R/mvlsfit.R

Description

mvlsfit returns least squares estimates for the mean and covariance using an iterative process.
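
Broadly, such a scheme alternates between a generalized least squares update of the mean parameters given the current covariance estimate and a least squares refit of the covariance from the resulting residuals, repeating until the parameters stabilize. The sketch below illustrates only this alternation on simulated data; it is not the package's internal code, it uses an unstructured covariance in place of the regressogram and log innovariogram models, and every object in it is illustrative.

# Minimal sketch of an alternating least squares scheme (illustrative only)
set.seed(1)
n <- 50; nT <- 5
X <- cbind(1, 1:nT)                        # simple time design for one outcome
Y <- matrix(rnorm(n * nT), n, nT)          # toy repeated measurements
beta <- rep(0, ncol(X)); Sigma <- diag(nT)
for (iter in 1:100) {
  W <- solve(Sigma)
  beta.new <- solve(t(X) %*% W %*% X, t(X) %*% W %*% colMeans(Y))  # GLS mean step
  res <- sweep(Y, 2, as.vector(X %*% beta.new))                    # residuals
  Sigma <- crossprod(res) / n                                      # covariance step
  if (max(abs(beta.new - beta)) < 0.001) break                     # convergence check
  beta <- beta.new
}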

Usage

mvlsfit(data, time, j, p, r, s, lags, X.mat, U, V, beta.init, gamma.init,
        alpha.init, beta = rep(10, length(beta.init)), tol = 0.001, Nmax = 100)

Arguments

data

a data frame (or matrix) with n rows for subjects and T columns for the repeated measurements.

time

a vector with T equally or unequally spaced time points.

j

a positive integer for the number of outcomes.

p

a positive integer for the order of the time polynomial used for the mean trend, or the number of covariates.

r

a positive integer for the order of the time-lag polynomial used to construct the design matrix for the regressogram.

s

a positive integer for the order of the time polynomial used to construct the design matrix for the innovariogram.

lags

a vector of time lags corresponding to all pairs of time points.

X.mat

design matrix for modeling the mean (a construction sketch for X.mat, U, V, and lags follows the argument list).

U

design matrix for modeling the regressogram elements.

V

design matrix for modeling the log innovariogram elements.

beta.init

a vector with initial values for the mean parameter.

gamma.init

a matrix with rj rows and j columns containing initial estimates of the parameters in the regressogram models.

alpha.init

a vector with initial estimates for parameters in the log innovariogram models.

beta

a vector with initial values for the mean parameters used to initialize the algorithm; the default is a vector of zeros.

tol

a real number specifying the tolerance used to decide convergence of the mean and covariance parameters. The default is 0.001.

Nmax

a positive integer indicating the maximum number of iterations allowed if the algorithm does not converge sooner. The default is 500.
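
As noted under X.mat, the design matrices and the lag vector are usually built from the time points before calling mvlsfit. The condensed sketch below follows the Examples section; mvrmat supplies U and V, and the dataset-specific column selection applied to X.mat in the Examples is omitted.

time <- c(0, 2, 4, 6, 8, 18, 24, 32, 48, 72)
t <- length(time); j <- 4; p <- 4; r <- 3; s <- 2
# mean design: orthogonal time polynomials, one block per outcome
X <- cbind(rep(1, t), poly(time, degree = p))
X.mat <- do.call(rbind, lapply(1:t, function(i) kronecker(diag(j), t(X[i, ]))))
# regressogram (U) and log innovariogram (V) design matrices
design_mat <- mvrmat(time, j, r, s)
# time lags between all pairs of time points
lags <- unlist(lapply(1:(t - 1), function(k) diff(time, lag = k)))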

Value

A list with the estimates for the mean and covariance parameters in matrix form, the estimated mean vector and covariance matrix, and the number of iterations used. Specifically, the component output holds the parameter estimates in matrix form, mean the estimated mean vector, and sigma the estimated covariance matrix (see the Examples).
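
For example, with the fit from the Examples section stored in results:

par.est <- results$output   # parameter estimates in matrix form
mean.est <- results$mean    # estimated mean vector
cov.est <- results$sigma    # estimated covariance matrix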

References

Kohli, P., Garcia, T., and Pourahmadi, M. (2016). Modeling the Cholesky Factors of Covariance Matrices of Multivariate Longitudinal Data. Journal of Multivariate Analysis, 145, 87-100.

Kohli, P., Du, X., and Shen, H. (2020+). Multivariate Longitudinal Graphical Models (MLGM): An R Package for Visualizing and Modeling Mean and Dependence Patterns in Multivariate Longitudinal Data. Submitted.

Examples

library(MLGM)  # provides the Tcells data and the mvrmat, polyfit, mvr, and mvlsfit functions
data(Tcells)
time <- c(0, 2, 4, 6, 8, 18, 24, 32, 48, 72)
t <- length(time)
j <- 4   # number of outcomes (genes)
n <- 44  # number of subjects
p <- 4   # order of the time polynomial for the mean
r <- 3   # order of the time-lag polynomial for the regressogram
s <- 2   # order of the time polynomial for the log innovariogram
# design matrix for the mean: orthogonal time polynomials, one block per outcome
X <- poly(time, degree = p)
X <- cbind(rep(1, t), X)
X.mat <- matrix(0, nrow = 1, ncol = (p + 1) * j)
for (i in 1:t) {
  X.mat <- rbind(X.mat, kronecker(diag(j), t(X[i, ])))
}
X.mat <- X.mat[-1, ]
# drop the degree-4 columns for the two genes whose mean is modeled with a cubic
X.mat <- X.mat[, -c(15, 20)]
# design matrices U (regressogram) and V (log innovariogram)
design_mat <- mvrmat(time, j, r, s)
gene.names <- c("FYB", "CD69", "IL2RG", "CDC2")
degree.mean <- c(4, 4, 3, 3)
# initial mean parameters: polynomial fits to each gene's average profile over time
beta.init <- 0
for (i in 1:j) {
  data.gene <- Tcells[, seq(i, ncol(Tcells), j)]
  mean.gene <- apply(data.gene, 2, mean)
  model <- polyfit(mean.gene, time, degree = degree.mean[i], plot = FALSE, title = "",
                   xlabel = "Time points", ylabel = "Expression Response", indicator = FALSE,
                   intercept = TRUE, pch.plot = 19, lwd.fit = 2, lty.fit = 2)
  nc <- degree.mean[i] + 1
  beta.init <- c(beta.init, model$output$coefficients[1:nc])
}
beta.init <- beta.init[-1]
# sample regressograms and log innovariograms used to initialize the covariance model
MVR <- mvr(Tcells, time, j, n, inno = FALSE, inverse = FALSE, loginno = TRUE, plot = FALSE,
           pch.plot = 19, par1.r = 4, par2.r = 4, par1.d = 4, par2.d = 4)
j2 <- j^2
# time lags between all pairs of time points
lags <- 0
for (i in 1:(t - 1)) {
  time.lags <- diff(time, i)
  lags <- c(lags, time.lags)
}
lags <- lags[-1]
# initial regressogram parameters: polynomial fits in the lags for each of the j^2 elements
gamma.init <- matrix(0, nrow = j2, ncol = r + 1)
# par(mfrow = c(4, 4))
for (i in 1:j2) {
  data.phi <- MVR$Phi.plot[[i]]
  best.model <- polyfit(data.phi, lags, degree = r, plot = FALSE, title = gene.names[i],
                        xlabel = "Time points", ylabel = "Expression Response",
                        indicator = FALSE, index = 1, intercept = TRUE)
  gamma.init[i, ] <- best.model$output$coefficients[1:(r + 1)]
  # to see the initial fits:
  # best.model <- polyfit(data.phi, lags, degree = r, plot = TRUE, title = gene.names[i],
  #                       xlabel = "Time points", ylabel = "Expression Response",
  #                       indicator = FALSE, index = 1, intercept = TRUE)
  # legend("bottomright", legend = c("observed", "fitted"), col = c("black", "red"),
  #        lty = c(1, 2), lwd = c(2, 2), cex = 0.7)
}
# rearrange the initial regressogram estimates into a ((r + 1) * j) x j matrix
gamma.mat <- matrix(0, (r + 1) * j, j)
for (i in 1:j) {
  row <- (i - 1) * j + 1
  col <- (i - 1) * (r + 1) + 1
  gamma.mat[col:(i * (r + 1)), 1:j] <- t(gamma.init[row:(i * j), 1:(r + 1)])
}
# initial log innovariogram parameters: polynomial fits in time for each of the j(j+1)/2 elements
# par(mfrow = c(4, 3))
j3 <- j * (j + 1) / 2
alpha.init <- 0
for (i in 1:j3) {
  data.logD <- MVR$logD.elements[i, ]
  model.best <- polyfit(data.logD, time, degree = s, plot = FALSE, title = "",
                        xlabel = "Time points", ylabel = "Expression Response",
                        indicator = FALSE, index = 4, intercept = TRUE)
  alpha.init <- c(alpha.init, model.best$output$coefficients[1:(s + 1)])
  # to see the initial fits:
  # model.best <- polyfit(data.logD, time, degree = s, plot = TRUE, title = "",
  #                       xlabel = "Time points", ylabel = "Expression Response",
  #                       indicator = FALSE, index = 4, intercept = TRUE)
  # legend("topright", legend = c("observed", "fitted"), col = c("black", "red"),
  #        lty = c(1, 2), lwd = c(2, 2), cex = 0.8)
}
alpha.init <- alpha.init[-1]
# iterative least squares fit for the mean and covariance parameters
results <- mvlsfit(Tcells, time, j, p, r, s, lags, X.mat, design_mat$U, design_mat$V,
                   beta.init, gamma.init, alpha.init,
                   beta = matrix(0, nrow = length(beta.init), ncol = 1), tol = 0.001, Nmax = 500)
par.est <- results$output   # parameter estimates in matrix form
mean.est <- results$mean    # estimated mean vector
cov.est <- results$sigma    # estimated covariance matrix
