em.semi: EM algorithm to compute maximum likelihood estimate of...


Description

EM algorithm to compute the maximum likelihood estimate of Gaussian hidden semi-Markov models, with or without autoregressive structures, and with or without regularization on the covariance matrices and/or the autoregressive coefficients.

Usage

em.semi(y, mod, ntimes = NULL, tol = 1e-04, maxit = 100, arp = 0,
  cov.shrink = 0, auto.lambda = 0, auto.alpha = 0, print = TRUE)

Arguments

y

observed series

mod

list consisting of at least the following items: mod$m = scalar number of states; mod$delta = vector of initial values for the prior probabilities; mod$gamma = matrix of initial values for the state transition probabilities; mod$mu = list of initial values for the means; mod$sigma = list of initial values for the covariance matrices; mod$d = list of state duration probabilities. For autoregressive hidden semi-Markov models, the following additional items are also needed: mod$arp = scalar order of the autoregressive structure; mod$auto = list of initial values for the autoregressive coefficient matrices.

ntimes

lengths of the individual homogeneous time series. Defaults to NULL, which indicates a single homogeneous time series.
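For example, if y stacks several independent series row-wise (the row-wise stacking convention here is an assumption for illustration, not stated in this help text), ntimes would hold their lengths:

```r
# Hypothetical illustration: two independent series stacked row-wise in y.
y1 <- matrix(rnorm(100 * 3), nrow = 100, ncol = 3)  # series 1: 100 time points, 3 variables
y2 <- matrix(rnorm(150 * 3), nrow = 150, ncol = 3)  # series 2: 150 time points, 3 variables
y  <- rbind(y1, y2)
ntimes <- c(100, 150)                 # lengths of the individual series
stopifnot(sum(ntimes) == nrow(y))     # ntimes must partition the rows of y
```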

tol

tolerance for the relative change. Defaults to 1e-4.

maxit

maximum number of iterations. Defaults to 100.

arp

order of the autoregressive structure. Defaults to 0.

cov.shrink

shrinkage on the multivariate normal covariance matrix. Defaults to 0. See references.
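The References point to Ledoit and Wolf for this regularization. A minimal sketch of that style of shrinkage, assuming a convex combination of the sample covariance with a scaled identity target (the exact formula used internally by em.semi may differ):

```r
# Sketch of Ledoit-Wolf-style covariance shrinkage: pull the sample
# covariance S toward a scaled identity target. `lambda` plays the role
# of cov.shrink here; this is an illustrative assumption, not the
# package's verified internal formula.
shrink_cov <- function(S, lambda) {
  target <- diag(mean(diag(S)), nrow(S))  # identity scaled by the average variance
  (1 - lambda) * S + lambda * target
}
S     <- cov(matrix(rnorm(20 * 5), 20, 5))
S_reg <- shrink_cov(S, 0.1)
# shrinkage keeps the matrix symmetric and well-conditioned
stopifnot(isSymmetric(S_reg), min(eigen(S_reg)$values) > 0)
```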

auto.lambda

elastic net shrinkage on the autoregressive coefficients. Defaults to 0. See references.

auto.alpha

The elastic net mixing parameter, with 0 <= alpha <= 1. The penalty is defined as (1 - alpha)/2 ||.||_2^2 + alpha ||.||_1; alpha = 1 gives the lasso penalty, and alpha = 0 the ridge penalty. Defaults to 0. Same as in the glmnet package.
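The mixing can be illustrated directly. A small sketch (the helper name is hypothetical) that evaluates the penalty above for a coefficient vector:

```r
# Hypothetical helper computing the elastic net penalty
# (1 - alpha)/2 * ||b||_2^2 + alpha * ||b||_1 for a coefficient vector b.
enet_penalty <- function(b, alpha) {
  (1 - alpha) / 2 * sum(b^2) + alpha * sum(abs(b))
}
b <- c(1, -2)
enet_penalty(b, 1)    # lasso: |1| + |-2| = 3
enet_penalty(b, 0)    # ridge: (1 + 4) / 2 = 2.5
enet_penalty(b, 0.5)  # mix:   0.25 * 5 + 0.5 * 3 = 2.75
```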

print

whether to print progress during fitting. Defaults to TRUE.

Value

A list containing the fitted parameters.

References

Rabiner, Lawrence R. "A tutorial on hidden Markov models and selected applications in speech recognition." Proceedings of the IEEE 77.2 (1989): 257-286.

Zou, Hui, and Trevor Hastie. "Regularization and variable selection via the elastic net." Journal of the Royal Statistical Society: Series B (Statistical Methodology) 67.2 (2005): 301-320.

Ledoit, Olivier, and Michael Wolf. "A well-conditioned estimator for large-dimensional covariance matrices." Journal of Multivariate Analysis 88.2 (2004): 365-411.

Examples

## Not run: 
set.seed(332213)
data(finance)
x <- data.matrix(finance)
#log return
y <- x[-1,-51]
for(i in 2:nrow(x)){
 y[i-1,] <- log(x[i,-51]) - log(x[i-1,-51])
}
#annualize the log return
y <- y * 252 

#first, fit a Gaussian HSMM without autoregressive structure
m <- 2
#initialize the list of means
mu <- list(apply(y,2,mean), apply(y,2,mean))
#initialize the list of covariance matrices
sigma <- list(cov(y)*1.2,cov(y)*0.8)
#initialize the prior probability
delta <- c(0.5,0.5)
#initialize the transition probabilities
gamma <- matrix(c(0,1,1,0),2,2,byrow=TRUE)
#initialize the state duration probabilities
d <- list(rep(0.1,10),rep(0.1,10))
mod1 <- list(m=m,mu=mu,sigma=sigma,delta=delta,gamma=gamma,d=d)
#will not run without a shrinkage on the covariance matrices because the 
#series is not long enough to reliably estimate the covariance structure
fit1 <- em.semi(y=y,mod=mod1,cov.shrink=0.0001)
st1 <- viterbi.semi(y=y,mod=fit1)
sp1 <- smooth.semi(y=y,mod=fit1)

#second, fit a Gaussian HSMM with 1st order autoregressive structure
auto <- list(matrix(0, 50, 50),
             matrix(0, 50, 50))
mod2 <- list(m=m,mu=mu,sigma=sigma,delta=delta,gamma=gamma,auto=auto,
             d=d,arp=1)
#increase auto.lambda to enforce stronger regularization for model to run
fit2 <- em.semi(y=y,mod=mod2,cov.shrink=0.001,arp=1,
               auto.alpha=0.8,auto.lambda=10)
sum(fit2$auto[[1]]==0)
sum(fit2$auto[[2]]==0)
st2 <- viterbi.semi(y=y,mod=fit2)
sp2 <- smooth.semi(y=y,mod=fit2)

#third, fit a Gaussian HSMM with 2nd order autoregressive structure
auto <- list(matrix(0, 50, 100),
             matrix(0, 50, 100))
mod3 <- list(m=m,mu=mu,sigma=sigma,delta=delta,gamma=gamma,auto=auto,
             d=d,arp=2)
#increase auto.lambda to enforce stronger regularization for model to run
fit3 <- em.semi(y=y,mod=mod3,ntimes=NULL,cov.shrink=0.0001,arp=2,
               auto.alpha=0.8,auto.lambda=30)
sum(fit3$auto[[1]]==0)
sum(fit3$auto[[2]]==0)
st3 <- viterbi.semi(y=y,mod=fit3)
sp3 <- smooth.semi(y=y,mod=fit3)

## End(Not run)

rarhsmm documentation built on May 2, 2019, 9:33 a.m.
