# viterbi.semi: Viterbi algorithm to decode the latent states for Gaussian... In rarhsmm: Regularized Autoregressive Hidden Semi Markov Model

## Description

Viterbi algorithm to decode the latent states for a Gaussian hidden semi-Markov model with or without autoregressive structure.

## Usage

```r
viterbi.semi(y, mod)
```

## Arguments

 `y` observed series

 `mod` a list consisting of at least the following items: `mod$m` = scalar number of states; `mod$delta` = vector of initial values for prior probabilities; `mod$gamma` = matrix of initial values for state transition probabilities; `mod$mu` = list of initial values for means; `mod$sigma` = list of initial values for covariance matrices; `mod$d` = list of state duration probabilities. For autoregressive hidden semi-Markov models, the following additional items are also required: `mod$arp` = scalar order of the autoregressive structure; `mod$auto` = list of initial values for autoregressive coefficient matrices.
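As a minimal sketch of how these pieces fit together, the following builds a `mod` list for a 2-state, 1-dimensional Gaussian HSMM without autoregressive structure. All numeric values here are illustrative, not defaults from the package:

```r
# Illustrative mod list for a 2-state, 1-dimensional Gaussian HSMM
# (no autoregressive structure; all values are made up for demonstration)
mod <- list(
  m     = 2,                                   # number of states
  delta = c(0.5, 0.5),                         # prior state probabilities
  gamma = matrix(c(0, 1,
                   1, 0), 2, 2, byrow = TRUE), # transition probabilities
                                               # (zero diagonal: self-transitions
                                               # are handled by the durations)
  mu    = list(1, -1),                         # per-state means
  sigma = list(matrix(1), matrix(1)),          # per-state covariance matrices
  d     = list(c(0.6, 0.4), c(0.7, 0.3))      # per-state duration probabilities
)

# Basic sanity checks on the structure
stopifnot(length(mod$mu) == mod$m)
stopifnot(all(rowSums(mod$gamma) == 1))
```

For an autoregressive model, `mod$arp` and `mod$auto` would be added as well, as in the Examples section below.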

## Value

a list containing the decoded states

## References

Rabiner, Lawrence R. "A tutorial on hidden Markov models and selected applications in speech recognition." Proceedings of the IEEE 77.2 (1989): 257-286.

## Examples

```r
set.seed(135)
m <- 2
mu <- list(c(3, 4, 5), c(-2, -3, -4))
sigma <- list(diag(1.3, 3),
              matrix(c(1, -0.3, 0.2,
                       -0.3, 1.5, 0.3,
                       0.2, 0.3, 2), 3, 3, byrow = TRUE))
delta <- c(0.5, 0.5)
gamma <- matrix(c(0, 1, 1, 0), 2, 2, byrow = TRUE)
auto <- list(matrix(c(0.3, 0.2, 0.1, 0.4, 0.3, 0.2,
                      -0.3, -0.2, -0.1, 0.3, 0.2, 0.1,
                      0, 0, 0, 0, 0, 0), 3, 6, byrow = TRUE),
             matrix(c(0.2, 0, 0, 0.4, 0, 0,
                      0, 0.2, 0, 0, 0.4, 0,
                      0, 0, 0.2, 0, 0, 0.4), 3, 6, byrow = TRUE))
d <- list(c(0.5, 0.3, 0.2), c(0.6, 0.4))
mod <- list(m = m, mu = mu, sigma = sigma, delta = delta, gamma = gamma,
            auto = auto, arp = 2, d = d)
sim <- hsmm.sim(2000, mod)
y <- sim$series
state <- sim$state
fit <- em.semi(y = y, mod = mod, arp = 2)
state_est <- viterbi.semi(y = y, mod = fit)
sum(state_est != state)
```

rarhsmm documentation built on May 2, 2019, 9:33 a.m.