## Description

Uses a combination of Viterbi training and Baum-Welch algorithm to estimate parameters for a hidden Markov model.

## Usage

`viterbiEM(hmm, data, max.iter, eps, verbose, ...)`

## Arguments

`hmm`: Object of class `hmm`.

`data`: A list of observation sequences.

`max.iter`: Maximum number of iterations (see Details).

`eps`: Minimum change in log-likelihood between iterations (see Details).

`...`: Additional arguments to be passed to `viterbiTraining` and `baumWelch` (see Details).

`verbose`: Level of verbosity. Higher numbers produce more status messages.

## Details

The arguments `max.iter` and `eps` can have either one or two elements. In the latter case the first element is used for `viterbiTraining` and the second one for `baumWelch`.

Additional arguments can be passed to `viterbiTraining` and `baumWelch` individually by using arguments of the form `viterbi = list(a = a.value)` and `baumWelch = list(b = b.value)` respectively. All other arguments are passed on to both functions.
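As a sketch of these conventions, the call below runs five Viterbi-training iterations followed by up to twenty Baum-Welch iterations, with a looser convergence threshold for the Viterbi stage. It assumes the `hmm2` and `obs.lst` objects constructed in the Examples section; the argument `b = 0.5` passed through `baumWelch = list(...)` is purely hypothetical and stands in for any argument that `baumWelch` accepts.

```r
## Sketch only: `hmm2` and `obs.lst` come from the Examples section;
## `b = 0.5` is a hypothetical baumWelch-only argument.
hmm2.fit <- viterbiEM(hmm2, obs.lst,
                      max.iter = c(5, 20),    # 5 Viterbi, then 20 Baum-Welch
                      eps = c(0.1, 0.001),    # per-stage convergence thresholds
                      baumWelch = list(b = 0.5))
```

Any argument given outside the `viterbi`/`baumWelch` lists (such as `df` in the Examples) is forwarded to both functions.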

## Value

An object of class `hmm` with optimised parameter estimates.

## Author(s)

Peter Humburg

## References

Humburg, P., Bulger, D. and Stone, G. Parameter estimation for robust HMM analysis of ChIP-chip data. BMC Bioinformatics 2008, 9:343.

## See Also

`baumWelch`, `viterbiTraining`, `hmm.setup`

## Examples

```
## create two state HMM with t distributions
state.names <- c("one","two")
transition <- c(0.035, 0.01)
location <- c(1, 2)
scale <- c(1, 1)
df <- c(4, 6)
hmm1 <- getHMM(list(a=transition, mu=location, sigma=scale, nu=df),
               state.names)
## generate observation sequences from model
obs.lst <- list()
for(i in 1:50) obs.lst[[i]] <- sampleSeq(hmm1, 100)
## fit an HMM to the data (with fixed degrees of freedom)
hmm2 <- hmm.setup(obs.lst, state=c("one","two"), df=5)
hmm2.fit <- viterbiEM(hmm2, obs.lst, max.iter=c(5,15), verbose=2, df=5)
```
