Description

Fit a mixture model or a hidden Markov model to count data.
Arguments

counts: A matrix of non-negative integers. Columns represent datapoints and rows represent dimensions.
k: Either the desired number of clusters, or specific initial values for the models (mixture components or emission probabilities). See the init argument.
framework: Switches between a mixture model and a hidden Markov model. The default is a hidden Markov model, where the order of the datapoints matters.
mix_coeff: In the mixture model framework, the initial mixture coefficients.

trans: In the hidden Markov model framework, the initial transition probabilities.

initP: In the hidden Markov model framework, the initial probabilities of the first state of each sequence.
tol: Tolerance value used to determine convergence of the EM algorithm. The algorithm converges when the absolute difference in log-likelihood between two iterations falls below this value.
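The stopping rule driven by tol and maxiter can be sketched generically; the `em_step` callback and its return convention below are hypothetical illustrations, not part of this package.

```python
def run_em(em_step, params, tol=1e-4, maxiter=200):
    """Iterate a generic EM step until the log-likelihood stabilizes.

    em_step is a hypothetical callback performing one E-step + M-step pass
    and returning (updated_params, log_likelihood).
    """
    prev_ll = float("-inf")
    ll = prev_ll
    for _ in range(maxiter):
        params, ll = em_step(params)
        if abs(ll - prev_ll) < tol:   # the convergence test described above
            return params, ll, True   # converged
        prev_ll = ll
    return params, ll, False          # stopped at maxiter without converging
```

With maxiter set to 0 the loop body never runs, matching the documented behavior of skipping all training iterations.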
maxiter: Maximum number of iterations of the EM algorithm. Use 0 if you do not want to run any training iteration.
nthreads: Number of threads to use. The forward-backward step in HMM learning cannot use more threads than the number of sequences.
nbtype: Type of training for the negative binomial distribution.
init: Initialization scheme for the models (mixture components or emission probabilities).
init.nlev: Tuning parameter for the initialization schemes.
verbose: Print some output during execution.
seqlens: Length of each sequence of observations. The number of columns of the count matrix should equal sum(seqlens).
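The seqlens constraint can be illustrated with a small validation sketch; `check_seqlens` is a hypothetical helper, not part of the package.

```python
import numpy as np

def check_seqlens(counts, seqlens):
    """Validate that the per-sequence lengths cover every column (datapoint)."""
    if sum(seqlens) != counts.shape[1]:
        raise ValueError(
            f"sum(seqlens)={sum(seqlens)} must equal the number of "
            f"datapoints {counts.shape[1]}"
        )
    # split the columns into the individual observation sequences
    bounds = np.cumsum(seqlens)[:-1]
    return np.split(counts, bounds, axis=1)
```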
split4speed: Add artificial breaks to the sequences to speed up the forward-backward algorithm.
Value

A list with, among others, the following elements:
models: A list containing the parameters of each model (mixture components or emission probabilities). Each element of the list describes a negative multinomial distribution, given as a named list of its parameters.
loglik: The log-likelihood of the whole dataset.
posteriors: A matrix with one row per model and one column per datapoint, where entry (i, j) is the posterior probability that datapoint j was generated by model i.
states: An integer vector with one entry per datapoint, assigning each datapoint to the model with the highest posterior probability.
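The relationship between the posteriors matrix and the hard state assignments can be illustrated with made-up numbers (the values below are hypothetical, not output of the package):

```python
import numpy as np

# Hypothetical posteriors for k=3 models and 3 datapoints:
# one row per model, one column per datapoint, each column summing to 1.
posteriors = np.array([
    [0.7, 0.1, 0.2],
    [0.2, 0.8, 0.3],
    [0.1, 0.1, 0.5],
])

# For each datapoint, pick the model with the highest posterior
# probability (1-based indexing, as in R).
states = posteriors.argmax(axis=0) + 1
```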
converged: Whether the EM algorithm converged within maxiter iterations.
llhistory: Time series containing the log-likelihood of the whole dataset across iterations.
viterbi: In HMM mode, a list with the Viterbi path and its likelihood.
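The Viterbi decoding returned here can be sketched as a textbook dynamic program in log space; this is a generic illustration, not the package's implementation.

```python
import numpy as np

def viterbi(log_init, log_trans, log_emit):
    """Most probable state path for one sequence, computed in log space.

    log_init:  (k,)   log initial-state probabilities
    log_trans: (k, k) log transition probabilities, row = from-state
    log_emit:  (k, T) log emission probability of each observation
    """
    k, T = log_emit.shape
    delta = log_init + log_emit[:, 0]      # best log-prob ending in each state
    back = np.zeros((T, k), dtype=int)     # backpointers for path recovery
    for t in range(1, T):
        scores = delta[:, None] + log_trans    # (from-state, to-state)
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + log_emit[:, t]
    path = np.zeros(T, dtype=int)
    path[-1] = delta.argmax()
    for t in range(T - 1, 0, -1):          # trace the backpointers
        path[t - 1] = back[t, path[t]]
    return path, delta.max()               # the path and its log-likelihood
```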