Description
Estimation of the transition probabilities, the initial state probabilities, and the hidden state parameters of a Hidden Markov Model, using either direct maximisation of the likelihood or the Baum-Welch algorithm.
Usage

HMM(x, m, method, L1, L2, L3, L4, L5, iterations, DELTA, decoding)
Arguments

x: sample of a Hidden Markov Model
m: the number of states
method: choice between two estimation methods: "DM" (default) or "EM"
L1: likelihood of the first hidden state
L2: likelihood of the second hidden state
L3: optional; likelihood of the third hidden state
L4: optional; likelihood of the fourth hidden state
L5: optional; likelihood of the fifth hidden state
iterations: optional; number of iterations for the EM algorithm
DELTA: optional; stopping criterion for the EM algorithm
decoding: if set to TRUE, the function also returns the most probable state paths via local and global decoding
Details

This package estimates the hidden states of an HMM, given the likelihoods of each state. At least two likelihoods (L1, L2) must be supplied to the function, each depending on one unknown parameter Theta. See the examples for a suitable structure of the likelihoods.
The "method" argument selects the underlying estimation function. If "DM" is selected, the HMM function estimates the parameters by direct maximisation of the given likelihoods. If "EM" is selected, the HMM function uses the Baum-Welch algorithm to compute the different states and estimate the underlying parameters.
For a more detailed explanation we recommend Hidden Markov Models for Time Series by Walter Zucchini, Iain MacDonald and Roland Langrock.
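For orientation, both methods target the same quantity: in the notation of the cited book, the HMM likelihood of an observed series x_1, ..., x_T is (a sketch; delta is the initial distribution, Gamma the transition matrix, and the state-dependent densities are the supplied likelihoods L1, ..., Lm):

```latex
L_T = \delta \, P(x_1) \, \Gamma \, P(x_2) \, \Gamma \cdots \Gamma \, P(x_T) \, \mathbf{1}',
\qquad
P(x) = \operatorname{diag}\bigl(L_1(x,\theta_1), \ldots, L_m(x,\theta_m)\bigr).
```

Direct maximisation ("DM") optimises this product numerically over the parameters, while "EM" maximises it iteratively via the Baum-Welch forward-backward recursions.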
The underlying functions are HMM_EM for the EM algorithm and HMM_DM for the direct maximisation.
Value

Returns the Delta vector, the Gamma matrix and the Thetas of the likelihoods, rounded to three decimals. If "EM" is selected, the function also returns the number of iterations and the DELTA.
See Also

For Hidden Markov Models with multiple Thetas in their likelihood, please refer to multi_HMM.
Examples

################
#First: Generating the sample of the HMM with the following true values:
#transition matrix
gamma <- matrix(c(0.9, 0.05, 0.05,
                  0.1, 0.4,  0.5,
                  0.3, 0.5,  0.2), byrow = TRUE, nrow = 3)
#initial state probabilities
delta <- c(0.5, 0.3, 0.2)
#sample size
n <- 500
#number of states
m <- 3
#sampling from normal distribution with different mu's but same sigma's:
x <- c()
set.seed(100)
s1 <- rnorm(10000, 7, 1)
s2 <- rnorm(10000, 2, 1)
s3 <- rnorm(10000, 12, 1)
#initial state
random_number <- runif(1, 0, 1)
if (random_number < delta[1]) {
  x[1] <- sample(s1, 1, replace = FALSE)
  p <- 1
} else if (random_number < sum(delta[1:2])) {
  x[1] <- sample(s2, 1, replace = FALSE)
  p <- 2
} else {
  x[1] <- sample(s3, 1, replace = FALSE)
  p <- 3
}
#sample creation
for (i in 2:n) {
  random_number <- runif(1, 0, 1)
  if (random_number < gamma[p, 1]) {
    p <- 1
    x[i] <- sample(s1, 1, replace = FALSE)
  } else if (random_number < sum(gamma[p, 1:2])) {
    p <- 2
    x[i] <- sample(s2, 1, replace = FALSE)
  } else {
    p <- 3
    x[i] <- sample(s3, 1, replace = FALSE)
  }
}
#Display of the sample
hist(x)
################
#Second: Defining the likelihoods.
#likelihoods
#each likelihood is a standard normal density with unknown mean mu and unit
#variance (equivalent to dnorm(x, mean = mu, sd = 1))
L1 <- function(x, mu){
  p1 <- 1/sqrt(2*pi) * exp(-0.5*(x-mu)^2)
  return(p1)
}
L2 <- function(x, mu){
  p2 <- 1/sqrt(2*pi) * exp(-0.5*(x-mu)^2)
  return(p2)
}
L3 <- function(x, mu){
  p3 <- 1/sqrt(2*pi) * exp(-0.5*(x-mu)^2)
  return(p3)
}
################
#Third: Executing the two HMM functions
HMM(x = x, m = m, method = "EM", L1 = L1, L2 = L2, L3 = L3, decoding = TRUE)
HMM(x = x, m = m, method = "DM", L1 = L1, L2 = L2, L3 = L3, decoding = TRUE)