The package includes functions designed to analyse continuous observation processes with the Hidden Markov Model (HMM) approach, including the Baum-Welch and Viterbi algorithms and additional visualisation functions. The observations are assumed to follow a Gaussian distribution and to form a weakly stationary process. The package was created for the analysis of financial time series, but it can be applied to any continuous observation process.
Package: HMMCont
Type: Package
Version: 1.0
Date: 2014-02-11
License: GPL-3
A Hidden Markov Model (HMM) is a statistical model in which the process being modelled is assumed to be a Markov process with unobserved (i.e. hidden) states. This unobserved Markov process can be revealed from an observable process that depends on the states of the underlying Markov process. The HMMCont package compiles functions that analyse continuous observable processes (i.e. continuous in space, discrete in time) and identify the underlying two-state Markov processes. The observable process should be weakly stationary (e.g. for financial time series the returns, not the prices, should be analysed). The state-dependent probabilities of the observations are modelled with Gaussian probability density functions (Rabiner, 1989). The implemented analysis procedure includes: (i) setting the initial model parameters and loading the data (function hmmsetcont), (ii) repeated execution of the Baum-Welch algorithm (function baumwelchcont), and (iii) execution of the Viterbi algorithm (function viterbicont). The function baumwelchcont allows the model parameters to be inspected after each Baum-Welch iteration and accumulates information on the model's evolution. The model object can be analysed with tailored print, summary, and plot functions (S3 methods). For details on HMMs, see the publications by Viterbi (1967), Baum et al. (1970), and Rabiner (1989).
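To make the decoding step concrete, here is a minimal sketch of the Viterbi algorithm for a two-state Gaussian-emission HMM. It is written in Python purely for illustration (the package itself is R, and this is not its implementation); the parameter names pi, A, mu, var mirror the Pi, A, Mu, and Var columns of the model's parameter table.

```python
import math

def gauss_logpdf(x, mu, var):
    # Log-density of a Gaussian observation (state-dependent emission)
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

def viterbi(obs, pi, A, mu, var):
    """Most likely state path for a Gaussian-emission HMM (log-space)."""
    n_states = len(pi)
    # delta[s] = best log-probability of any path ending in state s
    delta = [math.log(pi[s]) + gauss_logpdf(obs[0], mu[s], var[s])
             for s in range(n_states)]
    psi = []  # back-pointers, one list per time step
    for x in obs[1:]:
        back, new_delta = [], []
        for s in range(n_states):
            best_prev = max(range(n_states),
                            key=lambda r: delta[r] + math.log(A[r][s]))
            back.append(best_prev)
            new_delta.append(delta[best_prev] + math.log(A[best_prev][s])
                             + gauss_logpdf(x, mu[s], var[s]))
        delta, _ = new_delta, psi.append(back)
    # Backtrack from the best final state
    path = [max(range(n_states), key=lambda s: delta[s])]
    for back in reversed(psi):
        path.append(back[path[-1]])
    path.reverse()
    return path
```

With well-separated state means (e.g. mu = [2, -2]) the emission terms dominate the transition penalties, so observations near 2 are decoded as state 0 and observations near -2 as state 1.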
Mikhail A. Beketov
Maintainer: Mikhail A. Beketov <mikhail.beketov@gmx.de>
Baum, L.E., Petrie, T., Soules, G., and Weiss, N. 1970. A maximization technique occurring in the statistical analysis of probabilistic functions of Markov chains. The Annals of Mathematical Statistics. 41: 164-171.
Rabiner, L.R. 1989. A tutorial on hidden Markov models and selected applications in speech recognition. Proceedings of the IEEE. 77: 257-286.
Viterbi, A.J. 1967. Error bounds for convolutional codes and an asymptotically optimum decoding algorithm. IEEE Transactions on Information Theory. 13: 260-269.
Functions: hmmsetcont, baumwelchcont, viterbicont, statesDistributionsPlot, and logreturns.
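The conversion from prices to a weakly stationary series can be sketched as follows. This is an illustrative Python stand-in, assuming logreturns computes log returns r_t = ln(p_t / p_{t-1}), the usual definition suggested by the function's name:

```python
import math

def log_returns(prices):
    # r_t = ln(p_t / p_{t-1}); the result has one fewer value
    # than the price series
    return [math.log(p1 / p0) for p0, p1 in zip(prices, prices[1:])]
```

Unlike raw prices, log returns of a typical price series fluctuate around a roughly constant mean, which is why the package expects returns as input.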
# Step-by-step analysis example.
Returns <- logreturns(Prices)    # Getting a stationary process
Returns <- Returns * 10          # Scaling the values
hmm <- hmmsetcont(Returns)       # Creating an HMM object
print(hmm)                       # Checking the initial parameters
for (i in 1:6) {hmm <- baumwelchcont(hmm)}  # Baum-Welch is
# executed 6 times and the results are accumulated
print(hmm)                       # Checking the accumulated parameters
summary(hmm)                     # Getting more detailed information
hmmcomplete <- viterbicont(hmm)  # Viterbi execution
statesDistributionsPlot(hmmcomplete, sc = 10)  # PDFs of
# the whole data set and the two states are plotted
par(mfrow = c(2, 1))
plot(hmmcomplete, Prices, ylabel = "Price")
plot(hmmcomplete, ylabel = "Returns")  # The revealed
# Markov chain and the observations are plotted
The number of Baum-Welch iterations: 0
The parameters accumulated so far:
Pi1 Pi2 A11 A12 A21 A22 Mu1 Mu2 Var1 Var2
[1,] 0.5 0.5 0.7 0.3 0.3 0.7 5 -5 10 10
The results accumulated so far:
P AIC SBIC
The Viterbi algorithm was not yet executed
The number of Baum-Welch iterations: 6
The parameters accumulated so far:
Pi1 Pi2 A11 A12 A21 A22 Mu1 Mu2 Var1 Var2
[1,] 0.50 0.50 0.70 0.30 0.30 0.70 5.00 -5.00 10.00 10.00
[2,] 0.52 0.48 0.71 0.29 0.31 0.69 0.12 -0.09 0.16 0.24
[3,] 0.52 0.48 0.73 0.27 0.31 0.69 0.13 -0.12 0.12 0.28
[4,] 0.48 0.52 0.77 0.23 0.30 0.70 0.14 -0.15 0.10 0.31
[5,] 0.37 0.63 0.81 0.19 0.28 0.72 0.14 -0.17 0.08 0.33
[6,] 0.21 0.79 0.84 0.16 0.25 0.75 0.14 -0.18 0.07 0.35
[7,] 0.08 0.92 0.87 0.13 0.22 0.78 0.14 -0.17 0.07 0.36
The results accumulated so far:
P AIC SBIC
[1,] 2.717139e-240 1123.2417 1205.6016
[2,] 7.527195e-45 223.1956 305.5555
[3,] 5.698025e-43 214.5421 296.9020
[4,] 2.149158e-41 207.2818 289.6417
[5,] 2.338101e-40 202.5081 284.8680
[6,] 1.225023e-39 199.1957 281.5556
The Viterbi algorithm was not yet executed
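The P column in the accumulated results is the likelihood of the observation sequence under the model at each iteration; it grows monotonically as Baum-Welch converges. As an illustration of how such a likelihood is obtained, here is a minimal Python sketch of the forward algorithm (Rabiner, 1989) for Gaussian emissions; it is not the package's implementation, and a production version would rescale alpha at each step to avoid the underflow visible in values as small as 1e-240:

```python
import math

def gauss_pdf(x, mu, var):
    # Gaussian emission density for one observation
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2 * math.pi * var)

def forward_likelihood(obs, pi, A, mu, var):
    """P(obs | model) via the forward algorithm.
    alpha[s] = P(obs so far, current state = s)."""
    n = range(len(pi))
    alpha = [pi[s] * gauss_pdf(obs[0], mu[s], var[s]) for s in n]
    for x in obs[1:]:
        alpha = [gauss_pdf(x, mu[s], var[s]) *
                 sum(alpha[r] * A[r][s] for r in n)
                 for s in n]
    return sum(alpha)
```

Summing over all state paths (rather than taking the single best path, as Viterbi does) is what makes this the sequence likelihood.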
The number of observations: 167
The mean of observations: 0.01687374
The SD of observations: 0.4567522
The max and min of observations: 1.023066 and -1.856365
The number of Baum-Welch iterations: 6
The Viterbi algorithm was not yet executed
The parameters accumulated so far:
Pi1 Pi2 A11 A12 A21 A22 Mu1 Mu2 Var1 Var2
[1,] 0.50 0.50 0.70 0.30 0.30 0.70 5.00 -5.00 10.00 10.00
[2,] 0.52 0.48 0.71 0.29 0.31 0.69 0.12 -0.09 0.16 0.24
[3,] 0.52 0.48 0.73 0.27 0.31 0.69 0.13 -0.12 0.12 0.28
[4,] 0.48 0.52 0.77 0.23 0.30 0.70 0.14 -0.15 0.10 0.31
[5,] 0.37 0.63 0.81 0.19 0.28 0.72 0.14 -0.17 0.08 0.33
[6,] 0.21 0.79 0.84 0.16 0.25 0.75 0.14 -0.18 0.07 0.35
[7,] 0.08 0.92 0.87 0.13 0.22 0.78 0.14 -0.17 0.07 0.36
The results accumulated so far:
P AIC SBIC HQIC
[1,] 2.717139e-240 1123.2417 1205.6016 1135.8969
[2,] 7.527195e-45 223.1956 305.5555 235.8509
[3,] 5.698025e-43 214.5421 296.9020 227.1973
[4,] 2.149158e-41 207.2818 289.6417 219.9371
[5,] 2.338101e-40 202.5081 284.8680 215.1634
[6,] 1.225023e-39 199.1957 281.5556 211.8510
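The information criteria in the results table follow the standard definitions built on the log-likelihood. As a check, the first row can be reproduced from the printed likelihood P = 2.717139e-240 and n = 167 observations, assuming the 10 free parameters of the parameter table enter the AIC and HQIC penalties; the printed SBIC matches a 20-parameter ln(n) penalty, which appears to be a package-specific convention:

```python
import math

n = 167                            # number of observations (from summary)
k = 10                             # parameters: Pi1..Pi2, A11..A22, Mu, Var
log_lik = math.log(2.717139e-240)  # P from the first results row

aic  = -2 * log_lik + 2 * k                          # Akaike
hqic = -2 * log_lik + 2 * k * math.log(math.log(n))  # Hannan-Quinn
sbic = -2 * log_lik + 2 * k * math.log(n)            # matches printed SBIC

print(round(aic, 4), round(sbic, 4), round(hqic, 4))
# → 1123.2417 1205.6016 1135.8969
```

Lower values indicate a better trade-off between fit and model complexity, so the decreasing criteria across the six iterations show the model improving.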