Item Barcode | Call Number | Material Type | Item Category |
---|---|---|---|
30000010226352 | QA280 Z83 2009 | Open Access Book | Book |
30000010117592 | QA280 Z83 2009 | Open Access Book | Book |
Summary
Reveals How HMMs Can Be Used as General-Purpose Time Series Models
Implements all methods in R
*Hidden Markov Models for Time Series: An Introduction Using R* applies hidden Markov models (HMMs) to a wide range of time series types, from continuous-valued, circular, and multivariate series to binary data, bounded and unbounded counts, and categorical observations. It also discusses how to use the freely available computing environment R to carry out computations for parameter estimation, model selection and checking, decoding, and forecasting.
Illustrates the methodology in action
After presenting the simple Poisson-HMM, the book covers estimation, forecasting, decoding, prediction, model selection, and Bayesian inference. Through examples and applications, the authors describe how to extend and generalize the basic model so that it can be applied in a rich variety of situations. They also provide R code for some of the examples, so that the code can be adapted to similar applications.
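The likelihood computation at the heart of the methods above is a forward recursion with rescaling at each step so the probabilities never underflow. The book's own implementations are in R (see Appendix A); as a rough language-neutral illustration, here is a minimal Python sketch of the scaled log-likelihood for a stationary Poisson-HMM. The function names and toy parameters are mine, not the book's:

```python
import math

def poisson_pmf(lam, x):
    # Poisson probability mass: P(X = x) when X ~ Poisson(lam)
    return math.exp(-lam) * lam ** x / math.factorial(x)

def hmm_loglik(obs, gamma, delta, lams):
    """Log-likelihood of a Poisson-HMM via the scaled forward recursion.

    gamma : m x m transition probability matrix (rows sum to 1)
    delta : length-m initial (e.g. stationary) state distribution
    lams  : length-m state-dependent Poisson means
    """
    m = len(lams)
    # forward vector at time 1: delta_j * p_j(x_1)
    phi = [delta[j] * poisson_pmf(lams[j], obs[0]) for j in range(m)]
    s = sum(phi)
    ll = math.log(s)          # accumulate the log of each scale factor
    phi = [v / s for v in phi]
    for x in obs[1:]:
        # propagate through the transition matrix, weight by the
        # state-dependent density, then rescale to avoid underflow
        phi = [sum(phi[i] * gamma[i][j] for i in range(m)) * poisson_pmf(lams[j], x)
               for j in range(m)]
        s = sum(phi)
        ll += math.log(s)
        phi = [v / s for v in phi]
    return ll
```

This log-likelihood can then be handed to a general-purpose numerical optimizer, which is essentially the direct-maximization strategy the book develops in Chapter 3.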
Effectively interpret data using HMMs
This book illustrates the wonderful flexibility of HMMs as general-purpose models for time series data. It provides a broad understanding of the models and their uses.
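One of the uses the summary mentions is decoding: recovering the most plausible sequence of hidden states behind the observations. Global decoding is done with the Viterbi algorithm (Section 5.3.2); the book gives an R version in Appendix A.2.4, and the following is a hedged Python sketch in log space, with illustrative parameter names of my own choosing:

```python
import math

def viterbi(obs, gamma, delta, lams):
    """Global decoding of a Poisson-HMM: the single most likely state
    sequence, computed in log space to avoid numerical underflow."""
    m = len(lams)

    def logp(j, x):
        # log of the Poisson(lams[j]) probability mass at x
        return -lams[j] + x * math.log(lams[j]) - math.lgamma(x + 1)

    # xi[j] = log-probability of the best path ending in state j
    xi = [math.log(delta[j]) + logp(j, obs[0]) for j in range(m)]
    back = []  # backpointers, one list per transition
    for x in obs[1:]:
        ptr = [max(range(m), key=lambda i, j=j: xi[i] + math.log(gamma[i][j]))
               for j in range(m)]
        xi = [xi[ptr[j]] + math.log(gamma[ptr[j]][j]) + logp(j, x)
              for j in range(m)]
        back.append(ptr)
    # backtrack from the best final state
    states = [max(range(m), key=lambda j: xi[j])]
    for ptr in reversed(back):
        states.append(ptr[states[-1]])
    return list(reversed(states))
```

For example, with two states whose Poisson means are far apart, a run of large counts in the middle of a low-count series is decoded as a visit to the high-mean state.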
Table of Contents
Preface | p. xvii |
Notation and abbreviations | p. xxi |
Part 1 Model structure, properties and methods | p. 1 |
1 Preliminaries: mixtures and Markov chains | p. 3 |
1.1 Introduction | p. 3 |
1.2 Independent mixture models | p. 6 |
1.2.1 Definition and properties | p. 6 |
1.2.2 Parameter estimation | p. 9 |
1.2.3 Unbounded likelihood in mixtures | p. 10 |
1.2.4 Examples of fitted mixture models | p. 11 |
1.3 Markov chains | p. 15 |
1.3.1 Definitions and example | p. 16 |
1.3.2 Stationary distributions | p. 18 |
1.3.3 Reversibility | p. 19 |
1.3.4 Autocorrelation function | p. 19 |
1.3.5 Estimating transition probabilities | p. 20 |
1.3.6 Higher-order Markov chains | p. 22 |
Exercises | p. 24 |
2 Hidden Markov models: definition and properties | p. 29 |
2.1 A simple hidden Markov model | p. 29 |
2.2 The basics | p. 30 |
2.2.1 Definition and notation | p. 30 |
2.2.2 Marginal distributions | p. 32 |
2.2.3 Moments | p. 34 |
2.3 The likelihood | p. 35 |
2.3.1 The likelihood of a two-state Bernoulli-HMM | p. 35 |
2.3.2 The likelihood in general | p. 37 |
2.3.3 The likelihood when data are missing at random | p. 39 |
2.3.4 The likelihood when observations are interval-censored | p. 40 |
Exercises | p. 41 |
3 Estimation by direct maximization of the likelihood | p. 45 |
3.1 Introduction | p. 45 |
3.2 Scaling the likelihood computation | p. 46 |
3.3 Maximization subject to constraints | p. 47 |
3.3.1 Reparametrization to avoid constraints | p. 47 |
3.3.2 Embedding in a continuous-time Markov chain | p. 49 |
3.4 Other problems | p. 49 |
3.4.1 Multiple maxima in the likelihood | p. 49 |
3.4.2 Starting values for the iterations | p. 50 |
3.4.3 Unbounded likelihood | p. 50 |
3.5 Example: earthquakes | p. 50 |
3.6 Standard errors and confidence intervals | p. 53 |
3.6.1 Standard errors via the Hessian | p. 53 |
3.6.2 Bootstrap standard errors and confidence intervals | p. 55 |
3.7 Example: parametric bootstrap | p. 55 |
Exercises | p. 57 |
4 Estimation by the EM algorithm | p. 59 |
4.1 Forward and backward probabilities | p. 59 |
4.1.1 Forward probabilities | p. 60 |
4.1.2 Backward probabilities | p. 61 |
4.1.3 Properties of forward and backward probabilities | p. 62 |
4.2 The EM algorithm | p. 63 |
4.2.1 EM in general | p. 63 |
4.2.2 EM for HMMs | p. 64 |
4.2.3 M step for Poisson- and normal-HMMs | p. 66 |
4.2.4 Starting from a specified state | p. 67 |
4.2.5 EM for the case in which the Markov chain is stationary | p. 67 |
4.3 Examples of EM applied to Poisson-HMMs | p. 68 |
4.3.1 Earthquakes | p. 68 |
4.3.2 Foetal movement counts | p. 70 |
4.4 Discussion | p. 72 |
Exercises | p. 73 |
5 Forecasting, decoding and state prediction | p. 75 |
5.1 Conditional distributions | p. 76 |
5.2 Forecast distributions | p. 77 |
5.3 Decoding | p. 80 |
5.3.1 State probabilities and local decoding | p. 80 |
5.3.2 Global decoding | p. 82 |
5.4 State prediction | p. 86 |
Exercises | p. 87 |
6 Model selection and checking | p. 89 |
6.1 Model selection by AIC and BIC | p. 89 |
6.2 Model checking with pseudo-residuals | p. 92 |
6.2.1 Introducing pseudo-residuals | p. 93 |
6.2.2 Ordinary pseudo-residuals | p. 96 |
6.2.3 Forecast pseudo-residuals | p. 97 |
6.3 Examples | p. 98 |
6.3.1 Ordinary pseudo-residuals for the earthquakes | p. 98 |
6.3.2 Dependent ordinary pseudo-residuals | p. 98 |
6.4 Discussion | p. 100 |
Exercises | p. 101 |
7 Bayesian inference for Poisson-HMMs | p. 103 |
7.1 Applying the Gibbs sampler to Poisson-HMMs | p. 103 |
7.1.1 Generating sample paths of the Markov chain | p. 105 |
7.1.2 Decomposing observed counts | p. 106 |
7.1.3 Updating the parameters | p. 106 |
7.2 Bayesian estimation of the number of states | p. 106 |
7.2.1 Use of the integrated likelihood | p. 107 |
7.2.2 Model selection by parallel sampling | p. 108 |
7.3 Example: earthquakes | p. 108 |
7.4 Discussion | p. 110 |
Exercises | p. 112 |
8 Extensions of the basic hidden Markov model | p. 115 |
8.1 Introduction | p. 115 |
8.2 HMMs with general univariate state-dependent distribution | p. 116 |
8.3 HMMs based on a second-order Markov chain | p. 118 |
8.4 HMMs for multivariate series | p. 119 |
8.4.1 Series of multinomial-like observations | p. 119 |
8.4.2 A model for categorical series | p. 121 |
8.4.3 Other multivariate models | p. 122 |
8.5 Series that depend on covariates | p. 125 |
8.5.1 Covariates in the state-dependent distributions | p. 125 |
8.5.2 Covariates in the transition probabilities | p. 126 |
8.6 Models with additional dependencies | p. 128 |
Exercises | p. 129 |
Part 2 Applications | p. 133 |
9 Epileptic seizures | p. 135 |
9.1 Introduction | p. 135 |
9.2 Models fitted | p. 135 |
9.3 Model checking by pseudo-residuals | p. 138 |
Exercises | p. 140 |
10 Eruptions of the Old Faithful geyser | p. 141 |
10.1 Introduction | p. 141 |
10.2 Binary time series of short and long eruptions | p. 141 |
10.2.1 Markov chain models | p. 142 |
10.2.2 Hidden Markov models | p. 144 |
10.2.3 Comparison of models | p. 147 |
10.2.4 Forecast distributions | p. 148 |
10.3 Normal-HMMs for durations and waiting times | p. 149 |
10.4 Bivariate model for durations and waiting times | p. 152 |
Exercises | p. 153 |
11 Drosophila speed and change of direction | p. 155 |
11.1 Introduction | p. 155 |
11.2 Von Mises distributions | p. 156 |
11.3 Von Mises-HMMs for the two subjects | p. 157 |
11.4 Circular autocorrelation functions | p. 158 |
11.5 Bivariate model | p. 161 |
Exercises | p. 165 |
12 Wind direction at Koeberg | p. 167 |
12.1 Introduction | p. 167 |
12.2 Wind direction classified into 16 categories | p. 167 |
12.2.1 Three HMMs for hourly averages of wind direction | p. 167 |
12.2.2 Model comparisons and other possible models | p. 170 |
12.2.3 Conclusion | p. 173 |
12.3 Wind direction as a circular variable | p. 174 |
12.3.1 Daily at hour 24: von Mises-HMMs | p. 174 |
12.3.2 Modelling hourly change of direction | p. 176 |
12.3.3 Transition probabilities varying with lagged speed | p. 176 |
12.3.4 Concentration parameter varying with lagged speed | p. 177 |
Exercises | p. 180 |
13 Models for financial series | p. 181 |
13.1 Thinly traded shares | p. 181 |
13.1.1 Univariate models | p. 181 |
13.1.2 Multivariate models | p. 183 |
13.1.3 Discussion | p. 185 |
13.2 Multivariate HMM for returns on four shares | p. 186 |
13.3 Stochastic volatility models | p. 190 |
13.3.1 Stochastic volatility models without leverage | p. 190 |
13.3.2 Application: FTSE 100 returns | p. 192 |
13.3.3 Stochastic volatility models with leverage | p. 193 |
13.3.4 Application: TOPIX returns | p. 195 |
13.3.5 Discussion | p. 197 |
14 Births at Edendale Hospital | p. 199 |
14.1 Introduction | p. 199 |
14.2 Models for the proportion Caesarean | p. 199 |
14.3 Models for the total number of deliveries | p. 205 |
14.4 Conclusion | p. 208 |
15 Homicides and suicides in Cape Town | p. 209 |
15.1 Introduction | p. 209 |
15.2 Firearm homicides as a proportion of all homicides, suicides and legal intervention homicides | p. 209 |
15.3 The number of firearm homicides | p. 211 |
15.4 Firearm homicide and suicide proportions | p. 213 |
15.5 Proportion in each of the five categories | p. 217 |
16 Animal behaviour model with feedback | p. 219 |
16.1 Introduction | p. 219 |
16.2 The model | p. 220 |
16.3 Likelihood evaluation | p. 222 |
16.3.1 The likelihood as a multiple sum | p. 223 |
16.3.2 Recursive evaluation | p. 223 |
16.4 Parameter estimation by maximum likelihood | p. 224 |
16.5 Model checking | p. 224 |
16.6 Inferring the underlying state | p. 225 |
16.7 Models for a heterogeneous group of subjects | p. 226 |
16.7.1 Models assuming some parameters to be constant across subjects | p. 226 |
16.7.2 Mixed models | p. 227 |
16.7.3 Inclusion of covariates | p. 227 |
16.8 Other modifications or extensions | p. 228 |
16.8.1 Increasing the number of states | p. 228 |
16.8.2 Changing the nature of the state-dependent distribution | p. 228 |
16.9 Application to caterpillar feeding behaviour | p. 229 |
16.9.1 Data description and preliminary analysis | p. 229 |
16.9.2 Parameter estimates and model checking | p. 229 |
16.9.3 Runlength distributions | p. 233 |
16.9.4 Joint models for seven subjects | p. 235 |
16.10 Discussion | p. 236 |
A Examples of R code | p. 239 |
A.1 Stationary Poisson-HMM, numerical maximization | p. 239 |
A.1.1 Transform natural parameters to working | p. 240 |
A.1.2 Transform working parameters to natural | p. 240 |
A.1.3 Log-likelihood of a stationary Poisson-HMM | p. 240 |
A.1.4 ML estimation of a stationary Poisson-HMM | p. 241 |
A.2 More on Poisson-HMMs, including EM | p. 242 |
A.2.1 Generate a realization of a Poisson-HMM | p. 242 |
A.2.2 Forward and backward probabilities | p. 242 |
A.2.3 EM estimation of a Poisson-HMM | p. 243 |
A.2.4 Viterbi algorithm | p. 244 |
A.2.5 Conditional state probabilities | p. 244 |
A.2.6 Local decoding | p. 245 |
A.2.7 State prediction | p. 245 |
A.2.8 Forecast distributions | p. 246 |
A.2.9 Conditional distribution of one observation given the rest | p. 246 |
A.2.10 Ordinary pseudo-residuals | p. 247 |
A.3 Bivariate normal state-dependent distributions | p. 248 |
A.3.1 Transform natural parameters to working | p. 248 |
A.3.2 Transform working parameters to natural | p. 249 |
A.3.3 Discrete log-likelihood | p. 249 |
A.3.4 MLEs of the parameters | p. 250 |
A.4 Categorical HMM, constrained optimization | p. 250 |
A.4.1 Log-likelihood | p. 251 |
A.4.2 MLEs of the parameters | p. 252 |
B Some proofs | p. 253 |
B.1 Factorization needed for forward probabilities | p. 253 |
B.2 Two results for backward probabilities | p. 255 |
B.3 Conditional independence of X_1^t and X_(t+1)^T | p. 256 |
References | p. 257 |
Author index | p. 267 |
Subject index | p. 271 |