

Autoregressive observations: HMM-AR

Conditional on the hidden variables $ s=\{s_0,\dots,s_T\}$ , the observations $ x=\{x_1,\dots,x_T\}$ in the simple HMM, eq. (2), are independent. For time-series modeling this independence assumption is sometimes inadequate. An obvious extension is shown in Fig. 7, where each observation additionally depends on the previous one. The $ \psi$ function now becomes

$\displaystyle \psi(s_t,x_t,s_{t-1},x_{t-1}) = p(x_t\vert x_{t-1},s_t)\,p(s_t\vert s_{t-1})$ (31)

and the joint distribution is

$\displaystyle p(x,s) = \prod_{t=1}^T \psi(s_t,x_t,s_{t-1},x_{t-1})$ (32)

where we define $ \psi(s_1,x_1,s_0,x_0)=\tilde p(x_1\vert s_1)\,p(s_1\vert s_0)\,p(s_0)$ .
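For concreteness, one common choice for the autoregressive emission density $ p(x_t\vert x_{t-1},s_t)$ , not specified in the text, is a state-dependent linear-Gaussian AR(1) model,

$\displaystyle p(x_t\vert x_{t-1},s_t=s) = \mathcal{N}\left(x_t;\, a_s x_{t-1} + b_s,\, \sigma_s^2\right)$

where each state $ s$ carries its own AR coefficient $ a_s$ , offset $ b_s$ and noise variance $ \sigma_s^2$ (illustrative symbols, not used elsewhere in the text).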
Again, we have forward and backward recursions for the functions

$\displaystyle \alpha_t(s) = p(x^t, s_t=s)$ (33)

$\displaystyle \beta_t(s) = p(x_{t+1},\dots,x_T\vert s_t=s, x_t)$ (34)
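The recursions themselves are not written out here, but they follow directly from the factorization in eq. (32), in analogy to the simple HMM:

$\displaystyle \alpha_t(s) = \sum_{s'} \psi(s,x_t,s',x_{t-1})\,\alpha_{t-1}(s') \qquad \beta_t(s) = \sum_{s'} \psi(s',x_{t+1},s,x_t)\,\beta_{t+1}(s')$

with $ \alpha_1(s)=\tilde p(x_1\vert s)\sum_{s_0}p(s_1=s\vert s_0)\,p(s_0)$ and $ \beta_T(s)=1$ .

As a rough illustration, here is a minimal sketch of the forward ($ \alpha$ ) pass in Python, assuming the Gaussian AR(1) emission above and taking pi to be the marginal $ p(s_1)$ ; the function and parameter names are illustrative and do not come from the text:

  import numpy as np

  def gauss(x, mean, sigma):
      # Gaussian density N(x; mean, sigma^2), evaluated elementwise
      return np.exp(-0.5 * ((x - mean) / sigma) ** 2) / (np.sqrt(2.0 * np.pi) * sigma)

  def hmm_ar_forward(x, A, pi, a, b, sigma):
      # Forward (alpha) recursion for an HMM with Gaussian AR(1) emissions (assumed form).
      #   x           : observations x_1..x_T, shape (T,)
      #   A           : transition probabilities, A[i, j] = p(s_t = j | s_{t-1} = i), shape (K, K)
      #   pi          : marginal initial-state distribution p(s_1), shape (K,)
      #   a, b, sigma : per-state AR coefficient, offset and noise std, each shape (K,)
      T, K = len(x), len(pi)
      alpha = np.zeros((T, K))
      # t = 1: alpha_1(s) = p~(x_1 | s) p(s_1 = s), with p~ taken as N(b_s, sigma_s^2)
      alpha[0] = gauss(x[0], b, sigma) * pi
      for t in range(1, T):
          mean = a * x[t - 1] + b  # emission mean a_s x_{t-1} + b_s for each state s
          # alpha_t(s) = p(x_t | x_{t-1}, s_t = s) * sum_{s'} alpha_{t-1}(s') A[s', s]
          alpha[t] = gauss(x[t], mean, sigma) * (alpha[t - 1] @ A)
      return alpha  # alpha[t - 1, s] = p(x^t, s_t = s)

In practice the $ \alpha_t$ are rescaled at each step to avoid numerical underflow on long series.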

Fig. 7: HMM-AR conditional dependencies. [image HMM-ar]


