Hey folks
For the next session we will focus on discrete HMMs. The next live session will be on Sunday the 25th.
Our goal is to fully understand this formula:
\begin{align}
p(y_{1:T}, z_{1:T} \mid \theta)
&= \mathrm{Cat}(z_1 \mid \pi)
\prod_{t=2}^T \mathrm{Cat}(z_t \mid A_{z_{t-1}})
\prod_{t=1}^T \mathrm{Cat}(y_t \mid B_{z_t})
\end{align}
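To make the factorization concrete, here is a minimal NumPy sketch (names like `hmm_joint_log_prob` are my own, not from the reading) that evaluates the joint probability above term by term: one initial-state factor, one transition factor per step, one emission factor per step.

```python
import numpy as np

def hmm_joint_log_prob(z, y, pi, A, B):
    """Log of p(y_{1:T}, z_{1:T} | theta) for a discrete HMM.

    z  : (T,) int array of hidden states
    y  : (T,) int array of observations
    pi : (K,)   initial distribution, pi[k] = p(z_1 = k)
    A  : (K, K) transition matrix, A[i, j] = p(z_t = j | z_{t-1} = i)
    B  : (K, V) emission matrix, B[k, v] = p(y_t = v | z_t = k)
    """
    lp = np.log(pi[z[0]])                    # Cat(z_1 | pi)
    lp += np.log(A[z[:-1], z[1:]]).sum()     # prod_{t=2}^T Cat(z_t | A_{z_{t-1}})
    lp += np.log(B[z, y]).sum()              # prod_{t=1}^T Cat(y_t | B_{z_t})
    return lp
```

Working in log space avoids underflow for long sequences, which is the same reason the filtering recursions in the reading are normalized at each step.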
We’ll do that with the following reading material:
- Casino HMMs from Dynamax
- A complete read-through of Exploring Hidden Markov Models
Focus on:
- Filtering (forwards algorithm)
- Smoothing (forwards-backwards algorithm)
- Most likely state sequence (Viterbi algorithm)
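As a warm-up for the first bullet, here is a sketch of the normalized forward (filtering) pass, again in plain NumPy with made-up names (`forward_filter` is not from Dynamax). It returns the filtered posteriors p(z_t | y_{1:t}) and the log-likelihood log p(y_{1:T}), which is the sum of the per-step normalizers.

```python
import numpy as np

def forward_filter(y, pi, A, B):
    """Normalized forward algorithm for a discrete HMM.

    Returns alpha, log_lik where
      alpha[t] = p(z_t | y_{1:t})   (filtered posterior)
      log_lik  = log p(y_{1:T})     (sum of log normalizers)
    """
    T, K = len(y), len(pi)
    alpha = np.zeros((T, K))
    log_lik = 0.0
    pred = pi                       # one-step predictive, starts at p(z_1) = pi
    for t in range(T):
        unnorm = pred * B[:, y[t]]  # condition on the new observation y_t
        c = unnorm.sum()            # normalizer c_t = p(y_t | y_{1:t-1})
        alpha[t] = unnorm / c
        log_lik += np.log(c)
        pred = alpha[t] @ A         # predict forward: p(z_{t+1} | y_{1:t})
    return alpha, log_lik
```

Smoothing then runs a backward pass over the same quantities, and Viterbi replaces the sums with maxes; the reading covers both in detail.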
As always, this is just my prior. If you have opinions or suggestions, please do leave them below!