Page 1

Lecture 24: Hidden Markov Models

Department of Electrical Engineering
Princeton University

January 8, 2014

ELE 525: Random Processes in Information Systems

Hisashi Kobayashi

Page 2

In some applications of a Markov chain model, we may not be able to directly observe a sequence of states, or may not even know the structure of the Markov model and the model parameters.

The observable variable may be a probabilistic function of the underlying Markov state. Such a model is called a hidden Markov model (HMM).

In this chapter we study how to estimate the states (or the state sequence) and the HMM parameters, such as the state transition probabilities, from observable data.
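
As a concrete illustration, here is a minimal Python sketch of a hypothetical two-state HMM; the model and all parameter values are made up for illustration. It draws a hidden state sequence together with the observable output, which is a probabilistic function of the state:

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state HMM; all numerical values are illustrative.
P = np.array([[0.9, 0.1],      # P[i, j] = P(S_{t+1} = j | S_t = i)
              [0.2, 0.8]])
B = np.array([[0.7, 0.3],      # B[i, k] = P(Y_t = k | S_t = i)
              [0.1, 0.9]])
pi = np.array([0.5, 0.5])      # initial state distribution

def sample_hmm(T):
    """Draw a hidden state sequence and the corresponding observable output."""
    s = np.empty(T, dtype=int)
    y = np.empty(T, dtype=int)
    s[0] = rng.choice(2, p=pi)
    y[0] = rng.choice(2, p=B[s[0]])
    for t in range(1, T):
        s[t] = rng.choice(2, p=P[s[t - 1]])  # hidden Markov transition
        y[t] = rng.choice(2, p=B[s[t]])      # output depends only on the current state
    return s, y

states, obs = sample_hmm(10)  # in practice only obs is observable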

Page 4

In the figure on the next slide, we show the state transition diagram of a time-homogeneous Markov chain and its trellis diagram.

Page 5

We denote the state sequence in the period 𝒯 = {1, 2, …, T}, defined in (20.2), by s = (s_1, s_2, …, s_T).

Page 14

We assume that the channel output Y_t has the same alphabet as the channel input (the encoder output) O_t, i.e., Y_t and O_t take values in the same set of symbols.

Page 15

A discrete memoryless channel is called a binary symmetric channel (BSC) if

p(1|0) = p(0|1) = ε,

where ε is the crossover (error) probability.
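
To make this concrete, a BSC amounts to flipping each input bit independently with probability ε. This is a small Python sketch with made-up parameter values:

import numpy as np

rng = np.random.default_rng(1)
eps = 0.1                                 # crossover probability ε (illustrative)
o = rng.integers(0, 2, size=20)           # channel input bits O_t
flips = rng.random(o.shape) < eps         # each bit flips independently with prob. ε
y = o ^ flips.astype(int)                 # channel output Y_t, over the same {0, 1} alphabet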

Page 16

The following four basic problems arise when we apply an HMM to characterize a system of interest: evaluating the probability of an observed output sequence, estimating the state at a given time from the observations, finding the most likely state sequence, and estimating the HMM parameters.

Page 17

We can write the probability of observing the output sequence y = (y_1, y_2, …, y_T) as

P(y) = Σ_s P(y | s) P(s),    (20.50)

where the sum is taken over all M^T possible state sequences s = (s_1, s_2, …, s_T), M being the number of states.

Page 18

or, factoring P(y | s) P(s) term by term and interchanging the sums and products,

P(y) = Σ_{s_T} b_{s_T}(y_T) Σ_{s_{T-1}} p_{s_{T-1} s_T} b_{s_{T-1}}(y_{T-1}) ⋯ Σ_{s_1} p_{s_1 s_2} b_{s_1}(y_1) π_{s_1},    (20.52)

where p_{ij} = P(S_{t+1} = j | S_t = i) is the state transition probability, b_j(y) = P(Y_t = y | S_t = j) is the output probability in state j, and π_i = P(S_1 = i) is the initial state probability. Evaluating the nested sums from the innermost one outward suggests a computationally efficient procedure, called the forward recursion algorithm or forward algorithm, for calculating the likelihood (20.50).

Page 19

An alternative derivation of (20.52) can be made by defining the forward variable

α_t(i) = P(y_1, …, y_t, S_t = i),

which satisfies the recursion

α_{t+1}(j) = Σ_i P(y_1, …, y_t, S_t = i) P(S_{t+1} = j | y_1, …, y_t, S_t = i) P(y_{t+1} | y_1, …, y_t, S_t = i, S_{t+1} = j)
           = Σ_i α_t(i) p_{ij} P(y_{t+1} | y_1, …, y_t, S_t = i, S_{t+1} = j)
           = Σ_i α_t(i) p_{ij} b_j(y_{t+1}).    (20.55)

The 2nd line follows since {S_t} is a Markov chain, so S_{t+1} depends only on S_t. The 3rd line follows because the HMM output Y_{t+1} depends only on the state S_{t+1}.

Page 20

Define the forward vector variable as a row vector:

α_t = (α_t(1), α_t(2), …, α_t(M)).

Then (20.55) can be written as

α_{t+1} = α_t P B(y_{t+1}),

where P = [p_{ij}] is the state transition probability matrix and B(y) = diag(b_1(y), b_2(y), …, b_M(y)).

The computational complexity is merely on the order of M²T operations, in contrast to the order of T M^T operations of the direct enumeration method based on (20.50).
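
The forward recursion is short enough to state directly in code. The following Python sketch, reusing the made-up two-state model from the earlier sketch, computes P(y) by the forward algorithm and checks it against direct enumeration over all M^T state sequences:

import itertools

import numpy as np

P = np.array([[0.9, 0.1], [0.2, 0.8]])   # transition matrix [p_ij] (illustrative)
B = np.array([[0.7, 0.3], [0.1, 0.9]])   # output matrix, b_j(y) = B[j, y]
pi = np.array([0.5, 0.5])                # initial state distribution
y = [0, 1, 1, 0, 1]                      # an observed output sequence

def forward_likelihood(y):
    """P(y) via the forward recursion: on the order of M^2 T operations."""
    alpha = pi * B[:, y[0]]              # α_1(j) = π_j b_j(y_1), a row vector
    for obs in y[1:]:
        alpha = (alpha @ P) * B[:, obs]  # α_{t+1} = α_t P B(y_{t+1})
    return alpha.sum()                   # P(y) = Σ_j α_T(j)

def enumeration_likelihood(y):
    """P(y) by direct enumeration over all M^T state sequences, as in (20.50)."""
    M, T = len(pi), len(y)
    total = 0.0
    for s in itertools.product(range(M), repeat=T):
        p = pi[s[0]] * B[s[0], y[0]]
        for t in range(1, T):
            p *= P[s[t - 1], s[t]] * B[s[t], y[t]]
        total += p
    return total

assert np.isclose(forward_likelihood(y), enumeration_likelihood(y))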

Page 22

Consider the time-reversed version of the above procedure by rearranging (20.52) as

P(y) = Σ_{s_1} π_{s_1} b_{s_1}(y_1) Σ_{s_2} p_{s_1 s_2} b_{s_2}(y_2) ⋯ Σ_{s_T} p_{s_{T-1} s_T} b_{s_T}(y_T),

in which the nested sums are evaluated from the innermost one, over s_T, outward. We define the backward variable by

β_t(i) = P(y_{t+1}, y_{t+2}, …, y_T | S_t = i), with β_T(i) = 1 for all i,

which satisfies β_t(i) = Σ_j p_{ij} b_j(y_{t+1}) β_{t+1}(j). Define the backward vector variable as a column vector

β_t = (β_t(1), β_t(2), …, β_t(M))^⊤.

Then we obtain the backward recursion formula:

β_t = P B(y_{t+1}) β_{t+1}.
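
As with the forward pass, this translates directly into code. The sketch below, again using the same made-up two-state model, runs the backward recursion and verifies that it reproduces the same likelihood through P(y) = Σ_i π_i b_i(y_1) β_1(i):

import numpy as np

P = np.array([[0.9, 0.1], [0.2, 0.8]])   # transition matrix (illustrative)
B = np.array([[0.7, 0.3], [0.1, 0.9]])   # output matrix, b_j(y) = B[j, y]
pi = np.array([0.5, 0.5])                # initial state distribution
y = [0, 1, 1, 0, 1]

def backward_variables(y):
    """All β_t for t = 1..T, via β_t = P B(y_{t+1}) β_{t+1}."""
    T, M = len(y), len(pi)
    beta = np.ones((T, M))               # β_T(i) = 1 for all i
    for t in range(T - 2, -1, -1):
        beta[t] = P @ (B[:, y[t + 1]] * beta[t + 1])
    return beta

beta = backward_variables(y)
likelihood = np.sum(pi * B[:, y[0]] * beta[0])  # P(y); matches the forward pass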

Page 23

Problem 20.11 Solution:

Page 24

The computational algorithm based on this recursion is called the forward-backward algorithm (FBA). (Problem 20.13)
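
The key identity behind the FBA is that α_t(i) β_t(i) = P(y_1, …, y_T, S_t = i), so that Σ_i α_t(i) β_t(i) = P(y) for every t, and the posterior state probabilities given the whole observation record are γ_t(i) = α_t(i) β_t(i) / P(y). A minimal sketch combining the two passes, once more with the made-up two-state model used above:

import numpy as np

P = np.array([[0.9, 0.1], [0.2, 0.8]])   # transition matrix (illustrative)
B = np.array([[0.7, 0.3], [0.1, 0.9]])   # output matrix, b_j(y) = B[j, y]
pi = np.array([0.5, 0.5])
y = [0, 1, 1, 0, 1]
T, M = len(y), len(pi)

alpha = np.zeros((T, M))                 # forward pass
alpha[0] = pi * B[:, y[0]]
for t in range(1, T):
    alpha[t] = (alpha[t - 1] @ P) * B[:, y[t]]

beta = np.ones((T, M))                   # backward pass
for t in range(T - 2, -1, -1):
    beta[t] = P @ (B[:, y[t + 1]] * beta[t + 1])

likelihood = alpha[-1].sum()             # P(y); equals (alpha[t] * beta[t]).sum() for any t
gamma = alpha * beta / likelihood        # γ_t(i) = P(S_t = i | y); each row sums to 1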

Page 25

Problem 20.13 Solution:
