Page 1

Learning to Detect Events with Markov-Modulated Poisson Processes

Ihler, Hutchins and Smyth (2007)

Page 2

Outline

Problem: finding unusual activity (events) in the rhythms of natural human activity

Method: unsupervised learning
A time-varying Poisson process modulated by a hidden Markov process (events)
A Bayesian framework for parameter learning

Page 3

Why is it hard?

Chicken-and-egg problem: where do we start?

Previous approach: a baseline, simple threshold model, which has severe limitations

Need to quantify the notion of unusual activity:
How unusual is a measurement?
How persistent is a deviating measurement?

Page 4

The Data Sets

Two data sets are used:

Building data: counts of people entering and exiting a building; 15 weeks of data in 30-minute time bins; 29 known events in the 15 weeks

Freeway traffic data: vehicle counts on a freeway on-ramp; 6 months of data in 5-minute time bins; 78 known events in the 6 months

Page 5

Building Data

Example day

Page 6

Building Data

Example week

Page 7

Freeway Traffic Data

Example day

Page 8

Freeway Traffic Data

Example week

Page 9

A naïve Poisson model

Is the data actually Poisson? In a Poisson distribution the mean equals the variance. Is this the case in our data?
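One way to check is to compare the per-bin sample mean and variance across weeks. A minimal sketch (not from the paper), assuming the counts are arranged in a hypothetical array `counts` of shape (num_weeks, bins_per_week):

```python
import numpy as np

def dispersion_by_bin(counts: np.ndarray) -> np.ndarray:
    """Variance-to-mean ratio per time-of-week bin (exactly 1 for ideal Poisson data)."""
    mean = counts.mean(axis=0)
    var = counts.var(axis=0, ddof=1)
    return var / np.maximum(mean, 1e-9)  # guard against empty bins

# For genuinely Poisson data the ratios hover around 1; real count data with
# embedded events is typically over-dispersed (ratios well above 1).
rng = np.random.default_rng(0)
print(dispersion_by_bin(rng.poisson(20, size=(15, 336))).mean())  # ~1.0
```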

Page 10

A Baseline Model

Use a simple threshold approach: we say there is an event if

P(N;λ) < ε
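As a concrete illustration of this test (a sketch, not the authors' code), using the Poisson pmf from SciPy; `lam` is the estimated rate for the bin and `eps` is the chosen threshold:

```python
from scipy.stats import poisson

def is_event(n: int, lam: float, eps: float = 1e-3) -> bool:
    """Flag a bin as an event if the observed count is improbable under Poisson(lam)."""
    return poisson.pmf(n, lam) < eps

# With a typical rate of 20 counts per bin, 45 is flagged but 22 is not.
print(is_event(45, lam=20.0))  # True
print(is_event(22, lam=20.0))  # False
```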

Page 11

Problems with this Approach

Hard to detect sustained small variations
Hard to capture event duration
Chicken-and-egg problem

Page 12

The model (1)

The observed count is the sum of a normal component N0(t) and an event component NE(t), assuming the processes are additive

...which is a fair assumption
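A minimal simulation sketch of this additive structure, N(t) = N0(t) + NE(t) (the rates, bin count and event window below are illustrative assumptions, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

T = 48                                                  # e.g. one day of 30-minute bins
lam = 10 + 15 * np.sin(np.linspace(0, np.pi, T))        # periodic "normal" rate lambda(t)
z = np.zeros(T, dtype=int)
z[20:26] = +1                                           # a positive event lasting six bins

gamma_rate = 25.0                                       # extra rate during the event
n0 = rng.poisson(lam)                                   # normal activity N0(t)
ne = np.where(z == +1, rng.poisson(gamma_rate, T), 0)   # event activity NE(t)
n_obs = n0 + ne                                         # observed counts N(t) = N0(t) + NE(t)
```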

Page 13

The model (2)

Page 14

What is a Markov Process?

[State-diagram figure: a two-state Markov chain with states A = Rainy and B = Sunny and transition probabilities 0.1 and 0.5]

Page 15

Modelling Events with a Markov Process

We define a three-state Markov chain; z(t) is the state at time t, and the three possible states are:

0 if there is no event
+1 if there is a positive event
-1 if there is a negative event

with transition matrix Mz
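A sketch of simulating such a three-state chain (the transition probabilities below are made-up placeholders, not the paper's estimates):

```python
import numpy as np

rng = np.random.default_rng(1)
states = np.array([0, +1, -1])              # no event, positive event, negative event

# Rows index the current state (in the order above), columns the next state.
M = np.array([[0.98, 0.01, 0.01],           # events are rare...
              [0.40, 0.60, 0.00],           # ...but tend to persist once started
              [0.40, 0.00, 0.60]])

def sample_chain(T: int) -> np.ndarray:
    """Sample z(1..T), starting in the 'no event' state."""
    z = np.empty(T, dtype=int)
    idx = 0
    for t in range(T):
        z[t] = states[idx]
        idx = rng.choice(3, p=M[idx])
    return z

print(sample_chain(20))
```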

Page 16

Details of the Markov Process

We give each row in the transition matrix a Dirichlet prior:

Given z(t), we can model NE(t) as Poisson with rate γ(t). We give γ(t) a Gamma prior Γ(γ; aE, bE), which is independent of t.

We can then marginalize out over γ(t), which yields a negative binomial distribution for NE(t).
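For reference, marginalizing a Poisson count over a Gamma-distributed rate gives a negative binomial distribution. A short derivation in the shape-rate parameterisation of Γ(γ; aE, bE) (the paper's exact parameterisation may differ):

```latex
P(N^E = n)
  = \int_0^\infty \mathrm{Pois}(n;\gamma)\,\Gamma(\gamma; a^E, b^E)\,d\gamma
  = \int_0^\infty \frac{\gamma^n e^{-\gamma}}{n!}
      \cdot \frac{(b^E)^{a^E}}{\Gamma(a^E)}\,\gamma^{a^E-1} e^{-b^E \gamma}\,d\gamma
  = \frac{\Gamma(n + a^E)}{\Gamma(a^E)\,n!}
      \left(\frac{b^E}{1+b^E}\right)^{a^E}
      \left(\frac{1}{1+b^E}\right)^{n}
```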

Page 17

Graphical Model of the Dependencies

Page 18

Learning the parameters

If we are given the hidden variables N0(t), NE(t) and z(t), we can compute MAP estimates or draw posterior samples of the parameters λ(t) and Mz.

So we can use MCMC: iterate between sampling the hidden variables (given the parameters) and the parameters (given the hidden variables).

Page 19

Sampling the hidden variables, given the parameters

Rough outline: first, use the forward-backward algorithm [Baum et al. 1970] to sample z(t); then, given z(t), determine N0(t) and NE(t) by sampling.
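For the second step, one standard way to attribute the observed count to the two components when z(t) = +1 uses the fact that, conditional on the sum of two independent Poisson variables, one of them is binomially distributed. A sketch, not the authors' code (negative events are handled differently in the paper and are omitted here); `lam` and `gamma` are the current sampled rates:

```python
import numpy as np

rng = np.random.default_rng(2)

def split_count(n_obs: int, z: int, lam: float, gamma: float):
    """Sample (N0, NE) given the total observed count and the event state z."""
    if z == +1:
        # N0 | N, z=+1  ~  Binomial(N, lam / (lam + gamma))
        n0 = rng.binomial(n_obs, lam / (lam + gamma))
        return n0, n_obs - n0
    # No positive event: attribute all observed activity to normal behaviour.
    return n_obs, 0

print(split_count(40, z=+1, lam=20.0, gamma=15.0))
```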

Page 20

Sampling the parameters, given the hidden variables

The conjugate prior distributions give us a straightforward way to compute the posteriors

Use the sufficient statistics of the data as (updating) parameters for the posterior:
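As an illustration of what such conjugate updates look like (a generic sketch with assumed hyperparameter names, not the paper's notation): a Gamma(a, b) prior on a Poisson rate updates to Gamma(a + sum of counts, b + number of bins), and a Dirichlet prior on a transition row updates by adding the observed transition counts.

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_rate(counts, a: float, b: float) -> float:
    """Draw a Poisson rate from its Gamma posterior, given counts attributed to it."""
    counts = np.asarray(counts)
    return rng.gamma(a + counts.sum(), 1.0 / (b + counts.size))  # scale = 1 / rate

def sample_transition_row(transition_counts, alpha) -> np.ndarray:
    """Draw one row of the transition matrix from its Dirichlet posterior."""
    return rng.dirichlet(np.asarray(alpha) + np.asarray(transition_counts))

print(sample_rate([18, 22, 25, 19], a=1.0, b=0.1))
print(sample_transition_row([95, 3, 2], alpha=[50.0, 1.0, 1.0]))
```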

Page 21

Prior distributions of zij and γ(t)

Markov-modulated Poisson processes are sensitive to the choice of priors for zij and γ(t)

For the domains of these models, we often have strong ideas about, e.g., what constitutes a "rare" event

Use these ideas to build strong priors into the model, in order to avoid overfitting and to adjust the threshold levels of event detection
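For example, a belief that events are rare and persist for a few bins can be encoded in the Dirichlet pseudocounts for the transition rows; the numbers below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

# Dirichlet pseudocounts for each row of the 3-state transition matrix
# (state order: no event, positive event, negative event).
alpha_prior = np.array([
    [1000.0,  5.0,  5.0],   # from "no event": leaving is rare (~1% of transitions)
    [  20.0, 80.0,  0.1],   # from "positive event": mean duration of roughly 5 bins
    [  20.0,  0.1, 80.0],   # from "negative event": mean duration of roughly 5 bins
])
```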

Page 22

Calculating Results

We are looking to detect unusual events; we can use our model to do this by calculating the posterior probability of an event at each time step.

We can then compare our predictions with the known event occurrences
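Given posterior samples of z(t) from the MCMC run, the per-bin event probability is simply the fraction of samples in which z(t) ≠ 0. A small sketch (array names and the 0.5 threshold are assumptions):

```python
import numpy as np

def event_probability(z_samples: np.ndarray) -> np.ndarray:
    """z_samples has shape (num_mcmc_samples, T); returns P(event) per time bin."""
    return (z_samples != 0).mean(axis=0)

def detected_events(z_samples: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Flag bins whose posterior event probability exceeds the chosen threshold."""
    return event_probability(z_samples) > threshold
```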

Page 23

Example Posterior Predictions (1)

Page 24

Example Posterior Predictions (2)

Page 25

Example Posterior Predictions (3)

Page 26

Comparison of Predicted Events with Known Events

Page 27

Other Possible Inferences

The model can be modified to test the degree of heterogeneity of the time process. We can ask questions like

Are all weekdays essentially the same? Are all afternoons essentially the same?

We can estimate event attendance
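One way such an attendance estimate could be read out of the sampler, under the assumption that attendance corresponds to the event count NE(t) summed over the event's duration (a sketch, not the paper's exact procedure):

```python
import numpy as np

def estimated_attendance(ne_samples: np.ndarray, start: int, end: int):
    """ne_samples has shape (num_mcmc_samples, T); returns the posterior mean and
    standard deviation of the total event count over bins [start, end)."""
    totals = ne_samples[:, start:end].sum(axis=1)
    return totals.mean(), totals.std()
```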

Page 28

Conclusion

The model is much more effective than the threshold approach

Good detection rate
Difficult to assess the false-positive rate

Possibilities for extension

Page 29

Questions