State Estimation and Kalman Filtering
CS B659, Spring 2013
Kris Hauser

Transcript
Page 1:

State Estimation and Kalman Filtering, CS B659, Spring 2013, Kris Hauser

Page 2:

Motivation
• Observing a stream of data
  • Monitoring (of people, computer systems, etc.)
  • Surveillance, tracking
  • Finance & economics
  • Science
• Questions:
  • Modeling & forecasting
  • Handling partial and noisy observations

Page 3:

Markov Chains
• Sequence of probabilistic state variables X0, X1, X2, …
• E.g., robot's position; target's position and velocity; …

[Diagram: chain X0 → X1 → X2 → X3]

If we observe X1, then X0 is independent of X2, X3, … given X1.

P(Xt|Xt-1) is known as the transition model.

Page 4:
Page 5:

Inference in MC
• Prediction: what is the probability of a future state?

• P(Xt) = Σx0,…,xt-1 P(X0,…,Xt)
        = Σx0,…,xt-1 P(X0) Πi=1…t P(Xi|Xi-1)
        = Σxt-1 P(Xt|Xt-1) P(Xt-1)

• "Blurs" over time, and approaches the stationary distribution as t grows
• Limited prediction power
• Rate of blurring known as the mixing time

[Incremental approach: the last line computes P(Xt) from P(Xt-1) one step at a time; a numeric sketch follows below]
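Not part of the original slides: a minimal Python sketch of this incremental prediction step for a discrete chain. The two-state transition matrix T and the initial distribution are made-up values, chosen only to show the blurring toward the stationary distribution.

```python
import numpy as np

# Hypothetical 2-state chain: T[i, j] = P(X_t = j | X_{t-1} = i)
T = np.array([[0.9, 0.1],
              [0.2, 0.8]])

p = np.array([1.0, 0.0])            # P(X_0): all mass on state 0

# Incremental prediction: P(X_t) = sum_{x_{t-1}} P(X_t | x_{t-1}) P(x_{t-1})
for t in range(1, 31):
    p = p @ T
    if t in (1, 5, 30):
        print(f"P(X_{t}) = {p}")    # blurs toward the stationary distribution [2/3, 1/3]
```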

Page 6:

Modeling Partial Observability
• Hidden Markov Model (HMM)

[Diagram: hidden state variables X0 → X1 → X2 → X3, with observed variables O1, O2, O3 attached to X1, X2, X3]

P(Ot|Xt) is called the observation model (or sensor model).

Page 7:

Filtering
• Name comes from signal processing
• Goal: compute the probability distribution over the current state, given observations up to this point

[Diagram: X0 (distribution given), X1, X2 (query variable); observations O1, O2 (known); remaining states unknown]

Page 8:

Filtering
• Name comes from signal processing
• Goal: compute the probability distribution over the current state, given observations up to this point

• P(Xt|o1:t) = Σxt-1 P(xt-1|o1:t-1) P(Xt|xt-1,ot)
• P(Xt|xt-1,ot) = P(ot|xt-1,Xt) P(Xt|xt-1) / P(ot|xt-1) = α P(ot|Xt) P(Xt|xt-1)

[Diagram: X0 (distribution given), X1, X2 (query variable); observations O1, O2 (known); remaining states unknown]
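As an illustration (not from the slides), here is a minimal sketch of one discrete forward-filtering step in Python; the transition matrix T, observation model O, and the observation sequence are hypothetical.

```python
import numpy as np

T = np.array([[0.9, 0.1],           # T[i, j] = P(X_t = j | X_{t-1} = i)
              [0.2, 0.8]])
O = np.array([[0.75, 0.25],         # O[i, k] = P(o_t = k | X_t = i)
              [0.30, 0.70]])

def filter_step(prev_belief, obs):
    """P(X_t | o_1:t) = alpha * P(o_t | X_t) * sum_{x_{t-1}} P(X_t | x_{t-1}) P(x_{t-1} | o_1:t-1)."""
    predicted = prev_belief @ T               # prediction with the transition model
    unnormalized = O[:, obs] * predicted      # weight by the observation model
    return unnormalized / unnormalized.sum()  # normalization (the alpha above)

belief = np.array([0.5, 0.5])                 # initial belief over the hidden state
for obs in [0, 0, 1]:                         # hypothetical observation sequence
    belief = filter_step(belief, obs)
print(belief)
```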

Page 9:

Kalman Filtering
• In a nutshell:
  • Efficient probabilistic filtering in continuous state spaces
  • Linear Gaussian transition and observation models
  • Ubiquitous for state tracking with noisy sensors, e.g. radar, GPS, cameras

Page 10:

Hidden Markov Model for Robot Localization
• Use observations + transition dynamics to get a better idea of where the robot is at time t

[Diagram: hidden state variables X0 → X1 → X2 → X3, with observed variables z1, z2, z3]

Predict – observe – predict – observe…

Page 11:

Hidden Markov Model for Robot Localization
• Use observations + transition dynamics to get a better idea of where the robot is at time t
• Maintain a belief state bt over time
  • bt(x) = P(Xt=x | z1:t)

[Diagram: hidden state variables X0 → X1 → X2 → X3, with observed variables z1, z2, z3]

Predict – observe – predict – observe…

Page 12:

Bayesian Filtering with Belief States
• Compute bt, given zt and the prior belief bt-1

• Recursive filtering equation: bt(x) = (1/Z) P(zt|x) Σx' P(x|x') bt-1(x')

Page 13:

Bayesian Filtering with Belief States
• Compute bt, given zt and the prior belief bt-1

• Recursive filtering equation: bt(x) = (1/Z) P(zt|x) Σx' P(x|x') bt-1(x')
  • Predict P(Xt|z1:t-1) using the dynamics alone, then update via the observation zt

Page 14:

In Continuous State Spaces…
• Compute bt, given zt and the prior belief bt-1

• Continuous filtering equation: bt(x) = (1/Z) P(zt|x) ∫ P(x|x') bt-1(x') dx'

Page 15:

In Continuous State Spaces…
• Compute bt, given zt and the prior belief bt-1

• Continuous filtering equation: bt(x) = (1/Z) P(zt|x) ∫ P(x|x') bt-1(x') dx'

• How to evaluate this integral?
• How to calculate the normalization constant Z?
• How to even represent a belief state?

Page 16:

Key Representational Decisions
• Pick a method for representing distributions
  • Discrete: tables
  • Continuous: fixed parameterized classes vs. particle-based techniques
• Devise methods to perform key calculations (marginalization, conditioning) on the representation
  • Exact or approximate?

Page 17:

Gaussian Distribution
• Mean μ, standard deviation σ
• Distribution is denoted N(μ,σ)
• If X ~ N(μ,σ), then P(x) = exp(-(x - μ)²/(2σ²)) / Z
  • With a normalization factor Z = √(2π) σ
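A quick numerical check of this density (not in the slides); the helper gaussian_pdf and the chosen μ = 1, σ = 2 are illustrative.

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Density of N(mu, sigma): exp(-(x - mu)^2 / (2 sigma^2)) / (sqrt(2 pi) * sigma)."""
    return np.exp(-(x - mu)**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

xs = np.linspace(-10.0, 12.0, 200001)
dx = xs[1] - xs[0]
print(np.sum(gaussian_pdf(xs, mu=1.0, sigma=2.0)) * dx)   # ~= 1.0: the density is normalized
```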

Page 18:

Linear Gaussian Transition Model for Moving 1D Point
• Consider position and velocity xt, vt
• Time step h
• Without noise:
  xt+1 = xt + h vt
  vt+1 = vt
• With Gaussian noise of std σ1:
  P(xt+1|xt) ∝ exp(-(xt+1 - (xt + h vt))²/(2σ1²))
  i.e. xt+1 ~ N(xt + h vt, σ1)
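A minimal Python sketch of the corresponding prediction step on the position belief, assuming the velocity v is known and fixed; the function name predict_position and all numbers are illustrative, and the variance update anticipates the next slide (variances add).

```python
def predict_position(mu, var, v, h, sigma1):
    """Push belief N(mu, var) over x_t through x_{t+1} = x_t + h*v + noise, noise ~ N(0, sigma1^2)."""
    mu_next = mu + h * v            # the mean moves deterministically
    var_next = var + sigma1 ** 2    # the variances add under independent Gaussian noise
    return mu_next, var_next

# Illustrative numbers, not from the slides
mu, var = 0.0, 1.0
for _ in range(3):
    mu, var = predict_position(mu, var, v=1.0, h=0.1, sigma1=0.05)
print(mu, var)                      # the belief drifts with the velocity and spreads out over time
```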

Page 19:

Linear Gaussian Transition Model
• If the prior on position is Gaussian, then the posterior is also Gaussian

[Figure: a Gaussian prior shifted by vh and widened by the transition noise σ1]

N(μ,σ) → N(μ + vh, √(σ² + σ1²))   (the mean shifts by vh; the variances add)

Page 20:

Linear Gaussian Observation Model
• Position observation zt
• Gaussian noise of std σ2:
  zt ~ N(xt, σ2)

Page 21:

Linear Gaussian Observation Model
• If the prior on position is Gaussian, then the posterior is also Gaussian

  μ  →  (σ² z + σ2² μ) / (σ² + σ2²)
  σ² →  σ² σ2² / (σ² + σ2²)

[Figure: position prior, observation probability, and posterior probability]
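A minimal Python sketch of this measurement update, written in terms of the prior variance σ² and the observation noise variance σ2²; the function name update_position and the numbers are illustrative.

```python
def update_position(mu, var, z, sigma2):
    """Condition a Gaussian prior N(mu, var) on an observation z ~ N(x, sigma2^2).

    Posterior mean     = (var * z + sigma2^2 * mu) / (var + sigma2^2)
    Posterior variance =  var * sigma2^2 / (var + sigma2^2)
    """
    s2 = sigma2 ** 2
    mu_post = (var * z + s2 * mu) / (var + s2)
    var_post = var * s2 / (var + s2)
    return mu_post, var_post

# Illustrative numbers, not from the slides
print(update_position(mu=0.0, var=1.0, z=0.8, sigma2=0.5))
# The posterior mean lies between the prior mean and the observation (weighted by
# the inverse variances), and the posterior variance is smaller than either.
```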

Page 22:

Multivariate Gaussians
• Multivariate analog in N-D space
• Mean (vector) μ, covariance (matrix) Σ
• If X ~ N(μ,Σ), then P(x) = exp(-(x - μ)^T Σ^-1 (x - μ) / 2) / Z
  • With a normalization factor Z = (2π)^(N/2) |Σ|^(1/2)

Page 23:

Multivariate Linear Gaussian Process
• A linear transformation + multivariate Gaussian noise
• If the prior state distribution is Gaussian, then the posterior state distribution is Gaussian
• If we observe one component of a Gaussian, then its posterior is also Gaussian

  y = A x + ε,   ε ~ N(μ,Σ)

Page 24:

Multivariate Computations
• Linear transformations of Gaussians
  • If x ~ N(μ,Σ) and y = A x + b
  • Then y ~ N(Aμ + b, A Σ A^T)
• Consequence
  • If x ~ N(μx,Σx), y ~ N(μy,Σy) are independent, and z = x + y
  • Then z ~ N(μx + μy, Σx + Σy)
• Conditional of a Gaussian
  • If [x1,x2] ~ N([μ1; μ2], [Σ11, Σ12; Σ21, Σ22])
  • Then on observing x2 = z, we have
    x1 ~ N(μ1 + Σ12 Σ22^-1 (z - μ2), Σ11 - Σ12 Σ22^-1 Σ21)
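A small NumPy sketch of the first and last facts (linear transformation and conditioning), using an arbitrary 2-D Gaussian; all numbers are illustrative.

```python
import numpy as np

mu = np.array([1.0, 2.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])

# Linear transformation: if x ~ N(mu, Sigma) and y = A x + b, then y ~ N(A mu + b, A Sigma A^T)
A = np.array([[1.0, 0.5],
              [0.0, 1.0]])
b = np.array([0.1, -0.2])
mu_y = A @ mu + b
Sigma_y = A @ Sigma @ A.T

# Conditioning: with x = [x1, x2], observe x2 = z
z = 2.5
mu1_post = mu[0] + Sigma[0, 1] / Sigma[1, 1] * (z - mu[1])          # mu1 + Sigma12 Sigma22^-1 (z - mu2)
var1_post = Sigma[0, 0] - Sigma[0, 1] / Sigma[1, 1] * Sigma[1, 0]   # Sigma11 - Sigma12 Sigma22^-1 Sigma21

print(mu_y, Sigma_y)
print(mu1_post, var1_post)
```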

Page 25:

Presentation

Page 26:

Next time
• Principles Ch. 9
• Rekleitis (2004)