Multiple Hypothesis Tracking: Basic Idea
Sensor Data Fusion – Methods and Applications, 4th Lecture, May 2, 2018

Iterative updating of conditional probability densities!

kinematic target state x_k at time t_k, accumulated sensor data Z^k
a priori knowledge: target dynamics models, sensor model, other context

• prediction:    p(x_{k-1}|Z^{k-1})  --[dynamics model, context]-->        p(x_k|Z^{k-1})
• filtering:     p(x_k|Z^{k-1})      --[sensor data Z_k, sensor model]-->  p(x_k|Z^k)
• retrodiction:  p(x_{l-1}|Z^k)     <--[filtering output, dynamics model]--  p(x_l|Z^k)

– finite mixture: inherent ambiguity (data, model, road network, ...)
– optimal estimators: e.g. minimum mean squared error (MMSE)
– initiation of pdf iteration: multiple hypothesis track extraction

Kalman filter:  x_k = (r_k^T, ṙ_k^T)^T,   Z^k = {z_k, Z^{k-1}}

initiation:  p(x_0) = N(x_0; x_{0|0}, P_{0|0}),   initial ignorance: P_{0|0} 'large'

prediction:  N(x_{k-1}; x_{k-1|k-1}, P_{k-1|k-1})  --[dynamics model: F_{k|k-1}, D_{k|k-1}]-->  N(x_k; x_{k|k-1}, P_{k|k-1})

    x_{k|k-1} = F_{k|k-1} x_{k-1|k-1}
    P_{k|k-1} = F_{k|k-1} P_{k-1|k-1} F_{k|k-1}^T + D_{k|k-1}

filtering:  N(x_k; x_{k|k-1}, P_{k|k-1})  --[current measurement z_k; sensor model: H_k, R_k]-->  N(x_k; x_{k|k}, P_{k|k})

    x_{k|k} = x_{k|k-1} + W_{k|k-1} ν_{k|k-1},      ν_{k|k-1} = z_k - H_k x_{k|k-1}
    P_{k|k} = P_{k|k-1} - W_{k|k-1} S_{k|k-1} W_{k|k-1}^T,      S_{k|k-1} = H_k P_{k|k-1} H_k^T + R_k
    W_{k|k-1} = P_{k|k-1} H_k^T S_{k|k-1}^{-1}      'KALMAN gain matrix'
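A minimal sketch of one prediction–filtering cycle with exactly these formulas (NumPy; the concrete F, D, H, R and the measurement below are illustrative assumptions, not part of the lecture):

    import numpy as np

    def kalman_predict(x, P, F, D):
        # x_{k|k-1} = F x_{k-1|k-1},  P_{k|k-1} = F P_{k-1|k-1} F^T + D
        return F @ x, F @ P @ F.T + D

    def kalman_filter_step(x_pred, P_pred, z, H, R):
        # innovation nu, innovation covariance S, Kalman gain W
        nu = z - H @ x_pred
        S = H @ P_pred @ H.T + R
        W = P_pred @ H.T @ np.linalg.inv(S)
        return x_pred + W @ nu, P_pred - W @ S @ W.T

    # illustration: 1D position/velocity state, position-only measurement
    dt = 1.0
    F = np.array([[1.0, dt], [0.0, 1.0]])
    D = 0.01 * np.array([[dt**4 / 4, dt**3 / 2], [dt**3 / 2, dt**2]])
    H = np.array([[1.0, 0.0]])
    R = np.array([[0.5**2]])
    x, P = np.zeros(2), np.diag([100.0, 100.0])   # 'large' initial covariance
    x, P = kalman_predict(x, P, F, D)
    x, P = kalman_filter_step(x, P, np.array([1.2]), H, R)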

Stationary object position from noisy measurements

position: x ∈ R,  measurements: z_i with "likelihood": p(z_i|x) = N(z_i; x, σ_i)
each measurement can have its own measurement error σ_i (standard deviation)

Initial knowledge: p(x) = N(x; x_0, Σ_0),  Σ_0 ≫ σ_1, i.e. flat.

Impact of the first measurement: p(x|z_1) = N(x; z_1, σ_1)

p(x|z_1) = p(z_1|x) p(x) / ∫ dx p(z_1|x) p(x)                                         (Bayes)
         ∝ N(z_1; x, σ_1) N(x; x_0, Σ_0)                                               (up to normalization)
         ∝ exp{ -1/2 [ (x - x_0)^2/Σ_0^2 + (z_1 - x)^2/σ_1^2 ] }                        (ignore constants!)
         = exp{ -1/2 [ (x^2 - 2 x x_0 + x_0^2)/Σ_0^2 + (z_1^2 - 2 z_1 x + x^2)/σ_1^2 ] }
         ∝ exp{ -1/2 [ (x^2 - 2 x x_0)/Σ_0^2 + (-2 z_1 x + x^2)/σ_1^2 ] }
         = exp{ -1/2 [ x^2 (1/Σ_0^2 + 1/σ_1^2) - 2 x (x_0/Σ_0^2 + z_1/σ_1^2) ] }
         = exp{ -1/2 [ x^2/Σ_1^2 - 2 x (x_0/Σ_0^2 + z_1/σ_1^2) ] },        with 1/Σ_1^2 = 1/Σ_0^2 + 1/σ_1^2
         = exp{ -1/2 [ x^2 - 2 x (x_0/Σ_0^2 + z_1/σ_1^2) Σ_1^2 ] / Σ_1^2 }
         = exp{ -1/2 [ x^2 - 2 x x_1 ] / Σ_1^2 },        with x_1 = (x_0/Σ_0^2 + z_1/σ_1^2) Σ_1^2
         = exp{ -1/2 [ x^2 - 2 x x_1 + x_1^2 - x_1^2 ] / Σ_1^2 }                        (add zero!)
         ∝ exp{ -1/2 (x - x_1)^2 / Σ_1^2 }
         ∝ N(x; x_1, Σ_1)

with:  Σ_1 = sqrt[ 1 / (1/Σ_0^2 + 1/σ_1^2) ]  →  σ_1   (since Σ_0 ≫ σ_1)
       x_1 = Σ_1^2 (x_0/Σ_0^2 + z_1/σ_1^2)  →  z_1     (since Σ_0 ≫ σ_1)
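A quick numeric sketch of this result (the prior and measurement values are made up for illustration): fusing the nearly flat prior N(x; x_0, Σ_0) with the likelihood N(z_1; x, σ_1) reproduces the posterior mean and standard deviation given by the formulas above.

    import numpy as np

    x0, Sigma0 = 0.0, 100.0      # nearly flat prior (illustrative values)
    z1, sigma1 = 3.2, 0.5        # first measurement and its standard deviation

    Sigma1 = np.sqrt(1.0 / (1.0 / Sigma0**2 + 1.0 / sigma1**2))
    x1 = Sigma1**2 * (x0 / Sigma0**2 + z1 / sigma1**2)
    print(x1, Sigma1)            # ~3.2 and ~0.5: the flat prior barely matters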

Stationary object position from noisy measurements (continued)

Impact of k measurements: p(x|z_k, ..., z_1) = N(x; x_k, Σ_k)

with:  1/Σ_k^2 = sum_{i=1}^{k} 1/σ_i^2   (harmonic mean),      x_k = Σ_k^2 · sum_{i=1}^{k} z_i/σ_i^2

p(x|z_k, ..., z_1) ∝ p(z_k|x) p(x|z_{k-1}, ..., z_1)                                      (Bayes)
                   ∝ N(z_k; x, σ_k) N(x; x_{k-1}, Σ_{k-1})                                 (induction assumption)
                   ∝ exp{ -1/2 [ (x - x_{k-1})^2/Σ_{k-1}^2 + (z_k - x)^2/σ_k^2 ] }
                   ∝ exp{ -1/2 [ (x^2 - 2 x x_{k-1})/Σ_{k-1}^2 + (-2 z_k x + x^2)/σ_k^2 ] }
                   = exp{ -1/2 [ x^2 (1/Σ_{k-1}^2 + 1/σ_k^2) - 2 x (x_{k-1}/Σ_{k-1}^2 + z_k/σ_k^2) ] }
                   = exp{ -1/2 [ x^2/Σ_k^2 - 2 x (x_{k-1}/Σ_{k-1}^2 + z_k/σ_k^2) ] },     with Σ_k^{-2} = Σ_{k-1}^{-2} + σ_k^{-2}
                   = exp{ -1/2 [ x^2 - 2 x x_k ] / Σ_k^2 },     with x_k = (x_{k-1}/Σ_{k-1}^2 + z_k/σ_k^2) Σ_k^2
                   = exp{ -1/2 [ x^2 - 2 x x_k + x_k^2 - x_k^2 ] / Σ_k^2 }                 (add zero!)
                   ∝ exp{ -1/2 (x - x_k)^2 / Σ_k^2 }
                   ∝ N(x; x_k, Σ_k)

with:  Σ_k = 1 / sqrt[ sum_{i=1}^{k} 1/σ_i^2 ]   (harmonic mean)
       x_k = Σ_k^2 · sum_{i=1}^{k} z_i/σ_i^2     (weighted arithmetic mean)
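A sketch checking that this recursion and the closed-form (batch) result agree; the measurements and standard deviations below are made-up illustrative values.

    import numpy as np

    z = np.array([2.9, 3.4, 3.1, 2.8])         # measurements (illustrative)
    sigma = np.array([0.5, 1.0, 0.3, 0.8])     # individual measurement errors

    # recursion: Sigma_k^{-2} = Sigma_{k-1}^{-2} + sigma_k^{-2},
    #            x_k = (x_{k-1}/Sigma_{k-1}^2 + z_k/sigma_k^2) * Sigma_k^2
    x_rec, Sigma_rec = z[0], sigma[0]          # initiation with the first measurement
    for zk, sk in zip(z[1:], sigma[1:]):
        Sigma_new = 1.0 / np.sqrt(1.0 / Sigma_rec**2 + 1.0 / sk**2)
        x_rec = (x_rec / Sigma_rec**2 + zk / sk**2) * Sigma_new**2
        Sigma_rec = Sigma_new

    # batch: harmonic mean of the variances, weighted arithmetic mean of the measurements
    Sigma_batch = 1.0 / np.sqrt(np.sum(1.0 / sigma**2))
    x_batch = Sigma_batch**2 * np.sum(z / sigma**2)

    assert np.isclose(x_rec, x_batch) and np.isclose(Sigma_rec, Sigma_batch)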

Stationary object position from noisy measurements (summary)

For equal measurement errors, σ_i = σ for all i:

Σ_k = σ / sqrt(k)   ("square-root law"),      x_k = (1/k) · sum_{i=1}^{k} z_i   (arithmetic mean)

The Multivariate GAUSSian Pdf

– wanted: probabilities 'concentrated' around a center x̄

– quadratic distance: q(x) = 1/2 (x - x̄)^T P^{-1} (x - x̄)

  q(x) defines an ellipsoid around x̄, its volume and orientation being determined by a matrix P
  (symmetric: P^T = P, positive definite: all eigenvalues > 0).

– first attempt: p(x) = e^{-q(x)} / ∫ dx e^{-q(x)}   (normalized!)

  p(x) = N(x; x̄, P) = 1/sqrt(|2πP|) · e^{ -1/2 (x - x̄)^T P^{-1} (x - x̄) }

Exercise 4.1  Show:  ∫ dx e^{-q(x)} = sqrt(|2πP|),   E[x] = x̄,   E[(x - x̄)(x - x̄)^T] = P   (covariance)
Trick: symmetric, positive definite matrices can be diagonalized by an orthogonal coordinate transform.

– GAUSSian Mixtures: p(x) = sum_i p_i N(x; x̄_i, P_i)   (weighted sums)
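A small sketch evaluating this density and a two-component mixture numerically; the means, covariances, and weights are illustrative assumptions.

    import numpy as np

    def gaussian_pdf(x, mean, P):
        # N(x; mean, P) = exp(-1/2 (x-mean)^T P^{-1} (x-mean)) / sqrt(|2 pi P|)
        d = x - mean
        return np.exp(-0.5 * d @ np.linalg.solve(P, d)) / np.sqrt(np.linalg.det(2.0 * np.pi * P))

    # Gaussian mixture p(x) = sum_i p_i N(x; mean_i, P_i)
    means = [np.array([0.0, 0.0]), np.array([3.0, 1.0])]
    covs = [np.eye(2), np.array([[2.0, 0.5], [0.5, 1.0]])]
    weights = [0.6, 0.4]
    x = np.array([1.0, 0.5])
    p_x = sum(w * gaussian_pdf(x, m, P) for w, m, P in zip(weights, means, covs))
    print(p_x)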

A Useful Product Formula for GAUSSians

N(z; Hx, R) · N(x; y, P) = N(z; Hy, S) · N(x; y + Wν, P - W S W^T)
                           (the first factor is independent of x)
                         = N(z; Hy, S) · N(x; Q (P^{-1} y + H^T R^{-1} z), Q)

with:  ν = z - Hy,   S = H P H^T + R,   W = P H^T S^{-1},   Q^{-1} = P^{-1} + H^T R^{-1} H.

Sketch of the proof:
• Interpret N(z; Hx, R) N(x; y, P) as a joint pdf p(z|x) p(x) = p(z, x).
• Show that p(z, x) is a GAUSSian:  p(z, x) = N( (z, x)^T ; (Hy, y)^T , [ S  H P ; P H^T  P ] ).
• Calculate from p(z, x) the marginal and conditional pdfs p(z) and p(x|z).
• From p(z, x) = p(z|x) p(x) = p(x|z) p(z) we obtain the result.
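A numeric sketch checking that the two versions of the posterior factor coincide (H, R, P, y, z below are arbitrary illustrative values):

    import numpy as np

    H = np.array([[1.0, 0.0]])                 # 1x2 'sensor matrix' (illustration)
    R = np.array([[0.25]])
    P = np.array([[1.0, 0.3], [0.3, 2.0]])
    y = np.array([1.0, -0.5])
    z = np.array([1.4])

    # version 1: gain form  N(x; y + W nu, P - W S W^T)
    S = H @ P @ H.T + R
    W = P @ H.T @ np.linalg.inv(S)
    mean1, cov1 = y + W @ (z - H @ y), P - W @ S @ W.T

    # version 2: information form  N(x; Q (P^{-1} y + H^T R^{-1} z), Q),  Q^{-1} = P^{-1} + H^T R^{-1} H
    Q = np.linalg.inv(np.linalg.inv(P) + H.T @ np.linalg.inv(R) @ H)
    mean2 = Q @ (np.linalg.solve(P, y) + H.T @ np.linalg.inv(R) @ z)

    assert np.allclose(mean1, mean2) and np.allclose(cov1, Q)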

Affine Transforms of GAUSSian Random Variables

N(x; E[x], C[x])  --[y = t + Tx]-->  N(y; t + T E[x], T C[x] T^T)

p(y) = ∫ dx p(x, y) = ∫ dx p(y|x) p(x) = ∫ dx δ(y - t - Tx) p(x)

A possible representation: δ(x - y) = N(x; y, D) with D → O!

p(y) = ∫ dx N(y; t + Tx, D) N(x; E[x], C[x])   for D → 0
     = N(y; t + T E[x], T C[x] T^T + D)        for D → 0;  product formula!

Also true if dim(x) ≠ dim(y)!
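A quick Monte-Carlo sketch of this statement; t, T, and the input Gaussian are illustrative choices (note that dim(y) ≠ dim(x)).

    import numpy as np

    rng = np.random.default_rng(1)
    mean_x = np.array([1.0, 2.0])
    C_x = np.array([[1.0, 0.4], [0.4, 0.5]])
    t = np.array([0.5, -1.0, 2.0])
    T = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])   # maps R^2 -> R^3

    x = rng.multivariate_normal(mean_x, C_x, size=200_000)
    y = t + x @ T.T
    # empirical mean/covariance approach t + T E[x] and T C[x] T^T
    print(y.mean(axis=0), t + T @ mean_x)
    print(np.cov(y.T), T @ C_x @ T.T)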

A popular model for object evolutions
Piecewise Constant White Acceleration Model

Consider state vectors: x_k = (r_k^T, ṙ_k^T)^T   (position, velocity)

For known x_{k-1} and without external influences we have, with ΔT_k = t_k - t_{k-1}:

x_k = [ I  ΔT_k I ; O  I ] (r_{k-1}, ṙ_{k-1})^T =: F_{k|k-1} x_{k-1},   see blackboard!

Assume during the interval ΔT_k a constant acceleration a_k causing the state evolution:

[ 1/2 ΔT_k^2 I ; ΔT_k I ] a_k =: G_k a_k,   a linear transform!

Let a_k be a Gaussian RV with pdf p(a_k) = N(a_k; o, Σ_k^2 I); we therefore have:

p(G_k a_k) = N(G_k a_k; o, Σ_k^2 G_k G_k^T).

Therefore:  p(x_k | x_{k-1}) = N(x_k; F_{k|k-1} x_{k-1}, D_{k|k-1})   with

F_{k|k-1} = [ I  ΔT_k I ; O  I ],      D_{k|k-1} = Σ_k^2 [ 1/4 ΔT_k^4 I   1/2 ΔT_k^3 I ; 1/2 ΔT_k^3 I   ΔT_k^2 I ]

Exercise 4.2  Consider x_k = (r_k^T, ṙ_k^T, r̈_k^T)^T (position, velocity, acceleration).
Show that F_{k|k-1} and D_{k|k-1} = Σ_k^2 G_k G_k^T (constant acceleration rates) are given by:

F_{k|k-1} = [ I  ΔT_k I  1/2 ΔT_k^2 I ; O  I  ΔT_k I ; O  O  I ],

D_{k|k-1} = Σ_k^2 [ 1/4 ΔT_k^4 I   1/2 ΔT_k^3 I   1/2 ΔT_k^2 I ; 1/2 ΔT_k^3 I   ΔT_k^2 I   ΔT_k I ; 1/2 ΔT_k^2 I   ΔT_k I   I ]

with ΔT_k = t_k - t_{k-1}.   Reasonable choice: 1/2 q_max ≤ Σ_k ≤ q_max.
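A sketch that builds F_{k|k-1} and D_{k|k-1} for the position/velocity version of this model (ΔT and Σ below are illustrative values):

    import numpy as np

    def white_acceleration_model(dT, Sigma, dim=2):
        # F = [I, dT I; O, I],  D = Sigma^2 G G^T  with  G = [dT^2/2 I; dT I]
        I = np.eye(dim)
        O = np.zeros((dim, dim))
        F = np.block([[I, dT * I], [O, I]])
        G = np.vstack([0.5 * dT**2 * I, dT * I])
        D = Sigma**2 * (G @ G.T)
        return F, D

    F, D = white_acceleration_model(dT=1.0, Sigma=0.5)   # Sigma chosen for illustration only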

Sensor Fusion: Gain in Localization Accuracy

If a stationary target is observed by N sensors, we naïvely expect an improvement in accuracy ∝ 1/sqrt(N).

A closer look: the error of each measurement z_i is described by a related measurement error covariance matrix R_i ('error ellipsoids').
[Figure: error ellipses of three sensors s1, s2, s3 in 2 dimensions.]

R_i can strongly depend on the underlying sensor-to-target geometry!

Simplified: Range, Azimuth Measurements

• measurements in polar coordinates:
  z_k = (r_k, φ_k)^T,   measurement error:  R = [ σ_r^2  0 ; 0  σ_φ^2 ],   r, φ independent

  Likelihood function in polar coordinates:  p(z_k | x_k) = N(z_k; x_k^p, R^p)

• What is the likelihood function in Cartesian coordinates?   t[z_k] = r_k (cos φ_k, sin φ_k)^T

• in Cartesian coordinates: expand around r_{k|k-1} = (r_{k|k-1}, φ_{k|k-1})^T:

  t[z_k] = r_k (cos φ_k, sin φ_k)^T ≈ t[r_{k|k-1}] + T (z_k - r_{k|k-1})
  (constant and linear term of a Taylor series only, blackboard!)

  T = ∂t[r_{k|k-1}] / ∂r_{k|k-1}
    = [ cos φ_{k|k-1}   -r_{k|k-1} sin φ_{k|k-1} ; sin φ_{k|k-1}   r_{k|k-1} cos φ_{k|k-1} ]
    = [ cos φ  -sin φ ; sin φ  cos φ ] · [ 1  0 ; 0  r ]        (rotation D_φ · dilation S_r)

• affine transform of GAUSSian random variables:
  N(z; x̄, R)  --[z' = t + Tz]-->  N(z'; t + T x̄, T R T^T)

• Cartesian error covariance (time dependent):
  T R T^T = D_φ S_r R S_r D_φ^T = D_φ [ σ_r^2  0 ; 0  (r σ_φ)^2 ] D_φ^T

• sensor fusion: the sensor-to-target geometry enters into T R T^T
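A sketch converting a range/azimuth measurement and its error covariance into Cartesian coordinates with exactly this linearization (the numeric values are illustrative):

    import numpy as np

    def polar_to_cartesian(r, phi, sigma_r, sigma_phi):
        # t[z] = r (cos phi, sin phi)^T;   T = D_phi S_r;   Cartesian covariance T R T^T
        z_cart = np.array([r * np.cos(phi), r * np.sin(phi)])
        D_phi = np.array([[np.cos(phi), -np.sin(phi)], [np.sin(phi), np.cos(phi)]])
        S_r = np.diag([1.0, r])
        T = D_phi @ S_r
        R_polar = np.diag([sigma_r**2, sigma_phi**2])
        R_cart = T @ R_polar @ T.T        # = D_phi diag(sigma_r^2, (r sigma_phi)^2) D_phi^T
        return z_cart, R_cart

    z_cart, R_cart = polar_to_cartesian(r=1000.0, phi=np.deg2rad(30.0),
                                        sigma_r=10.0, sigma_phi=np.deg2rad(1.0))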

[Figure: Cartesian error ellipses T R T^T for three sensors s1, s2, s3 observing the same target.]

sensor fusion: the sensor-to-target geometry enters into T R T^T

Filtering step: alternative formulation

p(x_k | Z^k) = p(x_k | z_k, Z^{k-1})                                                        (current measurement)

             = p(z_k | x_k) p(x_k | Z^{k-1}) / ∫ dx_k p(z_k | x_k) p(x_k | Z^{k-1})          (BAYES' rule)

             = N(z_k; H_k x_k, R_k) N(x_k; x_{k|k-1}, P_{k|k-1}) / ∫ dx_k N(z_k; H_k x_k, R_k) N(x_k; x_{k|k-1}, P_{k|k-1})
               (likelihood function × prediction for t_k)

             = N(x_k; x_{k|k}, P_{k|k})                                                     (product formula: 2nd version!)

x_{k|k} = P_{k|k} ( P_{k|k-1}^{-1} x_{k|k-1} + H_k^T R_k^{-1} z_k )
P_{k|k}^{-1} = P_{k|k-1}^{-1} + H_k^T R_k^{-1} H_k

Inverse covariance matrices are called information matrices.
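A sketch of this information-form filtering update, reusing the names from the formulas above; for H = I and a static object it reduces to the accumulation of R_i^{-1} on the following slides.

    import numpy as np

    def information_filter_step(x_pred, P_pred, z, H, R):
        # P_{k|k}^{-1} = P_{k|k-1}^{-1} + H^T R^{-1} H
        # x_{k|k}      = P_{k|k} (P_{k|k-1}^{-1} x_{k|k-1} + H^T R^{-1} z)
        P_pred_inv = np.linalg.inv(P_pred)
        R_inv = np.linalg.inv(R)
        P_filt = np.linalg.inv(P_pred_inv + H.T @ R_inv @ H)
        x_filt = P_filt @ (P_pred_inv @ x_pred + H.T @ R_inv @ z)
        return x_filt, P_filt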

[Figure: two sensors S1 and S2 separated by R, both at range r from target X, with bearings φ and π - φ (cross-bearing geometry).]

Special case: stationary object
Example: different sensors

F = I,   D = O,   H = I,   R_k time dependent!

Initiation:  x_{1|1} = z_1,   P_{1|1} = R_1

Prediction:  x_{k|k-1} = F_{k|k-1} x_{k-1|k-1} = x_{k-1|k-1},      P_{k|k-1} = F_{k|k-1} P_{k-1|k-1} F_{k|k-1}^T + D_{k|k-1} = P_{k-1|k-1}

Filtering (2nd formulation):
  x_{k|k} = P_{k|k} ( P_{k-1|k-1}^{-1} x_{k-1|k-1} + R_k^{-1} z_k ) = P_{k|k} · sum_{i=1}^{k} R_i^{-1} z_i
  P_{k|k}^{-1} = P_{k-1|k-1}^{-1} + R_k^{-1} = sum_{i=1}^{k} R_i^{-1}

so that:  x_{k|k} = P_{k|k} · sum_{i=1}^{k} R_i^{-1} z_i,      P_{k|k} = ( sum_{i=1}^{k} R_i^{-1} )^{-1}

Kalman filter! weighted, recursive, arithmetic mean
estimation error covariance matrix: harmonic mean of measurement error matrices!

Discussion: stationary objects

• If all measurement error covariances R_i, i = 1, ..., k, are identical, we observe the statistical "square-root effect": P_{k|k} = R/k.

• If the corresponding error ellipses are significantly different in their geometric extension, we can observe a much larger effect.

• statistical "intersection" of error ellipses: harmonic mean!

• In the limiting case of very eccentric error ellipses, we obtain triangulation of a position from bearings (→ multiple sensor data fusion!).

• These considerations are also valid for 3D and more abstract measurements; the corresponding intersections are not intuitively clear.