Lecture 8, Linköping University: users.isy.liu.se/rt/fredrik/edu/sensorfusion/lecture8.pdf


Sensor Fusion, 2014 Lecture 8: 1

Lecture 8:

Whiteboard:
- SLAM problem formulation.
- Framework for EKF-SLAM and fastSLAM (with PF and MPF).

Slides:
- Algorithms.
- Properties.
- Examples and illustrations.


Lecture 7: Summary

General SIS PF algorithm. Choose the number of particles $N$, a proposal density $q(x^i_k \mid x^i_{0:k-1}, y_{1:k})$, and a threshold $N_{th}$ (for instance $N_{th} = 2N/3$).

Initialization: Generate $x^i_0 \sim p_{x_0}$, $i = 1, \dots, N$. Iterate for $k = 1, 2, \dots$:

1. Measurement update: For $i = 1, 2, \dots, N$,
$$w^i_k = w^i_{k-1} \, \frac{p(y_k \mid x^i_k)\, p(x^i_k \mid x^i_{k-1})}{q(x^i_k \mid x^i_{k-1}, y_k)}.$$
2. Normalize: $w^i_k := w^i_k / \sum_i w^i_k$.
3. Estimation: MAP ($\arg\max_i w^i_k$) or MMSE $\hat{x}_k \approx \sum_{i=1}^N w^i_k x^i_k$.
4. Resampling: Resample with replacement when $N_{eff} = 1 / \sum_i (w^i_k)^2 < N_{th}$.
5. Prediction: Generate samples $x^i_{k+1} \sim q(x_{k+1} \mid x^i_k, y_k)$.
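As a concrete sketch, the five steps above can be run on a scalar random-walk toy model with the bootstrap proposal (proposal equals the dynamics, so the weight update reduces to the likelihood). The model and noise levels here are illustrative assumptions, not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def sis_pf(y, N=1000, q_std=1.0, r_std=0.5):
    """Minimal bootstrap SIS PF for the illustrative scalar model
    x_{k+1} = x_k + v_k,  y_k = x_k + e_k."""
    Nth = 2 * N / 3                       # resampling threshold N_th = 2N/3
    x = rng.normal(0.0, 1.0, N)           # initialization: x_0^i ~ p_{x0}
    w = np.full(N, 1.0 / N)
    est = []
    for yk in y:
        # 1. Measurement update (bootstrap proposal: weight *= likelihood)
        w = w * np.exp(-0.5 * ((yk - x) / r_std) ** 2)
        # 2. Normalize
        w = w / w.sum()
        # 3. MMSE estimate
        est.append(np.sum(w * x))
        # 4. Resample with replacement when N_eff < N_th
        if 1.0 / np.sum(w ** 2) < Nth:
            x = rng.choice(x, size=N, p=w)
            w = np.full(N, 1.0 / N)
        # 5. Prediction: draw from the proposal (here the dynamics)
        x = x + rng.normal(0.0, q_std, N)
    return np.array(est)
```

With a constant measurement sequence the estimate settles near the measured value, which is a quick sanity check of the weighting and resampling logic.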


Problem formulations

- Localization concerns the estimation of pose from known landmarks.
- Navigation concerns the estimation of pose, velocities and other states from known landmarks.
- Mapping concerns the estimation of landmark positions from known values of pose.
- SLAM concerns the joint estimation of pose and landmark positions.
- Variations on the same theme: simultaneous navigation and mapping (SNAM?!) and simultaneous tracking and mapping (STAM?!).


EKF SLAM: model

Assume linear(-ized) model

$$x_{k+1} = A x_k + B v_k, \quad \operatorname{Cov}(v_k) = Q,$$
$$m_{k+1} = m_k,$$
$$y_k = C^x_k x_k + C^m_k(c^{1:I_k}_k) m_k + e_k, \quad \operatorname{Cov}(e_k) = R.$$

Association indices $c^{1:I_k}_k$ relate an observed landmark $i$ to a map landmark $j_i$, which affects the measurement model. Association is crucial for some sensors (laser, radar, etc.), but less of a problem in some applications (camera using image features, microphones using designed pings).

State and its covariance matrix:

$$\hat{z}_{k|k} = \begin{pmatrix} \hat{x}_{k|k} \\ \hat{m}_{k|k} \end{pmatrix}, \quad
P_{k|k} = \begin{pmatrix} P^{xx}_{k|k} & P^{xm}_{k|k} \\ P^{mx}_{k|k} & P^{mm}_{k|k} \end{pmatrix}.$$
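The joint SLAM state is simply the pose stacked on top of all landmark coordinates. A small layout sketch, with illustrative dimensions (3 pose states, M two-dimensional landmarks):

```python
import numpy as np

n, M = 3, 5                    # pose states, number of 2D landmarks
nm = 2 * M                     # total landmark state dimension

x = np.zeros(n)                # pose (X, Y, psi)
m = np.zeros(nm)               # stacked landmark positions
z = np.concatenate([x, m])     # joint SLAM state z = [x; m]

P = np.eye(n + nm)             # joint covariance, block partitioned:
Pxx = P[:n, :n]                # n  x n   pose covariance (small)
Pxm = P[:n, n:]                # n  x nm  cross covariance (thin)
Pmm = P[n:, n:]                # nm x nm  map covariance (large)
```

The block views make the structure of the later time and measurement updates explicit: the map block is by far the largest.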


Original Application

- Assume a ground robot with three states: $x = (X, Y, \psi)^T$.
- The robot measures speed and turn rate: $u = (v, \dot{\psi})^T$.
- Simple dynamics.
- Sensor:
  1. Distance sensor (sonar, laser scanner, radar) measures distance to obstacles (walls, furniture). Tens to hundreds of landmarks.
  2. Vision (camera, Kinect, stereo camera) provides detections (corners, markers, patterns) as potential landmarks. Thousands or tens of thousands of landmarks.
- $P^{xx}_{k|k}$ is a small matrix, $P^{mx}_{k|k}$ a thin matrix and $P^{mm}_{k|k}$ a large matrix.

Both the EKF and the PF apply to the problem, but how should the large dimensions be handled in the best way? Start by studying the basic EKF.


EKF SLAM: basic KF steps

Time update:

$$\hat{z}_{k|k-1} = \begin{pmatrix} A & 0 \\ 0 & I \end{pmatrix} \hat{z}_{k-1|k-1},$$

$$P_{k|k-1} = \begin{pmatrix} A_k P^{xx}_{k-1|k-1} A^T_k + B_k Q_k B^T_k & A_k P^{xm}_{k-1|k-1} \\ P^{mx}_{k-1|k-1} A^T_k & P^{mm}_{k-1|k-1} \end{pmatrix}.$$

Measurement update:

$$S_k = C^x_k P^{xx}_{k|k-1} C^{xT}_k + C^m_k P^{mm}_{k|k-1} C^{mT}_k + C^m_k P^{mx}_{k|k-1} C^{xT}_k + C^x_k P^{xm}_{k|k-1} C^{mT}_k + R_k,$$
$$K^x_k = \left( P^{xx}_{k|k-1} C^{xT}_k + P^{xm}_{k|k-1} C^{mT}_k \right) S^{-1}_k,$$
$$K^m_k = \left( P^{mx}_{k|k-1} C^{xT}_k + P^{mm}_{k|k-1} C^{mT}_k \right) S^{-1}_k,$$
$$\varepsilon_k = y_k - C^x_k \hat{x}_{k|k-1} - C^m_k \hat{m}_{k|k-1},$$
$$\hat{z}_{k|k} = \hat{z}_{k|k-1} + \begin{pmatrix} K^x_k \\ K^m_k \end{pmatrix} \varepsilon_k,$$
$$P_{k|k} = P_{k|k-1} - \begin{pmatrix} K^x_k \\ K^m_k \end{pmatrix} S_k \begin{pmatrix} K^{xT}_k & K^{mT}_k \end{pmatrix}.$$
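The two updates above can be written compactly on the joint state without splitting the gain into blocks. A minimal numpy sketch, assuming the linearized matrices A, B, Cx, Cm, Q, R are already evaluated at the current estimate (all dimensions are placeholders):

```python
import numpy as np

def ekf_slam_step(z, P, y, A, B, Cx, Cm, Q, R):
    """One EKF-SLAM time + measurement update on z = [pose; map]."""
    n = A.shape[0]                     # pose dimension
    nm = z.shape[0] - n                # map dimension
    # Time update: landmarks are static, only the pose block moves
    F = np.block([[A, np.zeros((n, nm))],
                  [np.zeros((nm, n)), np.eye(nm)]])
    G = np.vstack([B, np.zeros((nm, B.shape[1]))])
    z = F @ z
    P = F @ P @ F.T + G @ Q @ G.T
    # Measurement update on the full joint state
    C = np.hstack([Cx, Cm])            # [C^x  C^m]
    S = C @ P @ C.T + R
    K = P @ C.T @ np.linalg.inv(S)     # stacked gain (K^x; K^m)
    z = z + K @ (y - C @ z)
    P = P - K @ S @ K.T
    return z, P
```

Note that `P - K @ S @ K.T` touches every element of the map block of P, which is exactly the scaling problem discussed on the next slide.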


EKF SLAM: KF problems

- All elements in $P^{mm}_{k|k}$ are affected by the measurement update.
- It turns out that the cross correlations are essential for performance.
- There is no simple way around this.


EKF SLAM: information form

Focus on sufficient statistics and the information matrix:

$$\iota_{k|l} = \mathcal{I}_{k|l}\, \hat{z}_{k|l},$$
$$\mathcal{I}_{k|l} = P^{-1}_{k|l} = \begin{pmatrix} P^{xx}_{k|l} & P^{xm}_{k|l} \\ P^{mx}_{k|l} & P^{mm}_{k|l} \end{pmatrix}^{-1} = \begin{pmatrix} \mathcal{I}^{xx}_{k|l} & \mathcal{I}^{xm}_{k|l} \\ \mathcal{I}^{mx}_{k|l} & \mathcal{I}^{mm}_{k|l} \end{pmatrix}.$$

The measurement update is trivial:

$$\iota_{k|k} = \iota_{k|k-1} + C^T_k R^{-1}_k y_k,$$
$$\mathcal{I}_{k|k} = \mathcal{I}_{k|k-1} + C^T_k R^{-1}_k C_k.$$

Note the sparse update!
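Why the update is sparse: $C_k$ has nonzero columns only for the pose and the currently observed landmarks, so $C^T_k R^{-1}_k C_k$ touches only those rows and columns of the information matrix. A sketch with hypothetical dimensions and a stand-in Jacobian:

```python
import numpy as np

n, M = 3, 100                  # pose dim, number of 2D landmarks
dim = n + 2 * M
info = np.zeros((dim, dim))    # information matrix I_{k|k-1} (toy: zero prior)

# One observed landmark j: C is nonzero only in the pose columns
# and in landmark j's two columns.
j = 42
C = np.zeros((2, dim))
C[:, :n] = np.ones((2, n))                 # stand-in for the pose Jacobian
C[:, n + 2*j : n + 2*j + 2] = np.eye(2)    # landmark j block
Rinv = np.linalg.inv(0.1 * np.eye(2))

info += C.T @ Rinv @ C         # low-rank, sparse correction

touched = np.nonzero(np.any(info != 0, axis=0))[0]
print(len(touched), "of", dim, "rows/columns touched")
```

Only 5 of the 203 rows/columns are affected here: the 3 pose states and the 2 coordinates of the observed landmark.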


EKF SLAM: information filter algorithm (1/4)

Initialization:

$$\iota^x_{1|0} = 0_{3 \times 1}, \quad \iota^m_{1|0} = 0_{0 \times 1},$$
$$\mathcal{I}^{xx}_{1|0} = 0_{3 \times 3}, \quad \mathcal{I}^{mx}_{1|0} = 0_{0 \times 3}, \quad \mathcal{I}^{mm}_{1|0} = 0_{0 \times 0}.$$

Note the advantage of representing no prior knowledge with zero information (infinite covariance).


EKF SLAM: information filter algorithm (2/4)

1. Associate a map landmark $j = c^i_k$ to each observed landmark $i$, and construct the matrix $C^m_k$. This step includes data gating for outlier rejection and track handling to start and end landmark tracks.

2. Measurement update:

$$\iota^x_{k|k} = \iota^x_{k|k-1} + C^{xT}_k R^{-1}_k y_k,$$
$$\iota^m_{k|k} = \iota^m_{k|k-1} + C^{mT}_k R^{-1}_k y_k,$$
$$\mathcal{I}^{xx}_{k|k} = \mathcal{I}^{xx}_{k|k-1} + C^{xT}_k R^{-1}_k C^x_k,$$
$$\mathcal{I}^{xm}_{k|k} = \mathcal{I}^{xm}_{k|k-1} + C^{xT}_k R^{-1}_k C^m_k,$$
$$\mathcal{I}^{mm}_{k|k} = \mathcal{I}^{mm}_{k|k-1} + C^{mT}_k R^{-1}_k C^m_k.$$

Note that $C^m_k$ is very wide, but contains mostly zeros. All corrections are low-rank and sparse, influencing only a fraction of the matrix elements.


EKF SLAM: information filter algorithm (3/4)

3. Time update (the first two assignments are intermediate values that are overwritten by the subsequent corrections):

$$\mathcal{I}^{xx}_{k|k-1} = A^{-T}_k \mathcal{I}^{xx}_{k-1|k-1} A^{-1}_k,$$
$$\mathcal{I}^{xm}_{k|k-1} = A^{-T}_k \mathcal{I}^{xm}_{k-1|k-1},$$
$$M_k = B_k \left( B^T_k \mathcal{I}^{xx}_{k|k-1} B_k + Q^{-1}_k \right)^{-1} B^T_k,$$
$$\mathcal{I}^{xx}_{k|k-1} := \mathcal{I}^{xx}_{k|k-1} - \mathcal{I}^{xx}_{k|k-1} M_k \mathcal{I}^{xx}_{k|k-1},$$
$$\mathcal{I}^{xm}_{k|k-1} := \mathcal{I}^{xm}_{k|k-1} - \mathcal{I}^{xx}_{k|k-1} M_k \mathcal{I}^{xm}_{k|k-1},$$
$$\mathcal{I}^{mm}_{k|k-1} = \mathcal{I}^{mm}_{k-1|k-1} - \mathcal{I}^{mx}_{k|k-1} M_k \mathcal{I}^{xm}_{k|k-1},$$
$$\iota^x_{k|k-1} = \left( I - \mathcal{I}^{xx}_{k|k-1} B_k Q_k B^T_k \right) A^{-T}_k \iota^x_{k-1|k-1},$$
$$\iota^m_{k|k-1} = \iota^m_{k-1|k-1} - \mathcal{I}^{mx}_{k|k-1} B_k Q_k B^T_k A^{-T}_k \iota^x_{k-1|k-1}.$$

Now $\mathcal{I}^{mm}_{k|k-1}$ is corrected with the inner product of $\mathcal{I}^{xm}_{k|k-1}$, which gives a full matrix. Many of the elements in $\mathcal{I}^{xm}_{k|k-1}$ are close to zero and may be truncated!


EKF SLAM: information filter algorithm (4/4)

4. Estimation:

$$P_{k|k} = \mathcal{I}^{-1}_{k|k},$$
$$\hat{x}_{k|k} = P^{xx}_{k|k} \iota^x_{k|k} + P^{xm}_{k|k} \iota^m_{k|k},$$
$$\hat{m}_{k|k} = P^{mx}_{k|k} \iota^x_{k|k} + P^{mm}_{k|k} \iota^m_{k|k}.$$

Here is another catch: the information matrix needs to be inverted! The pose is needed at all times for linearization and data gating. How to proceed?

Idea: solve

$$\iota_{k|l} = \mathcal{I}_{k|l}\, \hat{z}_{k|l}$$

directly using a gradient search algorithm initialized at the previous estimate.
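The linear system can be solved iteratively, for example with a few conjugate-gradient iterations warm-started at the previous estimate; this is one concrete instance of the gradient-search idea, not necessarily the one used in the lecture:

```python
import numpy as np

def solve_info(info, iota, z0, iters=10):
    """Solve info @ z = iota by conjugate gradients, warm-started at
    the previous estimate z0 (info symmetric positive definite)."""
    z = z0.copy()
    r = iota - info @ z            # initial residual
    p = r.copy()
    for _ in range(iters):
        Ap = info @ p
        alpha = (r @ r) / (p @ Ap)
        z = z + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < 1e-10:
            break                  # converged
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return z
```

Because the estimate changes little between time steps, the warm start typically means only a handful of iterations are needed, avoiding an explicit inverse of the large information matrix.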


EKF SLAM: summary

- EKF SLAM scales well in the state dimension.
- EKF SLAM scales badly in the landmark dimension, though natural approximations exist for the information form.
- EKF SLAM is not robust to incorrect associations.


FastSLAM: idea

Basic factorization idea:

$$p(x_{1:k}, m \mid y_{1:k}) = p(m \mid x_{1:k}, y_{1:k})\, p(x_{1:k} \mid y_{1:k}).$$

- The first factor corresponds to a classical mapping problem, and is solved by the (E)KF.
- The second factor is approximated by the PF.
- This leads to a MPF where each particle is a pose trajectory with an attached map corresponding to a mean and covariance for each landmark, but NO cross-correlations.
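One way to picture the resulting particle structure (all names here are hypothetical, chosen for illustration): each particle carries its own trajectory hypothesis and one small, independent Gaussian per landmark:

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class Landmark:
    mean: np.ndarray      # 2-vector landmark position estimate
    cov: np.ndarray       # 2x2 covariance; no cross terms to other
                          # landmarks or to the pose are stored

@dataclass
class Particle:
    trajectory: list = field(default_factory=list)   # pose history x_{1:k}
    landmarks: dict = field(default_factory=dict)    # landmark id -> Landmark
    weight: float = 1.0

p = Particle()
p.trajectory.append(np.zeros(3))                     # pose (X, Y, psi)
p.landmarks[7] = Landmark(np.array([1.0, 2.0]), 0.5 * np.eye(2))
```

The absence of cross-correlations is what makes the per-particle map update cheap: it decomposes into independent 2x2 (or 3x3) Kalman filters.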


FastSLAM: the mapping solution

Assume an observation model that is linear(-ized) in the landmark position:

$$0 = h^0(y^n_k, x_k) + h^1(y^n_k, x_k)\, m^{l_n} + e^n_k, \quad \operatorname{Cov}(e^n_k) = R^n_k.$$

This formulation covers:
- First order Taylor expansions.
- Bearing and range measurements, where $h^i(y^n_k, x_k)$ has two rows per landmark in 2D SLAM.
- Bearing-only measurements coming from a camera detection.


FastSLAM: the mapping solution

Linear estimation theory applies. WLS estimate:

$$\hat{m}^l = \Big( \underbrace{\sum_{k=1}^N h^{1T}(y^{n_l}_k, x_k)\, (R^{n_l}_k)^{-1}\, h^1(y^{n_l}_k, x_k)}_{\mathcal{I}^l_N} \Big)^{-1} \underbrace{\sum_{k=1}^N -h^{1T}(y^{n_l}_k, x_k)\, (R^{n_l}_k)^{-1}\, h^0(y^{n_l}_k, x_k)}_{\iota^l_N}.$$

In this sum, $h^i$ is an empty matrix if the map landmark $n$ does not get an associated observation landmark at time $k$. Under a Gaussian noise assumption, the posterior distribution is Gaussian:

$$\left( m^l \mid y_{1:N}, x_{1:N} \right) \in \mathcal{N}\left( (\mathcal{I}^l_N)^{-1} \iota^l_N,\; (\mathcal{I}^l_N)^{-1} \right).$$


FastSLAM: the mapping solution

Kalman filter for mapping on information form:

$$\iota^l_k = \iota^l_{k-1} - h^{1T}(y^{n_l}_k, x_k)\, R^{-1}_k\, h^0(y^{n_l}_k, x_k),$$
$$\mathcal{I}^l_k = \mathcal{I}^l_{k-1} + h^{1T}(y^{n_l}_k, x_k)\, R^{-1}_k\, h^1(y^{n_l}_k, x_k),$$
$$\hat{m}^l = (\mathcal{I}^l_k)^{-1} \iota^l_k.$$

Note the problem of inverting a large matrix to compute the landmark positions.

Likelihood in the Gaussian case:

$$p(y^{n_l}_k \mid y_{1:k-1}, x_{1:k}) = \mathcal{N}\left( h^0(y^{n_l}_k, x_k) + h^1(y^{n_l}_k, x_k)\, \hat{m}^l_{k-1},\; R^{n_l}_k + h^1(y^{n_l}_k, x_k)\, (\mathcal{I}^l_k)^{-1}\, h^{1T}(y^{n_l}_k, x_k) \right).$$
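Per landmark the information quantities are only 2x2 in 2D SLAM, so the recursion itself is cheap. A sketch assuming the linearized pair (h0, h1) is already computed, with a toy direct-position measurement model as a stand-in:

```python
import numpy as np

def landmark_info_update(iota, info, h0, h1, R):
    """One information-form update of a single landmark, following the
    model 0 = h0 + h1 @ m + e, Cov(e) = R."""
    Rinv = np.linalg.inv(R)
    iota = iota - h1.T @ Rinv @ h0
    info = info + h1.T @ Rinv @ h1
    m_hat = np.linalg.solve(info, iota)    # (I^l)^-1 iota^l
    return iota, info, m_hat

# Toy stand-in: landmark near (1, 2), direct position "measurement" y,
# i.e. 0 = -y + m + e, so h0 = -y and h1 = I.
iota, info = np.zeros(2), 1e-6 * np.eye(2)   # near-zero prior information
for y in [np.array([1.1, 1.9]), np.array([0.9, 2.1])]:
    iota, info, m_hat = landmark_info_update(
        iota, info, -y, np.eye(2), 0.1 * np.eye(2))
```

After the two updates the estimate is essentially the mean of the two measurements, as expected for a WLS solution with a flat prior.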


FastSLAM: the algorithm (1/2)

1. Initialize the particles

$$x^{(i)}_1 \sim p_0(x), \quad i = 1, \dots, N,$$

where $N$ denotes the number of particles.

2. Data association that assigns a map landmark $n_l$ to each observed landmark $l$. Initialize new map landmarks if necessary.

3. Importance weights:

$$w^{(i)}_k = \prod_l \mathcal{N}\left( h^0(y^l_k, x_k) + h^1(y^l_k, x_k)\, \hat{m}^{n_l}_{k-1},\; R^l_k + h^1(y^l_k, x_k)\, (\mathcal{I}^{n_l}_k)^{-1}\, h^{1T}(y^l_k, x_k) \right),$$

where the product is taken over all observed landmarks $l$. Normalize: $w^{(i)}_k = w^{(i)}_k / \sum_{j=1}^N w^{(j)}_k$.

4. Resampling: draw a new set of particles with replacement.


FastSLAM: the algorithm (2/2)

5. Map measurement update:

$$p(m^{(i)} \mid x^{(i)}_{1:k}, y_{1:k}) = \mathcal{N}\left( (\mathcal{I}^{(i)}_k)^{-1} \iota^{(i)}_k,\; (\mathcal{I}^{(i)}_k)^{-1} \right),$$
$$\iota_k = \iota_{k-1} - h^{1T}(y_k, x_k)\, R^{-1}_k\, h^0(y_k, x_k),$$
$$\mathcal{I}_k = \mathcal{I}_{k-1} + h^{1T}(y_k, x_k)\, R^{-1}_k\, h^1(y_k, x_k).$$

6. Pose time update:

fastSLAM 1.0 (SIR PF):
$$x^{(i)}_{k+1} \sim p(x_{k+1} \mid x^{(i)}_{1:k}).$$

fastSLAM 2.0 (PF with optimal proposal):
$$x^{(i)}_{k+1} \sim p(x_{k+1} \mid x^{(i)}_{1:k}, y_{1:k+1}) = \frac{1}{C}\, p(x_{k+1} \mid x^{(i)}_{1:k})\, p(y_{k+1} \mid x_{k+1}, x^{(i)}_{1:k}, y_{1:k}).$$
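Putting the steps together, here is a toy fastSLAM 1.0 loop for a scalar pose and a single landmark with a relative measurement. The model is an illustrative assumption made up for this sketch, chosen so the per-particle map filter reduces to a scalar Kalman filter:

```python
import numpy as np

rng = np.random.default_rng(1)

def fastslam_1d(y, u, N=500, q=0.002, r=0.05):
    """Toy fastSLAM 1.0 with scalar pose x and one landmark m:
    x_{k+1} = x_k + u_k + v_k,  y_k = m - x_k + e_k.
    Each particle carries its own Kalman filter (mhat, P) for m."""
    x = np.zeros(N)                    # pose particles
    mhat = np.full(N, y[0])            # landmark mean, seeded from first obs
    P = np.full(N, 1.0)                # per-particle landmark variance
    est = []
    for uk, yk in zip(u, y):
        # 6. pose time update: propose from the dynamics (fastSLAM 1.0)
        x = x + uk + rng.normal(0.0, np.sqrt(q), N)
        # 3. importance weights from the predictive likelihood of y_k
        S = P + r
        w = np.exp(-0.5 * (yk - (mhat - x)) ** 2 / S) / np.sqrt(S)
        w /= w.sum()
        # 4. resample with replacement
        idx = rng.choice(N, size=N, p=w)
        x, mhat, P = x[idx], mhat[idx], P[idx]
        # 5. per-particle Kalman (map) measurement update
        K = P / (P + r)
        mhat = mhat + K * (yk - (mhat - x))
        P = (1.0 - K) * P
        est.append((x.mean(), mhat.mean()))
    return np.array(est)
```

The observable combination m - x is estimated tightly, while the common mode drifts slowly with the process noise, which illustrates why SLAM needs either a known starting pose or loop closures for absolute accuracy.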


FastSLAM: summary

FastSLAM is ideal for a ground robot with three states and vision sensors providing thousands of landmarks.

- FastSLAM scales linearly in the landmark dimension.
- Like the standard PF, FastSLAM scales badly in the state dimension.
- FastSLAM is relatively robust to incorrect associations, since associations are local to each particle and not global as in EKF-SLAM.

MfastSLAM combines all the good features of EKF SLAM and fastSLAM!


MFastSLAM: idea

Factorization in three factors:

$$p(x^P_{1:k}, x^j_k, m_k \mid y_{1:k}) = p(m_k \mid x^j_k, x^P_{1:k}, y_{1:k})\, p(x^j_k \mid x^P_{1:k}, y_{1:k})\, p(x^P_{1:k} \mid y_{1:k}),$$

corresponding to:
- One low-dimensional PF.
- One (E)KF attached to each particle.
- One WLS estimate for each particle and each map landmark.


Illustration

Airborne SLAM using UAV and camera. Research collaboration with IDA. Comparison of EKF-SLAM and FastSLAM on the same dataset.

http://youtu.be/VQlaCHl3Yc4 http://youtu.be/hA_NZeuoy9Y


Examples

- LEGO Mindstorms ground robot with sonar: http://youtu.be/7K8dZwqBSSA
- Indoor mapping by UAV with laser scanner: http://youtu.be/IMSozUpFFkU
- Indoor mapping using hand-held stereo camera with IMU at FOI: http://youtu.be/7f-nqXmo1qE
- Intelligent vacuum cleaner using ceiling vision: http://youtu.be/bq5HZzGF3vQ
- App Ball Invasion uses augmented reality based on SNAM (a KTH student did the SLAM implementation): http://youtu.be/WHGtvdxTVZk
