Sensor-Based Mapping and Sensor Fusion By Rashmi Patel

Jan 04, 2016

Transcript
Page 1: Sensor-Based Mapping and Sensor Fusion By Rashmi Patel.

Sensor-Based Mapping and Sensor Fusion

By Rashmi Patel

Page 2:

Overview

Bayes Rule

Occupancy Grids [Alberto Elfes, 1987]

Vector Field Histograms [J. Borenstein, 1991]

Sensor Fusion [David Conner, 2000]

Page 3:

Posterior (conditional) probability: the probability assigned to an event given some evidence

Conditional probability example, flipping coins:
P(H) = 0.5
P(H | H) = 1
P(HH) = 0.25
P(HH | first flip H) = 0.5

Bayes Rule

Page 4:

Bayes Rule- continued

Bayes Rule: P(A|B) = P(B|A) * P(A) / P(B)

The useful thing about Bayes Rule is that it allows you to “turn” conditional probabilities around

Example: P(Cancer) = 0.1, P(Smoker) = 0.5, P(S|C) = 0.8

P(C|S) = ?

P(C|S) = P(S|C)*P(C) / P(S) = 0.16
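The arithmetic in this example is easy to check with a short Python sketch (the function name `bayes` is just for illustration):

```python
def bayes(p_a, p_b_given_a, p_b):
    """Bayes Rule: P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# The slide's numbers: P(C) = 0.1, P(S) = 0.5, P(S|C) = 0.8
p_cancer_given_smoker = bayes(0.1, 0.8, 0.5)
print(round(p_cancer_given_smoker, 2))  # 0.16
```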

Page 5:

Occupancy Grids [Elfes]

In the mid 80's, Elfes started implementing cheap ultrasonic transducers on an autonomous robot

Because of intrinsic limitations in any sonar, it is important to compose a coherent world-model using information gained from multiple readings

Page 6:

Occupancy Grids Defined

The grid stores the probability that Ci = cell(x,y) is occupied: O(Ci) = P[s(Ci) = OCC]

Phases of creating a grid:
1. Collect readings, generating O(Ci)
2. Update the occupancy grid, creating a map
3. Match and combine maps from multiple locations

[Figure: occupancy grid with axes x and y and a highlighted cell Ci]

Page 7:

Occupancy Grids Sonar Pattern

24 transducers, in a ring, spaced 15 degrees apart

Sonars can detect from 0.9-35 ft

Accuracy is 0.1 ft

Main sensitivity is in a 30° cone

-3 dB sensitivity (1/2 response) from the middle 15°

[Figure: beam pattern, 22.5 deg marked]

Page 8:

Occupancy Grids Sonar Model

Probability profile: a Gaussian p.d.f. is used, but the choice of profile is variable

p(r, ω | z) = 1/(2π σr σω) * exp[ -(r - z)² / (2σr²) - ω² / (2σω²) ]

Where r is the sensor reading, z is the actual distance, ω is the angle off the acoustic axis, and σr, σω are the range and angular standard deviations

[Figure: sonar probability profile along distance R: a "probably empty" region from Rmin out to the range measurement, a ranging-error band, then a "somewhere occupied" region around the measured range; the profile also varies with angle]

Page 9:

Occupancy Grids Discrete sonar model

Example (rows of cell values across the sonar beam; the middle row lies on the acoustic axis):

.45 .48 .49 .5 0.8
.45 .4 .38 .35 .32 .31 .32 .38 .41 .45 0.9
NA 0 .01 .02 .05 .09 .14 .22 .25 .3 .35 .4 .44 1
.45 .4 .38 .35 .32 .31 .32 .38 .41 .45 0.9
.49 0.5 0.8

Page 10:

Occupancy Grids Notation

Definitions:
Ci is a cell in the occupancy grid
s(Ci) is the state of cell Ci (i.e., the value of that cell)
OCC means OCCUPIED, whose value is 1


Page 11:

Occupancy Grids Bayes Rule

Applying Bayes' Rule to a single cell s(Ci) with sensor reading r:

P[s(Ci) = OCC | r] =

    P[r | s(Ci) = OCC] * P[s(Ci) = OCC]
    -----------------------------------
        Σ p[r | s(Ci)] * P[s(Ci)]

Where p(r) = Σ p[r | s(Ci)] * P[s(Ci)], summed over the cells that intersect the sensor model

Then apply this to all the cells creating a local map for each sensor

Page 12:

Occupancy Grids Bayes Rule Implemented

P[s(Ci) = OCC | r] =

    P[r | s(Ci) = OCC] * P[s(Ci) = OCC]
    -----------------------------------
        Σ p[r | s(Ci)] * P[s(Ci)]

P[s(Ci) = OCC | r] is the probability that a cell is occupied given a sensor reading r

P[r | s(Ci) = OCC] is the probability that the sensor reading is r given the state of cell Ci (this value is found using the sensor model)

P[s(Ci) = OCC] is the probability that the value of cell Ci is 1, i.e. that s(Ci) = OCC (this value is taken from the occupancy grid)

(In this expression, P[r | s(Ci) = OCC] is the likelihood, P[s(Ci) = OCC] is the prior, and the denominator normalizes the result.)
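As a sketch of this per-cell update in Python: the likelihood values below are hypothetical, and the normalizing sum is written over the cell's two possible states (occupied and empty) so that the example stays self-contained:

```python
def update_cell(prior_occ, p_r_given_occ, p_r_given_emp):
    """Bayes update for one cell: returns P[s(Ci) = OCC | r].

    Numerator: likelihood * prior for the occupied state.
    Denominator p(r): likelihood * prior summed over both states.
    """
    prior_emp = 1.0 - prior_occ
    p_r = p_r_given_occ * prior_occ + p_r_given_emp * prior_emp
    return p_r_given_occ * prior_occ / p_r

# An uninformative prior of 0.5 plus a reading that is twice as likely
# when the cell is occupied moves the estimate toward occupied.
post = update_cell(0.5, 0.6, 0.3)
print(round(post, 3))  # 0.667
```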

Page 13:

Occupancy Grids Implementation Ex

Let the red oval be the "somewhere occupied" region

The yellow blocks are the cells in the sonar sector

The black lines are the boundaries of that sonar sector

P(r) is found by summing, over all of those yellow blocks, the sensor-model terms p[r | s(Ci)] * P[s(Ci)]

[Figure: grid with x and y axes showing the occupied range within the sonar sector]

Page 14:

Occupancy Grids Multiple Sonars

Combining Readings from Multiple Sonars:

The grid is updated sequentially over t sensor readings {r}t = {r1,…,rt}

To update for new sensor reading rt+1:

P[s(Ci) = OCC | rt+1] =

    P[rt+1 | s(Ci) = OCC] * P[s(Ci) = OCC | {r}t]
    ---------------------------------------------
        Σ p[rt+1 | s(Ci)] * P[s(Ci) | {r}t]
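The sequential form simply feeds each posterior back in as the next prior. A small sketch under the same hypothetical two-state normalization:

```python
def sequential_update(prior_occ, readings):
    """Fold a sequence of readings into one cell's occupancy estimate.

    Each element of `readings` is a pair (p_r_given_occ, p_r_given_emp)
    taken from the sensor model; the posterior after reading t is used
    as the prior for reading t+1.
    """
    p = prior_occ
    for p_occ, p_emp in readings:
        p = p_occ * p / (p_occ * p + p_emp * (1.0 - p))
    return p

# Three consistent "occupied-looking" readings sharpen the estimate.
p_final = sequential_update(0.5, [(0.6, 0.3)] * 3)
print(round(p_final, 3))  # 0.889
```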

Page 15:

Occupancy Grids Equations

P[s(Ci) = OCC | rt+1] =

    P[rt+1 | s(Ci) = OCC] * P[s(Ci) = OCC | {r}t]
    ---------------------------------------------
        Σ p[rt+1 | s(Ci)] * P[s(Ci) | {r}t]

P[s(Ci) = OCC | rt+1] is the probability that the cell is occupied given the new reading rt+1

P[rt+1 | s(Ci) = OCC] is the probability that the new reading is rt+1 given the state of cell Ci (this value is found using the sensor model)

P[s(Ci) = OCC | {r}t] is the probability that cell Ci is occupied given the previous readings (this value is taken from the current occupancy grid)

Page 16:

Occupancy Grids Multiple Maps

Matching multiple maps:

Each new map must be integrated with existing maps from past sensor readings

The maps are integrated by finding the rotation and translation that give the best correlation between the maps in their overlapping areas

Page 17:

Occupancy Grids Matching Maps Ex

Example 1: a simple translation of maps

Center of robot at (2,2)

Map 1:

0  0   0   0   0
0  .9  .9  .9  0
0  0   0   0   0
0  0   0   0   0
0  0   0   0   0

Map 2 (after translating):

0  .8  .8  .8  0
0  0   0   0   0
0  0   0   0   0
0  0   0   0   0
0  0   1   0   0

New map (combined maps 1 and 2):

0  .98  .98  .98  0
0  0    0    0    0
0  0    0    0    0
0  0    0    0    0
0  0    1    0    0

P(cell3) = P(cell1)+P(cell2)- P(cell1)*P(cell2)
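This combination rule is a probabilistic OR of independent estimates; applied to the occupied row of the slide's two maps, it reproduces the .98 values:

```python
def combine(p1, p2):
    """P(cell3) = P(cell1) + P(cell2) - P(cell1) * P(cell2)."""
    return p1 + p2 - p1 * p2

map1_row = [0.0, 0.9, 0.9, 0.9, 0.0]  # occupied row of map 1
map2_row = [0.0, 0.8, 0.8, 0.8, 0.0]  # same row of the translated map 2
combined = [round(combine(a, b), 2) for a, b in zip(map1_row, map2_row)]
print(combined)  # [0.0, 0.98, 0.98, 0.98, 0.0]
```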

Page 18:

Occupancy Grids vs Certainty Grids

Occupancy grids and certainty grids use basically the same method:
1. Collect readings to generate the probability occupied (for certainty grids, the probability empty)
2. Create a grid from the different sonars
3. Match maps to register readings from other locations

The difference arises from the fact that occupancy grids use conditional probability to determine the probability occupied, while certainty grids use simpler math models

Page 19:

Occupancy Grids Vs Certainty Grids

Both have a p.d.f. for the sonar model

However, the major difference is in finding the probability that a cell is occupied:
1. First, Pempty is computed for a cell
2. Then Poccupied is computed using Pocc = 1 - Pemp
3. Then Pocc is normalized over the sonar beam and combined with that cell's value from the other sonar readings

Page 20:

Vector Field Histograms [Borenstein]

The VFH allows fast, continuous control of a mobile vehicle

Tested on CARMEL using 24 ultrasonic sensors placed in a ring around the robot

The scan times range from 100 to 500 ms depending on the level of safety wanted

Page 21:

Vector Field Histograms Notation

The VFH uses a two-dimensional Cartesian histogram grid similar to certainty grids [Elfes]

Definitions:
CVmax = 15, CVmin = 0
d is the distance returned by the sonar
The increment value is +3
The decrement value is -1
VCP is the vehicle center point
Obstacle vector: a vector pointing from a cell to the VCP

Page 22:

Vector Field Histograms Histogram Grid

The histogram grid is incremented differently from the certainty grid

The only cell incremented in the grid is the cell which is distance d away and lying on the acoustic axis of the sonar

Similarly, only the cells that lie on the acoustic axis at distances less than d are decremented
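A sketch of this increment/decrement rule; the list of axis cells and the grid indexing are hypothetical scaffolding (mapping world coordinates to cell indices is assumed to happen elsewhere):

```python
CV_MAX, CV_MIN = 15, 0   # certainty value bounds from the slides
INCREMENT, DECREMENT = 3, 1

def update_histogram_grid(grid, axis_cells, hit_index):
    """Update cells along one sonar's acoustic axis.

    `axis_cells` lists (i, j) cell indices along the acoustic axis,
    ordered from the sensor outward; `hit_index` is the cell at the
    measured distance d. Only that cell is incremented; cells closer
    than d on the axis are decremented.
    """
    for n, (i, j) in enumerate(axis_cells):
        if n == hit_index:
            grid[i][j] = min(CV_MAX, grid[i][j] + INCREMENT)
        elif n < hit_index:
            grid[i][j] = max(CV_MIN, grid[i][j] - DECREMENT)

grid = [[0] * 5 for _ in range(5)]
update_histogram_grid(grid, [(2, 0), (2, 1), (2, 2), (2, 3)], hit_index=3)
print(grid[2])  # [0, 0, 0, 3, 0]
```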

Page 23:

Vector Field Histograms Polar Histogram

Next the 2-D histogram grid is converted into a 1-D grid called the Polar histogram

The polar histogram, H, has n angular sectors, each of width α

[Figure: active region C* of the histogram grid around the robot, with cell certainty values and the target sector ktarg]

Page 24:

Vector Field Histograms H mapping

In order to generate H, we must map every cell in the histogram grid into H:

β(i,j) = arctan[ (yj - y0) / (xi - x0) ]

m(i,j) = c*(i,j)² * (a - b * d(i,j))

Where:
x0, y0 is the robot's position
xi, yj is the active cell's position
β(i,j) is the direction from (xi, yj) to the VCP
c*(i,j) is the certainty value of cell (xi, yj)
d(i,j) is the distance between (xi, yj) and the VCP
m(i,j) is the magnitude of the obstacle vector at (xi, yj)
a, b are positive constants

Page 25:

Vector Field Histograms Discrete H grid

Now that the obstacle vectors for every cell have been computed, we have to find the magnitude of each sector in H:

hk = Σ(i,j) m(i,j), where k = int(β(i,j) / α)

Where:
α is the angular resolution of H
k is the sector of H containing direction β(i,j)
hk is the polar obstacle density of sector k
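The two mapping slides can be sketched together as follows; the constants a and b, the 5-degree resolution, and the grid indexing are hypothetical choices, not values from the slides:

```python
import math

def polar_histogram(grid, x0, y0, alpha_deg=5.0, a=100.0, b=1.0):
    """Map every active cell into the polar histogram H.

    For each nonzero cell: beta is the direction between the robot at
    (x0, y0) and the cell, m = c^2 * (a - b*d), and h[k] accumulates m
    in sector k = int(beta / alpha).
    """
    n_sectors = int(360 / alpha_deg)
    h = [0.0] * n_sectors
    for i, row in enumerate(grid):
        for j, c in enumerate(row):
            if c == 0 or (i == x0 and j == y0):
                continue
            d = math.hypot(i - x0, j - y0)
            beta = math.degrees(math.atan2(j - y0, i - x0)) % 360
            m = (c ** 2) * (a - b * d)
            h[int(beta / alpha_deg)] += m
    return h

grid = [[0] * 5 for _ in range(5)]
grid[2][4] = 13  # one obstacle cell two cells from the robot at (2, 2)
h = polar_histogram(grid, 2, 2)
print(max(h))  # 16562.0, i.e. 13^2 * (100 - 2)
```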

Page 26:

Vector Field Histograms Threshold

Once the polar obstacle densities have been computed, H can be thresholded to determine where the objects are located so that they can be avoided.

The choice of this threshold is important: too high a threshold may let you come too close to an object, while too low a threshold may cause you to lose some valid paths
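Thresholding H then amounts to keeping the sectors whose polar obstacle density falls below the chosen value (the threshold and densities below are hypothetical):

```python
def free_sectors(h, threshold):
    """Return indices of polar-histogram sectors whose obstacle
    density is below the threshold (candidate travel directions)."""
    return [k for k, density in enumerate(h) if density < threshold]

h = [0.0, 5.0, 42.0, 40.0, 3.0, 0.0]
free = free_sectors(h, threshold=10.0)
print(free)  # [0, 1, 4, 5]
```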

Page 27:

Sensor Fusion [D. Conner]

David C. Conner, PhD student

Presentation on his thesis and the following paper: “Multiple camera, laser rangefinder, and encoder data fusion for navigation of a differentially steered 3-wheeled autonomous vehicle”

Page 28:

Sensor Fusion Navigator

The 2 front wheels are driven and the third, rear wheel is a caster

2 separate computer systems: a PC handles sensor fusion and a PLC handles motor control

180-degree laser rangefinder with 0.5° resolution

2 color CCD cameras

Navigator: a 3-wheeled, differentially driven vehicle

Page 29:

Cameras and Frame Grabbers

Because the camera is not parallel to the ground, the image must be transformed to correctly represent the ground

The correction is done using the Intel Image Processing Library (IPL)

Page 30:

Since there are two cameras, the two images must be combined

The images are transformed into the vehicle coordinates and combined using the IPL functions

Cameras and Frame Grabbers

Page 31:

Image Conversion

Once a picture is captured, it is converted to grayscale

It is blurred using a Gaussian convolution mask (an example is shown below)

Page 32:

Image Conversion continued

Then the image is thresholded to limit the amount of data in the image

The threshold value is chosen to be above the norm of the intensities in the grayscale histogram

The resulting image is then pixelated for storage in a grid

Page 33:

Laser Range Finder

The SICK LMS-200 laser rangefinder returns 361 data points over a 180-degree arc with 0.5-degree resolution

Values above a certain range are ignored

Page 34:

Vector Field Histograms

VFHs are convenient here because they allow us to easily combine the camera data and the laser rangefinder data to determine the most accessible regions

Several types of polar obstacle density (POD) functions can be used (linear, quadratic, exponential)

POD = KC * (a - b*d)

Page 35:

POD values-Laser Rangefinder

The POD values for the laser are determined by:
1. Using the linear function shown above to transform the laser data into POD values
2. Then, for every two-degree arc, taking the max of the POD values in that arc as the final value
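A sketch of that two-step reduction; the constants KC, a, b and the sample ranges are hypothetical:

```python
def laser_pod(ranges, k_c=1.0, a=35.0, b=1.0, arc=4):
    """Convert laser returns to POD values, then keep the max POD per
    arc. With 0.5-degree readings, `arc=4` groups two degrees.
    POD = KC * (a - b*d) for each return distance d (closer = denser).
    """
    pods = [k_c * (a - b * d) for d in ranges]
    return [max(pods[i:i + arc]) for i in range(0, len(pods), arc)]

# Two 2-degree arcs: one with a close object, one mostly open.
ranges = [10.0, 12.0, 9.0, 11.0, 30.0, 28.0, 31.0, 29.0]
pods = laser_pod(ranges)
print(pods)  # [26.0, 7.0]
```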

Page 36:

POD values-Images

POD values for the image are pre-calculated and stored in a grid at startup

The pre-calculated values are multiplied by the pixelated image, which is also stored in a grid (the overlapping cells are multiplied)

For every 2-degree arc, the cell with the highest POD value is chosen as the value of that arc

Page 37:

Combining VFH

The two VFHs are then combined by taking the max POD for each sector

The max POD is chosen because it represents the closest object
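The combination step is just a per-sector max; the sample densities below are hypothetical:

```python
def combine_vfh(pod_laser, pod_camera):
    """Per-sector max of the two POD histograms; the larger POD wins
    because it represents the closer (more dangerous) object."""
    return [max(a, b) for a, b in zip(pod_laser, pod_camera)]

combined_pod = combine_vfh([3.0, 10.0, 0.0], [5.0, 2.0, 1.0])
print(combined_pod)  # [5.0, 10.0, 1.0]
```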

Page 38:

Bibliography

Elfes, A. “Occupancy Grids: A Stochastic Spatial Representation for Active Robot Perception.” July 1990.

Elfes, A. “Sonar-Based Real-World Mapping and Navigation.” June 1987.

Borenstein, J. “The Vector Field Histogram - Fast Obstacle Avoidance for Mobile Robots.” June 1991.

Conner, D. “Multiple camera, laser rangefinder, and encoder data fusion for navigation of a differentially steered 3-wheeled autonomous vehicle.”