Autonomous Mobile Robots, Chapter 4
© R. Siegwart, I. Nourbakhsh with Skubic augmentations
Perception
Sensors Uncertainty Features
4
Perception Motion Control
Cognition
Real World Environment
Localization
Path
Environment Model / Local Map
"Position" Global Map
Example HelpMate, Transition Research Corp.
4.1
Example B21, Real World Interface
4.1
Example Robart II, H.R. Everett
4.1
Savannah River Site Nuclear Surveillance Robot
4.1
BibaBot, BlueBotics SA, Switzerland
Pan-Tilt Camera
Omnidirectional Camera
IMU (Inertial Measurement Unit)
Sonar Sensors
Laser Range Scanner
Bumper
Emergency Stop Button
Wheel Encoders
4.1
Our new robot: Killian (under development)
gripper with sensors: IR rangefinders
strain gauge
top sonar ring
bottom sonar ring
laser range-finder
stereo vision
laptop brain
General Classification (Table 4.1)
4.1.1
General Classification (Table 4.1, cont.)
4.1.1
Sensor Terminology
Sensitivity
Dynamic Range
Resolution
Bandwidth
Linearity
Error
Accuracy
Precision
Systematic Errors
Random Errors
Active Ranging Sensors: Ultrasonic Sensor
4.1.6
Transmit a packet of (ultrasonic) pressure waves; the distance d of the echoing object can be calculated from the propagation speed of sound c and the time of flight t:

  d = (c · t) / 2

The speed of sound c (≈340 m/s) in air is given by

  c = √(γ · R · T)

where
  γ: ratio of specific heats
  R: gas constant
  T: temperature in Kelvin
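The two formulas above can be combined into a short sketch. The default values γ = 1.4 and R = 287.05 J/(kg·K) are the standard dry-air constants, not values from the slide:

```python
import math

def speed_of_sound(temp_kelvin, gamma=1.4, R=287.05):
    """c = sqrt(gamma * R * T); defaults are standard dry-air constants."""
    return math.sqrt(gamma * R * temp_kelvin)

def sonar_distance(time_of_flight_s, temp_kelvin=293.15):
    """d = c * t / 2: the pulse travels to the object and back."""
    return speed_of_sound(temp_kelvin) * time_of_flight_s / 2.0
```

At room temperature (≈293 K) this gives c ≈ 343 m/s, so a 10 ms echo corresponds to roughly 1.7 m.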
Ultrasonic Sensor (time of flight, sound)
[Figure: echo-detection chain — transmitted sound (wave packet), analog echo signal, threshold comparison producing a digital echo signal, and an integrator whose integrated time gives the time of flight (sensor output)]
Effective range: typically 12 cm to 5 m
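A minimal sketch of the thresholding step in the figure above, assuming the analog echo has already been sampled into an array; the sample rate and threshold are illustrative parameters, not sensor-specified values:

```python
def time_of_flight(samples, sample_rate_hz, threshold):
    """Return the time (s) at which the echo first exceeds the threshold,
    i.e. the rising edge of the digital echo signal.
    Returns None if no echo is detected within the sample window."""
    for i, amplitude in enumerate(samples):
        if abs(amplitude) >= threshold:
            return i / sample_rate_hz
    return None
```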
4.1.6
Ultrasonic Sensor (time of flight, sound)
Typical frequency: 40 – 180 kHz
Generation of sound wave: piezo transducer
Transmitter and receiver separated or not separated
Sound beam propagates in a cone-like manner
  opening angles around 20 to 40 degrees
  regions of constant depth: segments of an arc (sphere for 3D)
Typical intensity distribution of an ultrasonic sensor:
[Polar plot: amplitude in dB versus angle from −60° to +60°, showing the measurement cone]
4.1.6
SRF10 sensor
Range: 3 cm to 6 m
See also www.acroname.com
SRF10 Characteristics
[Plot: "Sonar Values" — sensor reading (0–1200) vs. distance measured in cm (0–160), one curve per student group 1–6]
SRF10 Characteristics (previous years)
[Plot: "Sonar Rangefinder" — sensor reading (0–3000) vs. range in cm (0–300), for targets: person, cardboard, metal, wall, Legos]
Ultrasonic Sensor Problems
Soft surfaces that absorb most of the sound energy
Undesired reflections from non-perpendicular surfaces
  Specular reflection
  Foreshortening
Cross-talk between sensors
4.1.6
What if the robot is moving or the sensor is moving (on a servo motor)? What if another robot with the same sensor is nearby?
Optical Triangulation (1D)
Principle of 1D triangulation.
distance is proportional to 1/x
[Figure: 1D triangulation geometry — a laser / collimated beam is transmitted toward the target at distance D; the reflected beam passes through a lens with focal length f, offset from the emitter by baseline L, and strikes a Position-Sensitive Device (PSD) or linear camera at position x]

  D = f · L / x
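Assuming the triangulation relation D = f·L/x from the figure, a minimal helper:

```python
def triangulation_distance(f, L, x):
    """D = f * L / x: f is the lens focal length, L the baseline between
    emitter and lens, x the spot position on the PSD / linear camera.
    D is proportional to 1/x, so range resolution degrades with distance."""
    if x <= 0:
        raise ValueError("spot position x must be positive")
    return f * L / x
```

For example, with f = 1 cm, L = 5 cm, a spot at x = 1 mm puts the target at 0.5 m.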
4.1.6
http://www.acroname.com/robotics/parts/SharpGP2D12-15.pdf
Sharp Optical Rangefinder (aka ET sensor)
[Plot: "ET sensor values" — sensor reading (0–160) vs. distance in cm (0–160), one curve per student group 1–6]
Sharp Optical Rangefinder (previous years)
[Plot: "IR Optical Rangefinder" — sensor reading (0–160) vs. range in cm (0–80), for targets: black, blue, red, green]
IR Sensor (aka Top Hat sensor)
[Plot: "TopHat sensor" — sensor reading (150–270) per group 1–6, for targets: cardboard, metal, black surface, white paper, wall]
Used for:
  Line following
  Barcode reader
  Encoder
Ground-Based Active and Passive Beacons
Elegant way to solve the localization problem in mobile robotics
Beacons are signaling guiding devices with a precisely known position
Beacon-based navigation has been used since humans started to travel
  Natural beacons (landmarks) like stars, mountains or the sun
  Artificial beacons like lighthouses
The recently introduced Global Positioning System (GPS) revolutionized modern navigation technology
  Already one of the key sensors for outdoor mobile robotics
  For indoor robots GPS is not applicable
Major drawback of beacons indoors: beacons require changes in the environment -> costly; they limit flexibility and adaptability to changing environments.
4.1.5
Global Positioning System (GPS)
Developed for military use
Recently became accessible for commercial applications
24 satellites (including three spares) orbiting the earth every 12 hours at a height of 20,190 km
Four satellites are located in each of six planes inclined 55 degrees with respect to the plane of the earth's equator
Location of any GPS receiver is determined through a time-of-flight measurement
Technical challenges:
  Time synchronization between the individual satellites and the GPS receiver
  Real-time update of the exact locations of the satellites
  Precise measurement of the time of flight
  Interference with other signals
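The core of the time-of-flight measurement can be sketched as a single pseudorange; the function name and interface here are illustrative, not a GPS library API:

```python
C = 299_792_458.0  # speed of light in m/s

def pseudorange(t_transmit, t_receive):
    """Distance implied by one satellite's signal: d = c * (t_rx - t_tx).
    Called a *pseudo*-range because the receiver clock error is still
    folded in; measuring four satellites lets the receiver solve for
    (x, y, z) plus the shared clock correction."""
    return C * (t_receive - t_transmit)
```

A time of flight of about 70 ms corresponds to roughly 21,000 km, on the order of the satellites' altitude.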
4.1.5
Global Positioning System (GPS)
4.1.5
Satellites synchronize transmissions of location & current time
GPS receiver is passive
4 satellites provide (x,y,z) and time correction
Laser Range Sensor (time of flight, electromagnetic) (1)
Transmitted and received beams coaxial
Transmitter illuminates a target with a collimated beam
Receiver detects the time needed for the round trip
A mechanical mechanism with a mirror sweeps the light beam -> 2D or 3D measurement
[Figure: transmitter emitting a beam over path L toward target P at distance D; transmitted and reflected beams; phase measurement]
4.1.6
Laser Range Sensor (time of flight, electromagnetic) (2)
Time-of-flight measurement:
  Pulsed laser: measurement of elapsed time directly; requires resolving picoseconds
  Beat frequency between a frequency-modulated continuous wave and its received reflection
  Phase-shift measurement to produce range estimation: technically easier than the above two methods
4.1.6
Laser Range Sensor (time of flight, electromagnetic) (3)
Phase-Shift Measurement

  D′ = L + 2D
  λ = c / f

where
  c: speed of light
  f: modulating frequency
  D′: total distance covered by the emitted light

For f = 5 MHz (as in the AT&T sensor), λ = 60 meters.

[Figure: transmitter, target P at distance D over path L, transmitted and reflected beams, phase measurement]
4.1.6
Laser Range Sensor (time of flight, electromagnetic) (4)
The distance D between the beam splitter and the target is

  D = (λ / 4π) · θ

where θ is the phase difference between the transmitted and reflected light beams.

Theoretically ambiguous range estimates: the phase repeats every λ/2, so for λ = 60 meters a target at a range of 5 meters gives the same reading as a target at 35 meters.

[Figure (2.33): amplitude [V] of the transmitted and reflected beams vs. phase [m]]
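Assuming the relations above (λ = c/f and D = λθ/4π), a small sketch of the phase-to-range conversion:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def phase_shift_range(theta, mod_freq_hz):
    """Range from a measured phase shift theta (radians):
    lambda = c / f, then D = (lambda / (4 * pi)) * theta.
    Ambiguous modulo lambda / 2, since theta wraps at 2*pi."""
    wavelength = C / mod_freq_hz
    return wavelength * theta / (4.0 * math.pi)
```

For f = 5 MHz, λ ≈ 60 m, so a full 2π phase shift corresponds to about 30 m of unambiguous range.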
4.1.6
Laser Range Sensor (time of flight, electromagnetic) (5)
Confidence in the range (phase estimate) is inversely proportional to the square of the received signal amplitude.
Hence dark, distant objects will not produce as good range estimates as closer, brighter objects.
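The statement above can be modeled as a range variance; the calibration constant k is hypothetical, not a value from the slide:

```python
def range_variance(amplitude, k=1.0):
    """Confidence (1 / variance) is proportional to the squared received
    amplitude, so variance = k / amplitude**2.
    k is a hypothetical sensor-specific calibration constant."""
    return k / amplitude ** 2
```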
4.1.6
Laser Range Sensor (time of flight, electromagnetic)
Typical range image of a 2D laser range sensor with a rotating mirror. The length of the lines through the measurement points indicates the uncertainties.
4.1.6
Vision-based Sensors: Sensing
Visual Range Sensors
  Depth from focus
  Stereo vision
Motion and Optical Flow
Color Tracking Sensors
4.1.8
Vision-based Sensors: Hardware
CCD (light-sensitive, discharging capacitors of 5 to 25 micron)
CMOS (Complementary Metal Oxide Semiconductor technology)
2048 x 2048 CCD array
Canon IXUS 300
Sony DFW-X700
Orangemicro iBOT Firewire
4.1.8
Color Tracking Sensors
Motion estimation of ball and robot for soccer playing using color tracking
4.1.8
Robot Formations using Color Tracking
Image representation
(1,1)
(640,480)
R = (255,0,0)
G = (0,255,0)
B = (0,0,255)
Yellow = (255,255,0)
Magenta = (255,0,255)
Cyan = (0,255,255)
White = (255,255,255)
Image Representation
YCrCb: illumination data stored in a separate channel (may be more resistant to illumination changes)
R-G-B channels map to Cr-Y-Cb, where
  Y = 0.59G + 0.31R + 0.11B (illumination)
  Cr = R - Y
  Cb = B - Y
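The mapping above as code, using exactly the coefficients given on the slide:

```python
def rgb_to_ycrcb(r, g, b):
    """RGB -> YCrCb with the slide's coefficients:
    Y = 0.59*G + 0.31*R + 0.11*B (illumination), Cr = R - Y, Cb = B - Y."""
    y = 0.59 * g + 0.31 * r + 0.11 * b
    return y, r - y, b - y
```

Pure red (255, 0, 0), for example, maps to Y = 79.05, Cr = 175.95, Cb = −79.05: most of its energy sits in the Cr channel, not in illumination.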
CMU cam
Ubicom SX28 microcontroller with 136 bytes SRAM
8-bit RGB or YCrCb
Max resolution: 352 x 288 pixels
Resolution is limited to 80 horizontal x 143 vertical pixels because only every other line is processed.
(1,1)
(352,288)  (80,143)
CMU cam Operation
init_camera()
  auto-gain – adjusts the brightness level of the image
  white balance – adjusts the gains of the color channels to accommodate non-pure-white ambient light
clamp_camera_yuv()
  point the camera at a white surface under your typical lighting conditions and wait about 15 seconds
trackRaw(rmin, rmax, gmin, gmax, bmin, bmax)
GUI interface for capturing images and checking colors
CMU cam Tracking
Global variables:
  track_size … in pixels
  track_x
  track_y
  track_area … area of the bounding box
  track_confidence
(1,1)
(80,143)
CMU cam – Better tracking
Auto-gain
  Adjusts the brightness level of the image
White balance
  Adjusts the color gains on a frame-by-frame basis
  Aims for an average color of gray
  Works great until a solid color fills the image
One strategy – use CrYCb
  Aim at the desired target and look at a dumped frame (in the GUI)
  Set the Cr and Cb bounds from the frame dump
  Set a very relaxed Y (illumination)
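That strategy can be sketched as a per-pixel classifier: tight Cr/Cb bounds taken from a frame dump, very relaxed Y bounds so illumination changes do not break the track. The function name and default bounds are illustrative, not the CMUcam API:

```python
def in_track_window(pixel, cr_min, cr_max, cb_min, cb_max, y_min=10, y_max=245):
    """Check whether a (Y, Cr, Cb) pixel falls inside the tracking bounds.
    The wide default Y window implements the 'very relaxed Y' idea above."""
    y, cr, cb = pixel
    return (cr_min <= cr <= cr_max and cb_min <= cb <= cb_max
            and y_min <= y <= y_max)
```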
Adaptive Human-Motion Tracking
4.1.8