Low Infrastructure Navigation for Indoor Environments
October 31, 2012
Arne Suppé (suppe@andrew.cmu.edu), CMU NavLab Group

Page 1:

Low Infrastructure Navigation for Indoor Environments
October 31, 2012

Arne Suppé
suppe@andrew.cmu.edu

CMU NavLab Group

Page 2:

Overview

We have demonstrated camera-based navigation of a vehicle in a parking garage.

We propose to:
- Work with AUDI/VW to realistically demonstrate the algorithm and collect development data
- Prove robustness in a wide variety of real-world environments using actual automotive sensors
- Explore solutions to reduce the computational/data footprint to levels realistic for a vehicle in 5 to 10 years

Page 3:

Why Use Cameras for Navigation?

Cameras and computation are cheap and projected to get cheaper

3D LIDARs are large, expensive, and not likely to get cheap or rugged enough for the automotive environment

High infrastructure costs to equip indoor environments with fiducials or beacons

Page 4:

Cameras are Not Enough

Motion sensors:
- Have higher update rates and better incremental precision
- Handle cases where camera-based solutions fail
- Constrain solutions to make camera-based navigation more tractable

Cameras, in turn, contain (bound) the drift in the pose estimate.

AUDI/VW's expertise can help us collect synchronized camera and real automotive motion sensor data to develop and benchmark our algorithms.

Page 5:

The Benefit of Combining Camera and Motion Sensors

[Figure: position drift over time for an expensive gyro, an automotive-grade gyro, and camera navigation, including a camera drop-out. Key point: Camera + Automotive Gyro ≈ Expensive IMU]
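To make the drop-out comparison concrete, below is a small, purely illustrative simulation (not from the presentation; every noise figure is invented) of heading drift when a cheap gyro is integrated alone versus periodically corrected by camera fixes, including a simulated camera drop-out:

```python
import numpy as np

# Illustrative only: invented noise figures, not measurements of any real sensor.
rng = np.random.default_rng(0)
dt = 0.01                      # 100 Hz gyro samples
t = np.arange(0, 60, dt)       # one minute of driving
gyro_bias = np.deg2rad(0.5)    # 0.5 deg/s bias (cheap automotive gyro)
gyro_noise = np.deg2rad(0.1)   # per-sample noise

# Vehicle actually drives straight, so the true yaw rate is zero.
measured = gyro_bias + gyro_noise * rng.standard_normal(t.size)

# Gyro only: integrate the rate, drift grows without bound.
heading_gyro = np.cumsum(measured) * dt

# Gyro + camera: every 0.5 s a camera fix pulls the heading back toward truth,
# except during a simulated camera drop-out between 20 s and 30 s.
heading_fused = np.zeros_like(t)
for k in range(1, t.size):
    heading_fused[k] = heading_fused[k - 1] + measured[k] * dt
    camera_ok = not (20.0 <= t[k] <= 30.0)
    if camera_ok and k % 50 == 0:          # 2 Hz camera corrections
        heading_fused[k] *= 0.1            # crude pull toward the true heading (0)

print("final gyro-only heading error: %.1f deg" % np.rad2deg(heading_gyro[-1]))
print("final fused heading error:     %.1f deg" % np.rad2deg(heading_fused[-1]))
```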

Page 6:

Building on Our Existing Work

Tailor the system to take advantage of overhead environments:
- Known entrance and egress points to structures
- No need to solve the lost-robot problem; only an incremental solution is required (we already do this once the EKF has locked on)
- Smaller search space than the outdoor problem, so we can employ stronger inference techniques

Page 7:

Alternative Camera Locations

The ceiling is very invariant in overhead environments, a classic result in indoor robot navigation [Thrun 2000].

One camera instead of two:
- Reduce physical costs
- Reduce computational costs: less data to process
- Potential to vastly simplify the solution for camera motion

[Figure: current camera locations and an alternative camera location on the vehicle, with the forward direction indicated]

Page 8:

Alternative Algorithms

Explore alternative algorithms to measure camera motion, to:
- Reduce the computational cost of position refinement
- Reduce the V2V and V2I communications requirements for transmitting map representations
- Improve robustness

www.123dapp.com www.photosynth.net

Page 9:

The Virtual Valet
Presented by Arne Suppé

With work by: Hernan Badino, Hideyuki Kume, Luis Navarro-Serment & Aaron Steinfeld

October 23, 2012

Page 10:

Existing Vision-Based Path Tracking

Offline:
- Build a location-tagged image database by recording a reference trajectory
- Locations need only be locally consistent

Online: replay the trajectory

1. Solve the global localization problem
2. Refine the position estimate
3. Fuse with vehicle sensors
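To show how the three online steps chain together, here is a minimal Python skeleton of the replay loop. It is a structural sketch only; the callable names (localize, refine, fuse) and the idea of passing them in as functions are mine, not the presentation's:

```python
def replay_trajectory(camera_frames, read_vehicle_sensors, localize, refine, fuse):
    """Online phase of vision-based path tracking (sketch).

    camera_frames        -- iterable of images from the navigation camera
    read_vehicle_sensors -- returns the latest odometry / gyro measurements
    localize             -- step 1: coarse match against the image database
    refine               -- step 2: displacement w.r.t. the matched database image
    fuse                 -- step 3: EKF fusion of the vision pose with vehicle sensors
    """
    for frame in camera_frames:
        entry = localize(frame)             # most similar database image
        vision_pose = refine(entry, frame)  # may be None if matching fails
        fuse(vision_pose, read_vehicle_sensors())
```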

Page 11:

Building the Database

Use structure from motion to reconstruct a smooth trajectory of the camera through the environment [Wu, 2011].

[Figure: structure-from-motion reconstruction showing feature points and camera poses]
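A sketch of what one database record might hold, assuming the SfM tool (e.g. VisualSFM) has already produced a camera pose per recorded frame; the field names and the 4x4 pose convention are my assumptions:

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class DatabaseEntry:
    """One pose-tagged frame of the reference trajectory."""
    image_path: str
    pose: np.ndarray        # 4x4 camera pose from structure from motion
    descriptor: np.ndarray  # whole-image descriptor for coarse matching


def build_database(image_paths, sfm_poses, describe):
    """Pair each recorded frame with its reconstructed pose, in traversal order.

    The poses only need to be locally consistent along the trajectory;
    `describe` computes the whole-image descriptor used later for localization.
    """
    return [DatabaseEntry(path, pose, describe(path))
            for path, pose in zip(image_paths, sfm_poses)]
```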

Page 12:

Global Localization

Find relevant images in the database given a new image; return the location of the most similar database image.

- Whole-image SURF descriptor: a weak similarity metric
- Topometric mapping [Badino 2012]: we know which images should be near each other and how fast the vehicle is moving

[Figures: distance matrix between database and current images; log probability of vehicle location over the filter state (database images in traversal sequence) as the vehicle travels]
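As a rough illustration of the filtering idea (my own simplification, not the Badino et al. implementation), the belief can be kept as a discrete distribution over database image indices: prediction shifts it forward using the known traversal order and vehicle speed, and the update reweights it with the weak whole-image similarity:

```python
import numpy as np


def predict(belief, expected_step, spread=1.0):
    """Shift probability mass forward along the traversal sequence.

    expected_step: how many database images the vehicle is expected to have
    advanced since the last frame (from vehicle speed and image spacing).
    """
    n = belief.size
    idx = np.arange(n)
    new_belief = np.zeros(n)
    for i, p in enumerate(belief):
        # Gaussian motion model centered on i + expected_step.
        new_belief += p * np.exp(-0.5 * ((idx - i - expected_step) / spread) ** 2)
    return new_belief / new_belief.sum()


def update(belief, query_desc, db_descs, sigma=0.5):
    """Reweight each database image by whole-image descriptor similarity.

    This plays the role of the weak whole-image SURF similarity on the slide;
    here it is just a Gaussian on descriptor distance.
    """
    dists = np.linalg.norm(db_descs - query_desc, axis=1)
    likelihood = np.exp(-0.5 * (dists / sigma) ** 2)
    posterior = belief * likelihood + 1e-12
    return posterior / posterior.sum()
```

Per camera frame one would call `belief = update(predict(belief, expected_step), query_desc, db_descs)` and take the most probable index as the matched database image.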

Page 13:

Position Refinement

Recover the 6-DOF displacement between the database image and the query image.
- Database location + displacement = current global location
- Online process: uses GPU-accelerated SIFT feature matching and RANSAC homography

[Figure: matched SIFT features between a database image and a query image]
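A CPU-only illustration of the matching step using OpenCV (the presentation uses GPU-accelerated SIFT; decomposing the homography into rotation/translation candidates is one possible way to obtain a displacement and is my addition, not necessarily the NavLab pipeline):

```python
import cv2
import numpy as np


def relative_motion(db_img, query_img, K):
    """Estimate the motion between a database image and a query image (sketch).

    Matches SIFT features, fits a RANSAC homography, and decomposes it into
    candidate rotation/translation pairs (K is the 3x3 camera intrinsic matrix).
    Returns None when matching fails, which the EKF stage must tolerate.
    """
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(db_img, None)
    kp2, des2 = sift.detectAndCompute(query_img, None)
    if des1 is None or des2 is None:
        return None

    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
    good = [pair[0] for pair in matches
            if len(pair) == 2 and pair[0].distance < 0.7 * pair[1].distance]
    if len(good) < 8:
        return None

    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

    H, inlier_mask = cv2.findHomography(pts1, pts2, cv2.RANSAC, 3.0)
    if H is None:
        return None

    # Several (R, t) hypotheses come back; picking the physically valid one
    # needs extra checks (e.g. reconstructed points in front of the camera).
    _, rotations, translations, _ = cv2.decomposeHomographyMat(H, K)
    return rotations, translations, inlier_mask
```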

Page 14:

Sensor Data Fusion

The image matching solution may be noisy, wrong, or may not exist.
- An EKF fuses camera data with cheap automotive sensors
- Reduces noise while vision contains drift
- The estimate is used for vehicle control

[Figure: vehicle position estimate and covariance along the direction of travel, with matched database image locations, through the phases Initialization, Lock-On, Loss of Lock, and Lock Reacquired]

[Diagram: processing pipeline. During initialization, the navigation cameras and the position-tagged database feed Global Localization, then Position Refinement, then EKF Fusion. After lock-on, global position information feeds Position Refinement and EKF Fusion directly.]
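A toy EKF in the spirit of this slide, assuming a planar state [x, y, heading], prediction from wheel speed and yaw rate, and an occasional absolute (x, y) fix from the vision pipeline; the noise values are illustrative and this is not the NavLab 11 filter:

```python
import numpy as np


class PlanarEKF:
    """Toy EKF over [x, y, heading]: predict with speed/yaw-rate, correct with
    an absolute position fix from the vision pipeline when one is available."""

    def __init__(self):
        self.x = np.zeros(3)                  # x, y, heading
        self.P = np.eye(3)                    # state covariance
        self.Q = np.diag([0.05, 0.05, 0.01])  # process noise (illustrative)
        self.R = np.diag([0.5, 0.5])          # vision fix noise (illustrative)

    def predict(self, speed, yaw_rate, dt):
        x, y, th = self.x
        self.x = np.array([x + speed * dt * np.cos(th),
                           y + speed * dt * np.sin(th),
                           th + yaw_rate * dt])
        F = np.array([[1, 0, -speed * dt * np.sin(th)],
                      [0, 1,  speed * dt * np.cos(th)],
                      [0, 0, 1]])
        self.P = F @ self.P @ F.T + self.Q

    def correct(self, vision_xy):
        """vision_xy may be None when image matching failed; then we coast."""
        if vision_xy is None:
            return
        H = np.array([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])
        z = np.asarray(vision_xy, dtype=float)
        innovation = z - H @ self.x
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ innovation
        self.P = (np.eye(3) - K @ H) @ self.P
```

Passing None to correct() whenever matching fails is one simple way to mirror the lock-on / loss-of-lock behavior sketched in the figure above.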

Page 15:

Sensor Data Fusion

Page 16:

Vehicle Platform

NavLab 11: 2000 Jeep Wrangler
- Throttle, brake, and steering actuators
- Crossbow IMU, KVH fiber-optic yaw gyro, odometry

Computing:
- 5x Intel Core i7 M 620 (2 cores @ 2.67 GHz, 8 GB RAM): command & control, vehicle state, obstacle detection, etc.
- 1x Intel Core i7-2600K (4 cores @ 3.4 GHz, 16 GB RAM), Nvidia GeForce GTX 580 (Fermi): structure-from-motion localization

[Figure: vehicle photo labeling the navigation camera, panorama camera, and collision warning LIDAR]

Page 17:

References

Probabilistic Algorithms and the Interactive Museum Tour-Guide Robot MINERVA. S. Thrun, M. Beetz, M. Bennewitz, W. Burgard, A.B. Cremers, F. Dellaert, D. Fox, D. Haehnel, C. Rosenberg, N. Roy, J. Schulte, D. Schulz. International Journal of Robotics Research, 2000.

VisualSFM: A Visual Structure from Motion System. Changchang Wu. http://www.cs.washington.edu/homes/ccwu/vsfm/

Real-Time Topometric Localization. Hernan Badino, Daniel Huber, Takeo Kanade. International Conference on Robotics and Automation, May 2012.

Semi-Autonomous Virtual Valet Parking. Arne Suppe, Luis Navarro-Serment, Aaron Steinfeld. AutomotiveUI, 2010.