Page 1: VisHap:

VisHap:

Guangqi Ye, Jason J. Corso, Gregory D. Hager, Allison M. Okamura

Presented By: Adelle C. Knight

Augmented Reality Combining Haptics and Vision

Page 2: VisHap:

Agenda

• Introduction
• Design & Implementation
  – Vision Subsystem
  – World Subsystem
  – Haptic Subsystem
• Experimental Results
• Future Work
• Conclusions

Page 3: VisHap:

Design & Implementation

Page 4: VisHap:

Design & Implementation

• Pentium III PC
• Linux OS
• SRI Small Vision System (SVS)
• SVS with STH-MDCS stereo head
• PHANToM Premium 1.0A (SensAble Technologies)

Page 5: VisHap:

Vision Subsystem

Page 6: VisHap:

Vision Subsystem

• Purpose:
  – track the user’s finger
  – provide 3D information & video to the world subsystem
• Appearance-based Hand Segmentation
• Fingertip Detection and Tracking
• VisHap Implementation:
  – Assume the user is interacting using a single finger
  – Perform finger tracking on the left color image
  – Compute the 3D position of the finger in the coordinate system of the left camera

Page 7: VisHap:

Vision Subsystem: Appearance-based Hand Segmentation

• Basic idea
  – split the image into small tiles & build a hue histogram for each tile
• Start images
  – on-line learning procedure
• Future images
  – build histograms and carry out pairwise histogram comparison with the background model
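The tile-and-compare scheme above could be sketched as follows (histogram intersection as the pairwise comparison and the 0.5 threshold are illustrative assumptions; the slides only fix the 5 × 5 patch size and 8-bin histograms, and hue is assumed normalized to [0, 1]):

```python
import numpy as np

def tile_hue_histograms(hue, tile=5, bins=8):
    """Split a hue image into tile x tile patches and build a
    normalized hue histogram for each patch."""
    h, w = hue.shape
    hists = {}
    for y in range(0, h - tile + 1, tile):
        for x in range(0, w - tile + 1, tile):
            patch = hue[y:y + tile, x:x + tile]
            hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
            hists[(y, x)] = hist / hist.sum()
    return hists

def foreground_tiles(hue, background_hists, thresh=0.5, tile=5, bins=8):
    """Mark a tile as foreground when its hue histogram no longer
    matches the learned background model (histogram intersection
    below thresh)."""
    fg = []
    for pos, hist in tile_hue_histograms(hue, tile, bins).items():
        overlap = np.minimum(hist, background_hists[pos]).sum()
        if overlap < thresh:
            fg.append(pos)
    return fg
```

The background model would be built once from the start images, then each new frame's tiles are compared against it.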

Page 8: VisHap:

Vision Subsystem: Appearance-based Hand Segmentation

• Colour appearance model of human skin

• Collect training data

• Convert pixels from RGB to HSV colour space

• Learn single Gaussian model of hue distribution

• Perform check on foreground pixels (filter out non-skin points)

• Remove noise with median filter operation
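The skin check above could be sketched as a single Gaussian on hue (the ±k·σ acceptance band and normalized hue range are illustrative assumptions; the slides do not give the threshold):

```python
import numpy as np

def learn_skin_model(train_hues):
    """Fit a single Gaussian to the hue values of labeled skin
    pixels (hue assumed normalized to [0, 1])."""
    return float(np.mean(train_hues)), float(np.std(train_hues))

def skin_mask(hue, model, k=2.5):
    """Keep foreground pixels whose hue lies within k standard
    deviations of the learned skin mean; other points are
    filtered out as non-skin."""
    mu, sigma = model
    return np.abs(hue - mu) <= k * sigma
```

A median filter would then be applied to the resulting mask to remove noise, as the slide notes.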

[Figure: background image, foreground image, segmentation result]

Page 9: VisHap:

Vision Subsystem: Fingertip Detection & Tracking

• Detect finger by exploiting geometrical property

• Use cylinder with hemispherical cap to approximate shape of finger

• Radius of sphere corresponding to fingertip (r) is approximately proportional to reciprocal of depth of fingertip with respect to the camera (z)

• r = K/z

• Series of criteria checked on candidate fingertips to filter out false fingertips
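One of the filtering criteria above — the r = K/z relation — could be checked like this (the values of K and the tolerance are illustrative; the slides do not specify them):

```python
def plausible_fingertip(r_pixels, z_mm, K=1500.0, tol=0.3):
    """Check whether a candidate's apparent radius r_pixels matches
    the radius r = K / z predicted from its depth z_mm with respect
    to the camera. One of several criteria used to reject false
    fingertip candidates."""
    expected = K / z_mm
    return abs(r_pixels - expected) <= tol * expected
```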

Page 10: VisHap:

Vision Subsystem: Fingertip Detection & Tracking

• Algorithm outputs multiple candidates around true location
• Select candidate with highest score to be the fingertip
• Kalman Filter to predict position of fingertip (real time tracking)
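The Kalman prediction step above could be sketched as a constant-velocity filter in 2D (the state layout, frame rate, and noise levels are illustrative assumptions; the slides do not give the filter parameters):

```python
import numpy as np

def kalman_predict_update(x, P, z, dt=1 / 30, q=1e-2, r=1.0):
    """One constant-velocity Kalman step for a tracked fingertip:
    state x = [px, py, vx, vy], measurement z = [px, py]."""
    F = np.eye(4); F[0, 2] = F[1, 3] = dt      # constant-velocity model
    H = np.zeros((2, 4)); H[0, 0] = H[1, 1] = 1.0
    Q = q * np.eye(4); R = r * np.eye(2)
    # predict the fingertip position from the previous state
    x = F @ x
    P = F @ P @ F.T + Q
    # update with the detected fingertip position
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P
```

The predicted position can seed the detector's search window in the next frame, which is what makes real-time tracking feasible.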

Page 11: VisHap:

World Subsystem

Page 12: VisHap:

World Subsystem

• Purpose:
  – Perform 3D vision/haptics registration
  – Scene rendering
  – Notify haptic device about imminent interaction
• System calibration (SVS and PHANToM 1.0)
  – Move haptic device around in field of view of camera
  – Record more than 3 pairs of coordinates in camera and haptic frames
  – Calculate optimal absolute orientation solution
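The absolute-orientation step above could be sketched with the SVD (Kabsch) form of the least-squares rigid-transform problem (the slides only say an optimal solution is computed; this particular solver is an assumption):

```python
import numpy as np

def absolute_orientation(cam_pts, hap_pts):
    """Least-squares rigid transform (R, t) mapping camera-frame
    points onto haptic-frame points. Needs at least 3 non-collinear
    point pairs, as collected by moving the device in view."""
    A = np.asarray(cam_pts, float)
    B = np.asarray(hap_pts, float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    # SVD of the cross-covariance of the centered point sets
    U, _, Vt = np.linalg.svd((A - ca).T @ (B - cb))
    # guard against a reflection solution
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cb - R @ ca
    return R, t
```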

Page 13: VisHap:

World Subsystem

• Implement Interaction Properties
  – Database of various interaction modes and object surface properties

• Example:
  – Interaction with virtual wall
    • Interaction property: slide along
  – Interaction with button
    • Interaction property: click
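The property database above could be sketched as a simple lookup (the record fields and values are hypothetical; the slides do not describe the actual schema):

```python
# Hypothetical property records keyed by scene object.
INTERACTION_DB = {
    "virtual_wall": {"mode": "slide_along", "stiffness": 800.0},
    "button":       {"mode": "click",       "stiffness": 300.0},
}

def interaction_mode(object_name):
    """Look up the interaction mode the world subsystem should
    report to the haptic subsystem for a given scene object."""
    return INTERACTION_DB[object_name]["mode"]
```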

Page 14: VisHap:

Haptic Subsystem

Page 15: VisHap:

Haptic Subsystem

• Purpose:
  – Simulate the touching experience
• Present suitable force feedback to the fingertip

• Control scheme:– Control Law

– Gravity Compensation for PHANToM

– Interaction with Objects

Page 16: VisHap:

Haptic Subsystem: Control Law for Haptic Device

• Control Law based on error space to guide haptic device to target position
• Low pass filter to achieve smooth control and remove high frequency noise

[Equations: control law in error space, and its equivalent form in time space]
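The idea above — drive the error to zero while low-pass filtering the command — could be sketched as a minimal proportional step (the gain, filter constant, and discrete form are illustrative; the slides' actual control law is not reproduced here):

```python
def control_step(pos, target, force_prev, kp=0.8, alpha=0.2):
    """One step of a proportional control law in error space, with a
    first-order low-pass filter on the commanded force to remove
    high-frequency noise (gains are illustrative)."""
    raw = kp * (target - pos)                       # error-space control law
    force = alpha * raw + (1 - alpha) * force_prev  # low-pass filter
    return force
```

Repeated at the servo rate, the filtered command converges smoothly to the raw proportional command.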

Page 17: VisHap:

Haptic Subsystem: Gravity Compensation for PHANToM

• Motor torques needed to counteract the wrench applied to the manipulator
• Total torque caused by gravity of all parts of device
• Smooth and stable trajectory tracking
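The gravity-torque computation above could be illustrated on a simplified planar 2-link arm with point masses at the link ends (a stand-in for the PHANToM's actual link model; all masses, lengths, and the joint convention below are illustrative):

```python
import math

def gravity_torques(q1, q2, m1=0.5, m2=0.3, l1=0.2, l2=0.15, g=9.81):
    """Joint torques counteracting gravity for a planar 2-link arm,
    with joint angles measured from the horizontal and point masses
    m1, m2 at the ends of links of length l1, l2."""
    # torque on joint 2 from the link-2 mass
    tau2 = m2 * g * l2 * math.cos(q1 + q2)
    # torque on joint 1 from both masses
    tau1 = (m1 + m2) * l1 * g * math.cos(q1) + tau2
    return tau1, tau2
```

Feeding these torques to the motors cancels the device's weight, which is what allows smooth, stable trajectory tracking.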

Page 18: VisHap:

Haptic Subsystem: Interaction with Objects

• Simulate interaction forces by adjusting force gain according to object properties and interaction mode.

• A frame transform converts the object’s gain matrix to that of the haptic device

• VisHap Implementation:
  – Defined the object-frame gain matrix OΛgain as a diagonal matrix with λx, λy, λz as its diagonal elements
  – Z-axis of the object’s frame is along the normal of the object’s surface

Page 19: VisHap:

Haptic Subsystem: Interaction with Objects

• Example:
  – Object: button or keyboard key
  – Destination: center of button’s surface at initial contact
  – Enter object: user pushes button down
    • increase λz to proper value to simulate button resistance
  – Adjust destination point of haptic device to surface of bottom board of the button & increase λz to larger value

[Figure: relationship of force gain λz and depth d of the fingertip under the surface of a button]
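The gain schedule above could be sketched as a piecewise function of the depth d below the button surface (the breakpoint and gain values are illustrative; the actual curve is in the figure):

```python
def button_gain(d, d_bottom=5.0, lam_travel=0.3, lam_stop=2.0):
    """Force gain lambda_z as a function of fingertip depth d (mm)
    under the button surface: a moderate gain simulates button
    resistance during travel, then a much larger gain once the
    bottom board is reached."""
    if d <= 0:
        return 0.0           # not yet in contact
    if d < d_bottom:
        return lam_travel    # button resistance during travel
    return lam_stop          # bottom board: stiff stop
```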

Page 20: VisHap:

Experimental Results

• Foreground segmentation: used the first 10 frames to learn an appearance model of the background
• Hue histograms of 8 bins for each 5 x 5 image patch
• Test algorithm: record image pairs of background and foreground
• Evaluate scheme: compare segmentation result and ground truth classification image
• Tested 26 pairs of images:
  – Average correct ratio: 98.16%
  – Average false positive ratio: 1.55%
  – False negative ratio: 0.29%
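The evaluation above — comparing a segmentation against a ground-truth classification image — amounts to per-pixel ratios that could be computed like this (the per-pixel definitions are a reasonable reading of the slides, not stated by them):

```python
import numpy as np

def segmentation_scores(pred, truth):
    """Compare a binary segmentation with its ground-truth image,
    returning (correct, false_positive, false_negative) ratios
    over all pixels."""
    pred = np.asarray(pred, bool)
    truth = np.asarray(truth, bool)
    n = pred.size
    correct = float(np.mean(pred == truth))
    fp = np.count_nonzero(pred & ~truth) / n   # background marked foreground
    fn = np.count_nonzero(~pred & truth) / n   # foreground marked background
    return correct, fp, fn
```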

Page 21: VisHap:

Experimental Results

• Virtual Environment
  – virtual plane in space
• Interactions
  – user moves finger to interact with plane
  – user moves finger to press fixed button
• VisHap is capable of automatically switching interaction objects according to the scene configuration and current fingertip position.

[Figure: haptic force feedback along the normal of the object surface, and the distance of the fingertip to the object]
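The automatic switching described above could be approximated by picking the scene object nearest the current fingertip (a minimal stand-in; the object-record format and the proximity rule are assumptions, not VisHap's actual policy):

```python
def active_object(finger_pos, objects):
    """Pick the interaction object closest to the current fingertip
    position from a list of scene objects, each a dict with a 'pos'
    coordinate tuple."""
    def dist2(obj):
        return sum((a - b) ** 2 for a, b in zip(finger_pos, obj["pos"]))
    return min(objects, key=dist2)
```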

Page 22: VisHap:

Future Work

• Head mounted display (HMD)
  – to achieve higher immersiveness and fidelity
• Extend virtual environment
  – incorporate richer sets of objects and interaction modes

Page 23: VisHap:

Conclusions

• Generate “complete” haptic experience

• Modular framework:
  – computer vision
  – haptic device
  – augmented environment model

• Experimental results justify design

• Experimental results show flexibility and extensibility of framework