Practical Data Visualization and Virtual Reality
Virtual Reality: VR Interaction and Navigation
Karljohan Lundin Palmerius
Synopsis
● Tracking
● Navigation
● Interaction
● Mixing real and virtual (AR)
Tracking
● Head position (head tracking)
– Parallax effects
– Correct 3D sound
– Natural navigation
● Moving around objects
● HMD view
● Interaction
– 3D mouse
– Wand devices
Degrees of Freedom
● Tracked dimensionality
– 3 DoF – position x, y, z
– 3 DoF – orientation (”yaw, pitch, roll”)
– 6 DoF – position and orientation
– More variations
● 2 DoF, 5 DoF, 12 DoF, etc.
● From lower to higher DoF
– Two 3 DoF trackers combine into 5 DoF
– Three 3 DoF trackers combine into 6 DoF
Dead Reckoning
● Absolute tracking
– Grounded to the room
– Calibrated to world coordinates
● Dead reckoning tracking
– Accelerometers and gyros
– Measure the acceleration and the change of orientation
– Integrate to get velocity, and again to get position
– Accumulates noise and integration errors
● Hybrid tracking
– Combining both to get the best of two worlds
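The integration chain above (a → v → x) can be illustrated with a minimal sketch: even small zero-mean sensor noise, once integrated twice, produces a position estimate that drifts away from the truth. The sampling rate and noise level below are illustrative assumptions, not values from any particular tracker.

```python
import random

def dead_reckoning(accel_samples, dt):
    """Integrate acceleration twice: a -> v -> x (1D for clarity)."""
    v, x = 0.0, 0.0
    for a in accel_samples:
        v += a * dt   # first integration: velocity
        x += v * dt   # second integration: position
    return x

random.seed(0)
dt = 0.01                        # hypothetical 100 Hz sensor
n = 1000                         # 10 seconds, tracker actually at rest
true_accel = [0.0] * n
noisy_accel = [a + random.gauss(0.0, 0.05) for a in true_accel]

print(dead_reckoning(true_accel, dt))    # 0.0: a perfect sensor stays put
print(dead_reckoning(noisy_accel, dt))   # nonzero: noise accumulates into drift
```

The drift grows with time, which is why pure accelerometer tracking must be corrected by an absolute reference.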
InterSense
● Hybrid tracking
– Ultrasound
● Slow but absolute
● Speakers and receivers
● Distances by time-of-flight
● Position by triangulation
– Accelerometers
● Fast but dead reckoning
● Accumulates errors over time
– Integration error (a → v → x)
– Noise and inaccuracy
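The ultrasound part can be sketched as trilateration: each speaker gives a distance from time-of-flight, and subtracting the resulting sphere equations pairwise yields a linear system for the position. A 2D sketch with three hypothetical beacon positions (real systems work in 3D and use more beacons):

```python
SPEED_OF_SOUND = 343.0  # m/s in air, approximate

def trilaterate_2d(beacons, tofs):
    """Position from time-of-flight to three fixed speakers (2D sketch).

    |x - p_i|^2 = d_i^2; subtracting the first equation from the other
    two gives a 2x2 linear system, solved here with Cramer's rule.
    """
    d = [SPEED_OF_SOUND * t for t in tofs]
    (x1, y1), (x2, y2), (x3, y3) = beacons
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d[0]**2 - d[1]**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d[0]**2 - d[2]**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1      # zero if the beacons are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

This absolute fix is what corrects the accelerometer drift in a hybrid tracker.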
Electromagnetic Trackers
● Magnetic field generator
– Fast switching between magnetic fields
● Distributed detectors
– Detect the local shape of the field and compare with the generated fields
Camera-based Tracking
● Colour or IR cameras
● Detect features
– Reflector balls – two cameras: 2D line into 3D position
– Face – search for skin colour and distribution
– Markers – 6 DoF from perspective maths
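The "two cameras: 2D line into 3D position" step above is ray triangulation: each camera back-projects its 2D detection into a 3D ray, and since noisy rays rarely intersect exactly, the feature is taken as the midpoint of the shortest segment between them. A minimal sketch with hand-picked rays:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def triangulate(o1, d1, o2, d2):
    """Midpoint of closest approach between two rays o + s*d.

    Minimising |(o1 + s*d1) - (o2 + t*d2)| gives a 2x2 linear
    system in the ray parameters s and t.
    """
    w = tuple(a - b for a, b in zip(o1, o2))
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b            # zero if the rays are parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    p1 = tuple(o + s * v for o, v in zip(o1, d1))  # closest point on ray 1
    p2 = tuple(o + t * v for o, v in zip(o2, d2))  # closest point on ray 2
    return tuple((x + y) / 2 for x, y in zip(p1, p2))
```

With two cameras at hypothetical positions both seeing a ball at (0.5, 0.5, 1.0), the midpoint recovers that position exactly.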
Characteristics
● Scalable
– Cheap or expensive cameras
– More cameras for better tracking
– More or less complicated algorithms
● Issues
– Frequency and latency (CPU expensive)
– Precision
Depth Camera Tracking
● Image-based tracking
– Like markerless tracking
– Depth data makes things easier
– Several approaches
● Active with structured light
● Active with time-of-flight
● Passive with image disparity
Depth Camera Characteristics
● Accessible
– Cheap
– Available algorithms and implementations
– Need little tweaking
● Issues
– Structured light cameras are slow
– Often low resolution because of CPU strain
– The active approaches are sensitive to bright (IR) light
– Active cameras interfere with each other
Consumer Grade Tracking
● Wii Remote
● PlayStation Move
● PrimeSense
– Depth camera (e.g. Kinect)
● Leap Motion
● Myo
Mouse
● Magellan SpaceMouse
– Controls velocity, not position
● Interaction
– Mouse pointer
– Control object transform
– Navigate
● 6 DoF velocity control is hard to use
– Reduce DoF or use the major axis
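One common way to make 6 DoF velocity control manageable, hinted at by the "major axis" point above, is to keep only the dominant input component each frame. A minimal sketch (real drivers typically add a dead zone and smoothing as well):

```python
def dominant_axis(input6):
    """Major-axis filter for a 6 DoF velocity device.

    input6 = (tx, ty, tz, rx, ry, rz). Only the component with the
    largest magnitude is kept; the rest are zeroed, so the user
    affects one degree of freedom at a time.
    """
    i = max(range(6), key=lambda k: abs(input6[k]))
    return tuple(v if k == i else 0.0 for k, v in enumerate(input6))
```

With this filter a clumsy combined twist-and-push becomes a clean single-axis motion.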
Wand/Stylus
● Extension of the 2D mouse
– Real 3D position control
– Co-located with the VR world
● 3–6 DoF in usable form
– Hybrid tracking
– Mechanical tracking
● Interaction
– Pointing device
– Touch, pick, grab
– Buttons
Implicit Navigation
● Immersive display
– Should give us 'real world' interaction
– Walk to the object, turn around, etc.
● Dependent on display and tracking technology
– CAVEs and HMDs vs workbench and workstation
– Available space
– Locomotion interface
[Figure: tracked volume in front of the screen]
Explicit Navigation
● Select where to fly, and how
● Typical metaphors
– Fly/walk direction
– Goal driven navigation
– Object driven navigation
Fly/walk Direction
● Gaze direction
– Natural feeling
– Can't watch the scene go by
– Rare in VR, common in FPS games
● Pointing mode
– Point towards the desired position
– Requires 5 DoF device tracking
● Crosshair mode
– The eye–hand line defines the direction
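In crosshair mode the fly direction is simply the normalised vector from the tracked eye through the tracked hand; a minimal sketch (point coordinates are assumed to be in the same frame of reference):

```python
def crosshair_direction(eye, hand):
    """Fly direction along the eye-hand line (crosshair mode)."""
    d = tuple(h - e for h, e in zip(hand, eye))
    mag = sum(c * c for c in d) ** 0.5   # length of the eye-hand vector
    return tuple(c / mag for c in d)     # unit vector from eye through hand
```

The hand acts as a crosshair in front of the eye, so the user steers by aiming through it.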
Hand Controlled Speed
● Distance
– Hand motion relative to an initial point
– A dead zone is nice
● Used with crosshair control
– Intuitive
– Natural mapping
– Limited range
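The distance-based speed control above can be sketched as a dead-zone mapping: inside the dead zone the speed is zero, so small hand tremors do not move the viewpoint; outside, speed grows with the remaining distance. The dead_zone and gain values are illustrative, to be tuned per device:

```python
def hand_speed(offset, dead_zone=0.05, gain=2.0):
    """Map hand displacement from the start point to a flying velocity.

    offset: hand position minus the position where the gesture started.
    Returns a velocity vector along the offset direction.
    """
    mag = sum(c * c for c in offset) ** 0.5
    if mag <= dead_zone:
        return (0.0, 0.0, 0.0)           # within the dead zone: stand still
    scale = gain * (mag - dead_zone) / mag
    return tuple(c * scale for c in offset)
```

The subtraction of dead_zone makes the speed start from zero at the zone border, avoiding a jarring jump.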
Goal Driven Navigation
● Fly-to
– Click on pre-defined "bookmarks"
– Magic telescope
● Automatic navigation
– Pre-selected "good" path
● Virtual map
– Select the position you want to be at
– A type of widget
Object Driven Navigation
● Lifts, stairways, teleporters
– Objects that transport you
– Instant or soft motion
● Exotic variations
– Attractors, repellers
● Need to know where people want to go
– Virtual architecture
Situation of Interaction
[Figure: tracked volume and screen defining the situation of interaction]
Selection Calculations
● Image space
– Don't use GL-based "pick"
● Not one single image to work in
● Generally not 2D
– Not always eye oriented
● VR world coordinates
– Linear algebra
● Closest point
● Intersection
– Explicit handling of coordinates
● Check your frame of reference!
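The closest/intersection computation in VR world coordinates can be sketched as a ray-sphere pick: solve the intersection quadratic for each candidate object (here represented by sphere proxies, an illustrative choice) and keep the nearest hit in front of the pointer. Ray and spheres must be expressed in the same frame of reference:

```python
def pick_sphere(ray_o, ray_d, spheres):
    """Select the closest sphere hit by a pointer ray.

    spheres: list of (centre, radius) in the SAME world frame as the ray;
    apply the navigation transform first if the scene has been moved.
    Returns (index, distance) of the nearest hit, or None if none is hit.
    """
    best = None
    for i, (centre, radius) in enumerate(spheres):
        oc = tuple(o - c for o, c in zip(ray_o, centre))
        a = sum(d * d for d in ray_d)
        b = 2.0 * sum(d * o for d, o in zip(ray_d, oc))
        c = sum(o * o for o in oc) - radius * radius
        disc = b * b - 4 * a * c
        if disc < 0:
            continue                        # ray misses this sphere
        t = (-b - disc ** 0.5) / (2 * a)    # nearest of the two roots
        if t >= 0 and (best is None or t < best[1]):
            best = (i, t)
    return best
```

Picking the smallest positive t resolves occlusion: the frontmost object along the ray wins.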
Frames of Reference
● VR display system
– Graphics origin frame
– Eye position
● Navigational system
– Graphics origin offset
– Move objects relative to the display
– Move the display relative to objects
[Figure: scene graph with the navigation transform T_NAV between the UI and the Scene]
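The two bullet points, moving objects relative to the display versus moving the display relative to the objects, are two views of the same navigation transform: applying T_NAV to the scene looks identical to applying its inverse to the viewer. A minimal sketch with pure translations (the transform values are illustrative):

```python
def apply(T, p):
    """Apply a 4x4 row-major homogeneous transform to a 3D point."""
    ph = p + (1.0,)
    return tuple(sum(T[r][c] * ph[c] for c in range(4)) for r in range(3))

def invert_translation(T):
    """Inverse of a pure translation transform (enough for this sketch)."""
    inv = [row[:] for row in T]
    for r in range(3):
        inv[r][3] = -inv[r][3]
    return inv

# T_nav pulls the whole scene 2 m towards the viewer (along -z).
T_nav = [[1.0, 0.0, 0.0, 0.0],
         [0.0, 1.0, 0.0, 0.0],
         [0.0, 0.0, 1.0, -2.0],
         [0.0, 0.0, 0.0, 1.0]]

scene_point = (0.0, 0.0, 0.0)
eye = (0.0, 0.0, 1.0)

# Option 1: move the scene, keep the eye.
rel_a = tuple(s - e for s, e in zip(apply(T_nav, scene_point), eye))
# Option 2: keep the scene, move the eye by the inverse transform.
rel_b = tuple(s - e for s, e in zip(scene_point,
                                    apply(invert_translation(T_nav), eye)))
assert rel_a == rel_b   # the viewer cannot tell the difference
```

This is why a single transform node in the scene graph suffices for navigation.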
Selection
● Close up
– Touch metaphor
● Walk to the object, reach out and touch
● Impossible if the object is behind the screen
– Encircle with a line
– Select with pointer/sphere
● Far away
– Needs extended reach
Extending Your Reach
● Pointer metaphor
– Point at objects to select
● 5 DoF tracker required
– Issues in densely populated scenes
● Select the closest
● Occlusion
Extending Your Reach
● Push workspace
– Move the workspace when the device is at its border
● Visual indication
● Haptic indication
– Good for small workspaces
– No co-located interaction
Extending Your Reach
● Mouse pointer on a stick
– Extendable stick
– The tip is the active pointer
– Control the distance
● Automatic, joystick, gestures
Extending Your Reach
● Typical issues
– No co-location
● Less intuitive
● Less natural and effective
● Possibly less precision
– Densely populated space
● Find the right object
● Hit the right object (occlusion)
– Transformed workspace
● Never, ever rotate the workspace
● Help the user track the changes
Reality/Virtuality Continuum
● Mix between the real and the virtual world
– "Milgram's continuum"
[Figure: continuum from Real Environment to Virtual Environment, with Augmented Reality and Augmented Virtuality in between; everything between the two extremes is Mixed Reality]
Reality/Virtuality Continuum
● Mixed Reality
– Anything with both real and virtual components
● Augmented Virtuality
– Virtual reality augmented with a live video feed
● Augmented Reality (AR)
– Mostly real environment
– Augmented with computer graphics and VR
Augmented Reality
● Three key elements (Azuma)
– Combination of the real and the virtual
– Real-time updates at interactive speed
– Co-registration of real and virtual objects
Applications
● Annotation
– Guided tours
– Sports
– Manufacturing and maintenance
● Boeing: length and bundling of cables
● BMW: maintenance of the engine
Applications
● Augmented vision
– "X-ray vision" in medicine
– Building plans for smoke divers
UNC Chapel Hill
Applications
● Modifying reality
– Architectural or design modifications
– Virtual hairdressing / makeup
Applications
● Games
– Augmented conventional games
– Monsters in reality
– Marking out real people as friends and foes
Special Considerations
● Graphics
– Need for fast updates
– Mixing of real and virtual visuals
● Tracking
– High demands on precision, accuracy and latency
– Need for tracking of the world
Virtual/Real World Interaction
● Lighting, lamps, shadows, occlusion
● Collision, co-location, manipulation