Intel® RealSense™ Technology | Intel® Software
Kevin Arthur, Senior User Experience Researcher, Intel Perceptual Computing Group
Augmented Reality with the Intel® RealSense™ SDK and R200 Camera
User Experience Best Practices
Presented at Augmented World Expo June 2015. Annotations added.
This is excerpted from a joint presentation with Meghana Rao, who discussed developer best practices.
Outline
Part 1, Kevin
• Overview of R200 camera and tablet augmented reality use cases
• User experience guidelines highlights
Part 2, Meghana
• SDK overview
• Sample code and demos
Part 2 omitted here.
New R200 Depth Camera For Tablets, Peripheral Dev Kit Available Now
The R200 camera is an active-stereo depth camera that has longer range than the earlier F200 camera. The module consists of an RGB camera, two infrared cameras, and an infrared laser projector. The R200 camera is being integrated into tablets for “world-facing” uses.
R200 View Volume and SDK Features
R200 with Intel® RealSense™ SDK
• Scene Perception Module, enables scene-aware AR
• Camera tracking and localization
• Mesh reconstruction
• Other Modules
• 3D capture
• Depth-enhanced photo and video
• Measurement
• Face detection and tracking
• Speech (Windows SDK only)
Use Cases for R200 Mixed and Augmented Reality
Gaming and Play Education and Training Visualization
These are some example applications of tablet augmented reality using the R200 camera. All of them overlay digital content onto a camera view of a physical scene. The camera and SDK provide real-time scene reconstruction and tracking.
Video – ToyZ Game
Shows real-time scene perception for collision and occlusion (no pre-scan)
Try it at the Intel booth
By Shachar Oz, Omek Studio at Intel
This is a screenshot of a “magic-window” style game. The user sees a view of the real environment, with virtual content overlaid in place. They can drive a car, robot, or helicopter around the scene. Collisions, occlusion, and shadows are simulated between the real and virtual objects in real-time.
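One way to picture the occlusion behavior described above is a per-pixel depth test: a virtual fragment is drawn only when it is closer to the camera than the real surface at that pixel. This is a minimal plain-Python sketch of the idea, not the game's actual engine integration (function name and the no-data convention are illustrative):

```python
def occlusion_mask(real_depth, virtual_depth, no_data=0.0):
    """Per-pixel visibility of virtual content against real scene depth.

    real_depth: depth map from the camera (meters); 0.0 means no reading.
    virtual_depth: depth of the rendered virtual fragment at each pixel,
                   or None where no virtual content is drawn.
    Returns a mask: True where the virtual fragment should be visible.
    """
    mask = []
    for row_r, row_v in zip(real_depth, virtual_depth):
        mask_row = []
        for r, v in zip(row_r, row_v):
            if v is None:            # no virtual content at this pixel
                mask_row.append(False)
            elif r == no_data:       # no depth reading: draw virtual anyway
                mask_row.append(True)
            else:                    # visible only if closer than real surface
                mask_row.append(v < r)
        mask.append(mask_row)
    return mask

# A virtual car 1.2 m away is hidden behind a real box at 0.8 m,
# but visible in front of a wall at 2.0 m:
real = [[0.8, 2.0]]
virt = [[1.2, 1.2]]
print(occlusion_mask(real, virt))  # [[False, True]]
```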
Video – Procedural Island
Shows scan as part of “capture and play”
Illustrates procedural shaders and set dressing
By Eddy Ortega, Garrett Stevens, Perceptual Computing at Intel
This video illustrates how scanned scenes and objects can be transformed from just meshes into procedurally generated environments (grass, water, trees, and a statue are all placed around the scene based on rules).
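Rule-based placement of this kind can be sketched as a per-cell classifier over the scanned terrain. The thresholds and categories below are illustrative assumptions, not the rules used in the demo:

```python
def dress_cell(height, slope, water_level=0.0, slope_max=0.2):
    """Pick set dressing for one terrain cell from simple rules.

    height: cell height in meters relative to an assumed water level.
    slope: local steepness (rise over run). All cutoffs are illustrative.
    """
    if height < water_level:
        return "water"             # below water level: flood it
    if slope > slope_max:
        return "rock"              # too steep to plant anything
    if height < water_level + 0.5:
        return "grass"             # low, flat ground gets grass
    return "tree"                  # higher flat ground gets trees

# Classify a tiny scanned strip of (height, slope) cells:
cells = [(-0.1, 0.0), (0.2, 0.05), (0.2, 0.4), (0.9, 0.1)]
print([dress_cell(h, s) for h, s in cells])
# ['water', 'grass', 'rock', 'tree']
```

In practice the rules would run over the reconstructed mesh rather than a heightmap, but the "classify, then place assets" structure is the same.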
UX Guidelines for R200 Tablet AR
Designing real, usable apps for a mass market
The following are a few highlights from our user experience design guidelines for R200. These are insights from user studies, in which we’ve observed regular people using tablet AR apps and prototypes.
Our emphasis is on how to design applications that are not just novelty demos but are compelling and usable experiences that people will continue to use over long periods.
Lesson 1: Give People a Reason to Move, or They Won’t
Tablet as window into a virtual space vs. tablet as fixed screen
Address with motion hints
The basic “magic window” concept, in which the computer-generated view changes dynamically based on the position of the camera/tablet, is completely foreign to most regular people (in contrast to more technical people). In fact, people typically won’t move the tablet around at all unless they’re guided to do so.
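A motion hint can be driven by a simple stillness detector: if the tracked camera pose barely changes for a few seconds, show the hint. This is a sketch with illustrative thresholds (the class name and values are assumptions, not SDK API):

```python
import math

class MotionHint:
    """Suggest showing a 'move the tablet' hint when the camera pose
    stays nearly still for `timeout` seconds. Thresholds are illustrative."""

    def __init__(self, timeout=5.0, min_move=0.05):
        self.timeout = timeout      # seconds of stillness before hinting
        self.min_move = min_move    # meters of translation that counts as moving
        self.last_pos = None
        self.still_since = 0.0

    def update(self, t, pos):
        """t: timestamp in seconds; pos: (x, y, z) camera position in meters.
        Returns True when the hint should be shown."""
        if self.last_pos is None:
            self.last_pos, self.still_since = pos, t
            return False
        if math.dist(pos, self.last_pos) > self.min_move:
            self.last_pos, self.still_since = pos, t   # user moved: reset timer
            return False
        return t - self.still_since >= self.timeout

hint = MotionHint(timeout=5.0)
hint.update(0.0, (0.0, 0.0, 0.0))
print(hint.update(6.0, (0.01, 0.0, 0.0)))  # True: barely moved for 6 s
```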
Motion Hints – Explicit
Registered with scene
Or registered with window
Explicit instructions and feedback are the most direct way to design for this.
Motion Hints – Implicit, Part of Experience
Lead the user with content
Example: “Windy Day” (Google Spotlight Stories)
If done carefully, it’s possible to teach users more indirectly about magic window experiences. In “Windy Day,” for Google devices, the hat flies off-screen and users are led to naturally move their device to follow it.
Lesson 2: But Let People Relax Too
Support both:
Active Camera Mode
• Tiring
Inactive Camera Mode
• Less tiring
At the same time, you can’t require your users to always be holding and moving the tablet or phone. It’s just too tiring for users to do this for more than a few minutes. So most apps should also support an “inactive-camera” mode with a fixed viewpoint.
Active-Camera and Inactive-Camera Modes
Active Camera Mode
• Touch interaction less comfortable, less precise
Inactive Camera Mode
• Touch interaction more comfortable, more precise
Make the main controls easy to reach with the thumbs during Active Camera mode.
During Inactive Camera mode, placing controls elsewhere is acceptable.
[Touch-zone diagrams: screen regions marked AVOID and OK for each mode]
A second reason for supporting an inactive mode is that it’s awkward for users to touch the screen precisely while holding the device (especially kids and people with small hands). Let users do non-trivial touchscreen interaction in the inactive mode.
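A layout check for this guideline can be expressed as: in active-camera mode, main controls belong in the thumb-reachable strips along the left and right edges. The zone width below is an assumed fraction of screen width, not a measured value:

```python
def control_ok(x, y, width, height, active_camera, thumb_zone=0.22):
    """Check whether a control at pixel position (x, y) is acceptable.

    In active-camera mode, controls should sit in the thumb-reachable
    strips along the left/right screen edges (thumb_zone is an assumed
    fraction of screen width). In inactive mode, anywhere is fine.
    """
    if not active_camera:
        return True
    nx = x / width                       # normalized horizontal position
    return nx <= thumb_zone or nx >= 1.0 - thumb_zone

# On a 1920x1080 landscape tablet:
print(control_ok(100, 900, 1920, 1080, active_camera=True))   # True: left thumb strip
print(control_ok(960, 540, 1920, 1080, active_camera=True))   # False: center, avoid
print(control_ok(960, 540, 1920, 1080, active_camera=False))  # True: inactive mode
```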
Two Styles of Mixed-Reality Games
Augmented Reality Capture and Play
Active Camera
Inactive Camera
A third reason for supporting an inactive mode is that sometimes it’s simply not necessary to always see a live, registered augmented-reality view. Think of the active mode as capturing the live scene. This can work well in “capture and play” games, where each level might start with capture (active) and proceed with play (inactive).
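The capture-then-play level flow described above can be modeled as a small state machine; the state and event names here are illustrative, not SDK concepts:

```python
# "Capture and play" level flow: scan with the camera active,
# then play against the captured scene with the camera inactive.
TRANSITIONS = {
    ("capture", "scan_complete"): "play",   # enough of the scene is meshed
    ("play", "new_level"): "capture",       # re-scan for the next level
    ("play", "rescan"): "capture",          # user chooses to re-capture
}

def next_state(state, event):
    """Advance the level flow; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

state = "capture"                          # active camera: user scans the scene
state = next_state(state, "scan_complete")
print(state)                               # play (inactive camera, fixed viewpoint)
```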
Lesson 3: Plan for the Scene
• Consider size of play space, and use appropriate voxel resolution
Be sure to understand the physical context of use – is your app designed for tabletop play or for whole-room visualization? This has implications for SDK parameters and for your overall design.
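Mapping the intended play space to a voxel-resolution setting might look like the sketch below. The preset names and size cutoffs are illustrative assumptions, not SDK constants; check the Scene Perception module documentation for the actual options:

```python
def pick_voxel_resolution(extent_m):
    """Choose a voxel-resolution preset from the longest dimension of the
    intended play space, in meters. Names and cutoffs are assumptions."""
    if extent_m <= 0.5:
        return "HIGH"    # object-scale scanning: finest voxels
    if extent_m <= 2.0:
        return "MEDIUM"  # tabletop play
    return "LOW"         # whole-room visualization: coarsest voxels

print(pick_voxel_resolution(0.3))  # HIGH
print(pick_voxel_resolution(1.5))  # MEDIUM
print(pick_voxel_resolution(4.0))  # LOW
```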
Game Design Considerations
• What objects does the user need?
• Level design has more unknowns
• Enhance and transform everyday objects in interesting ways
Likewise, plan for any props that users might want to have on hand for capture-and-play games, or what types of things in the scene will make for interesting geometry. Test these ideas with real users, and provide clear instructions and feedback.
Procedural Set Dressing
This is a breakdown of the steps used in procedural set dressing to transform real scenes into more interesting virtual environments for play.
Plan for the Scene
• Understand the camera limitations. Depth data is less accurate on
• Very bright areas
• Clear glass
• Black surfaces
• Give relevant feedback
• Fail gracefully, don’t prevent play
Understand the camera limitations and experiment with settings to get the best results for the environments you’re targeting.
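"Give relevant feedback, fail gracefully" can be as simple as measuring how much of the depth frame is invalid and warning (without blocking play) when coverage is poor. The threshold and message are illustrative:

```python
def depth_feedback(depth_map, invalid=0, warn_frac=0.4):
    """Warn the user when too much of the frame has no depth data
    (bright light, clear glass, black surfaces), but never block play.

    depth_map: 2D list of depth readings; `invalid` marks missing data.
    Returns a warning string, or None if the frame is usable.
    """
    pixels = [d for row in depth_map for d in row]
    bad_frac = sum(1 for d in pixels if d == invalid) / len(pixels)
    if bad_frac > warn_frac:
        return "Depth data is poor here - try a different surface or lighting."
    return None  # good enough: continue without interrupting the user

frame = [[0, 0, 812], [0, 0, 798]]   # mostly invalid readings (mm)
print(depth_feedback(frame))         # prints the warning
```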
Resources
software.intel.com/realsense
software.intel.com/articles/realsense-ux-design-guidelines
[email protected], @karthur
Please see the R200 UX design guidelines at this link for more information.
Thanks: Rachel Kennison, Eddy Ortega, Garrett Stevens, Shachar Oz, Meghana Rao.