CHI 2013 Course on Designing Augmented Reality Experiences. Taught by Mark Billinghurst and Henry Duh at the CHI 2013 Conference, April 2013.
Transcript
Page 1: CHI 2013 DARE Course

Billinghurst and Duh 1

Designing Augmented Reality Experiences Mark Billinghurst

University of Canterbury Christchurch, New Zealand

Henry B.L. Duh National University of Singapore

Singapore, Singapore

[email protected] http://chi2013.acm.org/

Copyright is held by Billinghurst & Duh CHI 2013, April 27–May 2, 2013, Paris, France. ACM 13/04

Page 2: CHI 2013 DARE Course

Billinghurst and Duh 2

Introduction

Page 3: CHI 2013 DARE Course

Billinghurst and Duh 3

Instructors

 Mark Billinghurst •  Director of HIT Lab NZ, University of Canterbury •  Degrees in Electrical Engineering, Applied Mathematics •  Research on collaborative AR, mobile AR, AR usability •  More than 250 papers in AR, VR, interface design

 Henry Duh •  Co-director Keio-NUS Joint International Research (CUTE) Center •  Degrees in Psychology, Industrial design and Engineering •  Research on interaction design and AR applications •  More than 80 papers in HCI, AR and Design

Introduction

Page 4: CHI 2013 DARE Course

Billinghurst and Duh 4

How Would You Design This?

 Put nice AR Picture here – and video

Page 5: CHI 2013 DARE Course

Billinghurst and Duh 5

Or This?

Page 6: CHI 2013 DARE Course

Billinghurst and Duh 6

What You Will Learn

  How to design effective AR experiences
  Understanding AR interaction design possibilities
  Hardware and software tools for rapid prototyping of AR applications
  Effective evaluation methods for AR applications
  Current areas of AR research that will contribute to future AR experiences
  Hands on experiences with AR applications
  Resources for your own research

Introduction

Page 7: CHI 2013 DARE Course

Billinghurst and Duh 7

Course Agenda

  Introduction [Mark]
  AR and the Interaction Design Process [Mark]
  Design Guidelines and Interaction Metaphors for AR [Mark]
  AR Development/Prototyping Tools [Mark]
  Afternoon Tea – Demos [Mark and Henry]
  AR Evaluation Methods [Henry]
  AR Design Case Studies [Henry]
  AR Research Directions [Mark]

Introduction

Page 8: CHI 2013 DARE Course

Billinghurst and Duh 8

Course Demos  AR Authoring

BuildAR, Metaio Creator

 AR Browsers •  Junaio, Layar, Wikitude

 AR Gaming •  Elite CommandAR, Transformers, etc..

 Marker Based Handheld AR •  NASA and CCDU

 Outdoor AR •  CityViewAR

 Displays •  Vuzix, Google Glass

Page 9: CHI 2013 DARE Course

Billinghurst and Duh 9

Course Motivation

 AR Needs Good Interaction Design
•  AR is increasingly popular, but ergonomics, design and social issues need to be addressed
•  There is a need for deeper understanding of how to uncover, design, build and evaluate effective AR experiences
•  AR authoring tools are making it easier than ever before to build an AR experience, but there are few design guidelines
•  Many AR applications are being developed, but there is little formal evaluation being conducted
•  AR experiences are being delivered without an understanding of the interaction design/experience design process

Introduction

Page 10: CHI 2013 DARE Course

Billinghurst and Duh 10

What is Augmented Reality?

 Defining Characteristics (Azuma 97)
•  Combines Real and Virtual Images
– Both can be seen at the same time
•  Interactive in real-time
– The virtual content can be interacted with
•  Registered in 3D
– Virtual objects appear fixed in space

Introduction

Azuma, R., A Survey of Augmented Reality, Presence, Vol. 6, No. 4, August 1997, pp. 355-385.

Page 11: CHI 2013 DARE Course

Billinghurst and Duh 11

From Science Fiction to Fact

1977 – Star Wars

2008 – CNN

Introduction

Page 12: CHI 2013 DARE Course

Billinghurst and Duh 12

AR Part of MR Continuum

Mixed Reality

Reality-Virtuality (RV) Continuum: Real Environment → Augmented Reality (AR) → Augmented Virtuality (AV) → Virtual Environment

"...anywhere between the extrema of the virtuality continuum."

P. Milgram and F. Kishino, A Taxonomy of Mixed Reality Visual Displays, IEICE Transactions on Information and Systems, E77-D(12), pp. 1321-1329, 1994.

Page 13: CHI 2013 DARE Course

Billinghurst and Duh 13

AR History

 1960’s – 80’s: Early Experimentation •  Military, Academic labs

 1980’s – 90’s: Basic Research •  Tracking, Displays

 1995 – 2005: Tools/Applications •  Interaction, Usability, Theory

 2005 - : Commercial Applications •  Games, Medical, Industry, Mobile

Introduction

Page 14: CHI 2013 DARE Course

Billinghurst and Duh 14

Core Technologies

 Combining Real and Virtual Images •  Display technologies

 Interactive in Real-Time •  Input and interactive technologies

 Registered in 3D •  Viewpoint tracking technologies

Introduction

Display

Processing

Input Tracking

Page 15: CHI 2013 DARE Course

Billinghurst and Duh 15

Display Technologies

 Types (Bimber/Raskar 2003)
 Head attached
•  Head mounted display/projector
 Body attached
•  Handheld display/projector
 Spatial
•  Spatially aligned projector/monitor

 HMD Optical vs. Video see-through
 Optical: Direct view of real world -> safer, simpler
 Video: Video overlay -> more image registration options

Introduction

Page 16: CHI 2013 DARE Course

Billinghurst and Duh 16

Display Taxonomy

Page 17: CHI 2013 DARE Course

Billinghurst and Duh 17

Input Technologies

 Tangible objects •  Tracked items

 Touch (HHD) •  Glove, touch

 Gesture •  Glove, free-hand

 Speech/Multimodal  Device motion

•  HHD + sensors

Introduction

Page 18: CHI 2013 DARE Course

Billinghurst and Duh 18

Tracking Technologies

 Active
•  Mechanical, Magnetic, Ultrasonic
•  GPS, Wifi, cell location

 Passive
•  Inertial sensors (compass, accelerometer, gyro)
•  Computer Vision
•  Marker based
•  Natural feature tracking

 Hybrid Tracking
•  Combined sensors (eg Vision + Inertial) (see the fusion sketch below)

Introduction
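The hybrid idea above can be illustrated with a minimal, hypothetical sketch (not from the course material): a complementary filter that blends a fast but drifting rate sensor (gyroscope) with a noisy but drift-free absolute reference (accelerometer tilt). The same fast-plus-absolute pattern underlies vision + inertial hybrid tracking. All sensor values and the alpha weight below are made up for illustration.

import math

def complementary_filter(pitch_prev, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    # Gyro path: integrate angular rate (fast, but drifts over time)
    pitch_gyro = pitch_prev + gyro_rate * dt
    # Accelerometer path: absolute tilt from the gravity vector (noisy, no drift)
    pitch_accel = math.atan2(accel_y, accel_z)
    # Blend: trust the gyro short-term, the absolute reference long-term
    return alpha * pitch_gyro + (1.0 - alpha) * pitch_accel

# Toy usage with made-up samples: (gyro rate in rad/s, accel y and z in m/s^2)
pitch = 0.0
for gyro_rate, ay, az in [(0.10, 0.5, 9.79), (0.12, 0.6, 9.78), (0.08, 0.7, 9.77)]:
    pitch = complementary_filter(pitch, gyro_rate, ay, az, dt=0.02)
print("estimated pitch (rad):", round(pitch, 4))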

Page 19: CHI 2013 DARE Course

Billinghurst and Duh 19

 Web Based AR •  Flash, HTML 5 based AR •  Marketing, education

 Outdoor Mobile AR •  GPS, compass tracking •  Viewing Points of Interest in real world •  Eg: Junaio, Layar, Wikitude

 Handheld AR •  Vision based tracking •  Marketing, gaming

 Location Based Experiences •  HMD, fixed screens •  Museums, point of sale, advertising

Typical AR Experiences

Introduction

Page 20: CHI 2013 DARE Course

Billinghurst and Duh 20

AR Becoming Big Business

 Marketing •  Web-based, mobile

 Mobile AR •  Geo-located information and service •  Driving demand for high end phones

 Gaming •  Mobile, Physical input (Kinect)

 Upcoming areas •  Manufacturing, Medical, Military

 Rapid Growth •  Market projected to grow 53% 2012 – 2016 •  Over $5 Billion USD in Mobile AR alone by 2017

Page 21: CHI 2013 DARE Course

Billinghurst and Duh 21

Mobile AR Market Size

Page 22: CHI 2013 DARE Course

Billinghurst and Duh 22

Commercial AR Companies

 ARToolworks (http://www.artoolworks.com/) •  ARToolKit, FLARToolKit, SDKs

 Metaio (http://www.metaio.com/) •  Marketing, Industry, SDKs

 Total Immersion (http://www.t-immersion.com/) •  Marketing, Theme Parks, AR Experiences

 Qualcomm (http://developer.qualcomm.com/dev/augmented-reality) •  Mobile AR, Vuforia SDK

 Many small start-ups (String, Ogmento, etc)

Page 23: CHI 2013 DARE Course

Billinghurst and Duh 23

The Interaction Design Process

Page 24: CHI 2013 DARE Course

Billinghurst and Duh 24

"The product is no longer the basis of value. The experience is."

Venkat Ramaswamy, The Future of Competition

Interaction Design

Page 25: CHI 2013 DARE Course

Billinghurst and Duh 25

Gilmore + Pine: Experience Economy

(Diagram: value progression, from function to emotion: components → products → services → experiences)

Interaction Design

Page 26: CHI 2013 DARE Course

Billinghurst and Duh 26

Designing AR Experiences

(Diagram: layered stack components → tools → applications → experiences, with the corresponding concerns Tracking/Display, Authoring, Interaction, Usability)

Interaction Design

Page 27: CHI 2013 DARE Course

Billinghurst and Duh 27

The Value of Good User Experience

20c → 50c → $3.50

Interaction Design

Page 28: CHI 2013 DARE Course

Billinghurst and Duh 28

Good Experience Design

 Reactrix •  Top down projection •  Camera based input •  Reactive Graphics •  No instructions •  No training

Interaction Design

Page 29: CHI 2013 DARE Course

Billinghurst and Duh 29

Apple: The Value of Good Design

 Good Experience Design Dominates Markets

iPod Sales 2002-2007

Page 30: CHI 2013 DARE Course

Billinghurst and Duh 30

Nokia N-Gage

 Great idea – bad experience design  See - http://www.sidetalkin.com

Good: Handheld Gaming + Phone Bad: Look like a dork using it

Page 31: CHI 2013 DARE Course

Billinghurst and Duh 31

Interaction Design

 Answering three questions: •  What do you do? - How do you affect the world? •  What do you feel? – What do you sense of the world? •  What do you know? – What do you learn?

 The Design of User Experience with Technology

“Designing interactive products to support people in their everyday and working lives”

Preece, J., (2002). Interaction Design

Interaction Design

Page 32: CHI 2013 DARE Course

Billinghurst and Duh 32

Interaction Design is All About You

 Users should be involved throughout the Design Process

 Consider all the needs of the user •  Especially context of use

Interaction Design

Page 33: CHI 2013 DARE Course

Billinghurst and Duh 33

Interaction Design Process

Interaction Design

Page 34: CHI 2013 DARE Course

Billinghurst and Duh 34

Gabbard Model for AR Design

1. User task analysis
2. Expert guidelines-based evaluation
3. Formative user-centered evaluation
4. Summative comparative evaluations

Gabbard, J.L.; Swan, J.E.; , "Usability Engineering for Augmented Reality: Employing User-Based Studies to Inform Design,” Visualization and Computer Graphics, IEEE Transactions on, vol.14, no.3, pp.513-525, May-June 2008

Page 35: CHI 2013 DARE Course

Billinghurst and Duh 35

Gabbard Model in Context

Page 36: CHI 2013 DARE Course

Billinghurst and Duh 36

Design Guidelines for AR

Design Guidelines

Page 37: CHI 2013 DARE Course

Billinghurst and Duh 37

The Interaction Design Process

Page 38: CHI 2013 DARE Course

Billinghurst and Duh 38

AR Interaction Design

 Designing AR System = Interface Design •  Using different input and output technologies

 Objective is a high quality of user experience •  Ease of use and learning •  Performance and satisfaction

Page 39: CHI 2013 DARE Course

Billinghurst and Duh 39

Design Considerations

 Combining Real and Virtual Images •  Perceptual issues

 Interactive in Real-Time •  Interaction issues

 Registered in 3D •  Technology issues

Introduction

Page 40: CHI 2013 DARE Course

Billinghurst and Duh 40

AR Design Elements

 Interface Components
•  Physical components
•  Display elements – Visual/audio
•  Interaction metaphors

(Diagram: Physical Elements, Display Elements and Interaction Metaphor, linking Input to Output)

Page 41: CHI 2013 DARE Course

Billinghurst and Duh 41

AR UI Design

 Consider your user
 Follow good HCI principles
 Adapt HCI guidelines for AR
 Design to device constraints
 Use design patterns to inform design
 Design for your interface metaphor
 Design for evaluation

Page 42: CHI 2013 DARE Course

Billinghurst and Duh 42

Consider Your User

 Consider context of user •  Physical, social, emotional, cognitive, etc

 Mobile Phone AR User •  Probably Mobile •  One hand interaction •  Short application use •  Need to be able to multitask •  Use in outdoor or indoor environment •  Want to enhance interaction with real world

Page 43: CHI 2013 DARE Course

Billinghurst and Duh 43

Good HCI Principles

 Affordance  Reducing cognitive overload  Low physical effort  Learnability  User satisfaction  Flexibility in use  Responsiveness and feedback  Error tolerance

Page 44: CHI 2013 DARE Course

Billinghurst and Duh 44

Norman's Principles of Good Practice
•  Ensure a high degree of visibility – allow the user to work out the current state of the system and the range of actions possible.
•  Provide feedback – continuous, clear information about the results of actions.
•  Present a good conceptual model – allow the user to build up a picture of the way the system holds together, the relationships between its different parts and how to move from one state to the next.
•  Offer good mappings – aim for clear, natural relationships between actions the user performs and the results they achieve.

Page 45: CHI 2013 DARE Course

Billinghurst and Duh 45

Adapting Existing Guidelines

 Mobile Phone AR •  Phone HCI Guidelines •  Mobile HCI Guidelines

 HMD Based AR •  3D User Interface Guidelines •  VR Interface Guidelines

 Desktop AR •  Desktop UI Guidelines

Page 46: CHI 2013 DARE Course

Billinghurst and Duh 46

iPhone Guidelines

 Make it obvious how to use your content.
 Avoid clutter, unused blank space, and busy backgrounds.
 Minimize required user input.
 Express essential information succinctly.
 Provide a fingertip-sized target area for all links and controls.
 Avoid unnecessary interactivity.
 Provide feedback when necessary.

Page 47: CHI 2013 DARE Course

Billinghurst and Duh 47

Applying Principles to Mobile AR

 Clean  Large Video View  Large Icons  Text Overlay  Feedback

Page 48: CHI 2013 DARE Course

Billinghurst and Duh 48

AR vs. Non AR Design

 Design Guidelines
•  Design for 3D graphics + Interaction
•  Consider elements of physical world
•  Support implicit interaction

Characteristics | Non-AR Interfaces | AR Interfaces
Object graphics | Mainly 2D | Mainly 3D
Object types | Mainly virtual objects | Both virtual and physical objects
Object behaviors | Mainly passive objects | Both passive and active objects
Communication | Mainly simple | Mainly complex
HCI methods | Mainly explicit | Both explicit and implicit

Page 49: CHI 2013 DARE Course

Billinghurst and Duh 49

Maps vs. Junaio

 Google Maps •  2D, mouse driven, text/image heavy, exocentric

 Junaio •  3D, location driven, simple graphics, egocentric

Page 50: CHI 2013 DARE Course

Billinghurst and Duh 50

Design to Device Constraints

 Understand the platforms used and design for limitations •  Hardware, software platforms

 Eg Handheld AR game with visual tracking •  Use large screen icons •  Consider screen reflectivity •  Support one-hand interaction •  Consider the natural viewing angle •  Do not tire users out physically •  Do not encourage fast actions •  Keep at least one tracking surface in view


Art of Defense Game

Page 51: CHI 2013 DARE Course

Billinghurst and Duh 51

Handheld AR Constraints/Affordances
  Camera and screen are linked
•  Fast motions a problem when looking at screen
•  Intuitive "navigation"
  Phone in hand
•  Two handed activities: awkward or intuitive
•  Extended periods of holding phone tiring
•  Awareness of surrounding environment
  Small screen
•  Extended periods of looking at screen tiring
•  In general, small awkward platform
  Vibration, sound
•  Can provide feedback when looking elsewhere
  Networking - Bluetooth, 802.11
•  Collaboration possible
  Guaranteed minimum collection of buttons
  Sensors often available
•  GPS, camera, accelerometer, compass, etc

Page 52: CHI 2013 DARE Course

Billinghurst and Duh 52

Design Patterns

“Each pattern describes a problem which occurs over and over again in our environment, and then describes the core of the solution to that problem in such a way that you can use this solution a million times over, without ever doing it the same way twice.”

– Christopher Alexander et al.

Use Design Patterns to Address Recurring Problems

C.A. Alexander, A Pattern Language, Oxford Univ. Press, New York, 1977.

Page 53: CHI 2013 DARE Course

Billinghurst and Duh 53

Handheld AR Design Patterns

Title | Meaning | Embodied Skills
Device Metaphors | Using metaphor to suggest available player actions | Body A&S, Naïve physics
Control Mapping | Intuitive mapping between physical and digital objects | Body A&S, Naïve physics
Seamful Design | Making sense of and integrating the technological seams through game design | Body A&S
World Consistency | Whether the laws and rules in physical world hold in digital world | Naïve physics, Environmental A&S
Landmarks | Reinforcing the connection between digital-physical space through landmarks | Environmental A&S
Personal Presence | The way that a player is represented in the game decides how much they feel like living in the digital game world | Environmental A&S, Naïve physics
Living Creatures | Game characters that are responsive to physical, social events that mimic behaviours of living beings | Social A&S, Body A&S
Body Constraints | Movement of one's body position constrains another player's action | Body A&S, Social A&S
Hidden Information | The information that can be hidden and revealed can foster emergent social play | Social A&S, Body A&S

Page 54: CHI 2013 DARE Course

Billinghurst and Duh 54

Example: Seamless Design

 Design to reduce seams in the user experience •  Eg: AR tracking failure, change in interaction mode

 Paparazzi Game •  Change between AR tracking to accelerometer input

Xu, Y., et al. Pre-patterns for designing embodied interactions in handheld augmented reality games. Proceedings of the 2011 IEEE International Symposium on Mixed and Augmented Reality - Arts, Media, and Humanities, pp. 19-28, October 26-29, 2011.

Page 55: CHI 2013 DARE Course

Billinghurst and Duh 55

 Virtual creatures should respond to real world events
•  e.g. player motion, wind, light, etc
•  Creates illusion creatures are alive in the real world

 Sony EyePet
•  Responds to player blowing on creature

Page 56: CHI 2013 DARE Course

Billinghurst and Duh 56

Physical Elements

Design Guidelines

Page 57: CHI 2013 DARE Course

Billinghurst and Duh 57

AR Design Space

(Diagram: Reality (Physical Design) ↔ Augmented Reality ↔ Virtual Reality (Virtual Design))

Page 58: CHI 2013 DARE Course

Billinghurst and Duh 58

Design of Objects

 Objects
•  Purposely built – affordances
•  "Found" – repurposed
•  Existing – already in use in the marketplace

 Affordance
•  The quality of an object allowing an action-relationship with an actor
•  An attribute of an object that allows people to know how to use it – e.g. a door handle affords pulling

Page 59: CHI 2013 DARE Course

Billinghurst and Duh 59

Norman on Affordances

"...the term affordance refers to the perceived and actual properties of the thing, primarily those fundamental properties that determine just how the thing could possibly be used. [...] Affordances provide strong clues to the operations of things. Plates are for pushing. Knobs are for turning. Slots are for inserting things into. Balls are for throwing .. " (Norman, The Psychology of Everyday Things 1988, p.9)

Page 60: CHI 2013 DARE Course

Billinghurst and Duh 60

Physical vs. Virtual Affordances

 Physical affordances
-  Physical and material aspects of real objects

 Virtual affordances
-  Visual and perceived aspects of digital objects

 AR is a mixture of physical and virtual affordances
•  Physical – Tangible controllers and objects
•  Virtual – Virtual graphics and audio

Page 61: CHI 2013 DARE Course

Billinghurst and Duh 61

Affordance Framework

William W. Gaver. 1991. Technology affordances. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '91), Scott P. Robertson, Gary M. Olson, and Judith S. Olson (Eds.). ACM, New York, NY, USA, 79-84.

Page 62: CHI 2013 DARE Course

Billinghurst and Duh 62

Affordance Led Design

 Make affordances perceivable •  Provide visual, haptic, tactile, auditory cues

 Affordance Led Usability •  Give feedback •  Provide constraints •  Use natural mapping •  Use good cognitive model

Page 63: CHI 2013 DARE Course

Billinghurst and Duh 63

Example: AR Chemistry

 Tangible AR chemistry education (Fjeld)

Fjeld, M., Juchli, P., and Voegtli, B. M. (2003). Chemistry education: A tangible interaction approach. Proceedings of INTERACT 2003, September 1-5, 2003, Zurich, Switzerland.

Page 64: CHI 2013 DARE Course

Billinghurst and Duh 64

Input Devices

 Form informs function and use

Page 65: CHI 2013 DARE Course

Billinghurst and Duh 65

Picking up an Atom

Page 66: CHI 2013 DARE Course

Billinghurst and Duh 66

AR Interaction Metaphors

Design Guidelines

Page 67: CHI 2013 DARE Course

Billinghurst and Duh 67

AR Design Principles

 Interface Components
•  Physical components
•  Display elements – Visual/audio
•  Interaction metaphors

(Diagram: Physical Elements, Display Elements and Interaction Metaphor, linking Input to Output)

Page 68: CHI 2013 DARE Course

Billinghurst and Duh 68

Interaction Tasks

 2D (from [Foley]): •  Selection, Text Entry, Quantify, Position

 3D (from [Bowman]): •  Navigation (Travel/Wayfinding) •  Selection •  Manipulation •  System Control/Data Input

 AR: 2D + 3D Tasks and.. more specific tasks?

[Foley] Foley, J. D., Wallace, V., & Chan, P. The Human Factors of Computer Graphics Interaction Techniques. IEEE Computer Graphics and Applications (Nov.): 13-48, 1984.
[Bowman] Bowman, D., Kruijff, E., LaViola, J., Poupyrev, I. 3D User Interfaces: Theory and Practice. Addison Wesley, 2005.

Page 69: CHI 2013 DARE Course

Billinghurst and Duh 69

AR Interaction Metaphors

 Viewpoint Control
 Information Browsing
•  establish shared meaning
 3D AR Interfaces
•  establish shared meaning
 Augmented Surfaces
•  serve as cognitive artifacts
 Tangible AR
•  serve as cognitive artifacts

Page 70: CHI 2013 DARE Course

Billinghurst and Duh 70

1. Viewpoint Control

 2D/3D virtual objects are registered in 3D •  “VR in Real World”

 Interaction •  2D/3D virtual viewpoint control

 Applications •  Visualization, training

Page 71: CHI 2013 DARE Course

Billinghurst and Duh 71

2. Information Browsing

 Information is registered to real-world context
•  Hand held AR displays

 Interaction
•  Manipulation of a window into information space

 Applications
•  Context-aware information displays

Rekimoto, et al. 1997

Page 72: CHI 2013 DARE Course

Billinghurst and Duh 72

3. 3D AR Interfaces

 Virtual objects displayed in 3D physical space and manipulated
•  HMDs and 6DOF head-tracking
•  6DOF hand trackers for input

 Interaction
•  Viewpoint control
•  Traditional 3D user interface interaction: manipulation, selection, etc.

Kiyokawa, et al. 2000

Page 73: CHI 2013 DARE Course

Billinghurst and Duh 73

4. Augmented Surfaces

 Basic principles
•  Virtual objects are projected on a surface
•  Physical objects are used as controls for virtual objects
•  Support for collaboration

 Rekimoto, et al. 1998
•  Front projection
•  Marker-based tracking
•  Multiple projection surfaces

Page 74: CHI 2013 DARE Course

Billinghurst and Duh 74

5. Tangible User Interfaces

 Create digital shadows for physical objects

 Foreground •  graspable UI

 Background •  ambient interfaces

Page 75: CHI 2013 DARE Course

Billinghurst and Duh 75

Lessons from Tangible Interfaces

 Physical objects make us smart •  Norman’s “Things that Make Us Smart” •  encode affordances, constraints

 Objects aid collaboration •  establish shared meaning

 Objects increase understanding •  serve as cognitive artifacts

Page 76: CHI 2013 DARE Course

Billinghurst and Duh 76

TUI Limitations

 Difficult to change object properties •  Can’t tell state of digital data

 Limited display capabilities •  projection screen = 2D •  dependent on physical display surface

 Separation between object and display •  Augmented Surfaces

Page 77: CHI 2013 DARE Course

Billinghurst and Duh 77

Tangible AR Metaphor

 AR overcomes limitation of TUIs •  enhance display possibilities •  merge task/display space •  provide public and private views

 TUI + AR = Tangible AR •  Apply TUI methods to AR interface design

Page 78: CHI 2013 DARE Course

Billinghurst and Duh 78

 Space-multiplexed
•  Many devices, each with one function
– Quicker to use, more intuitive, clutter
– Real Toolbox

 Time-multiplexed
•  One device with many functions
– Space efficient – mouse

Page 79: CHI 2013 DARE Course

Billinghurst and Duh 79

Tangible AR: Tiles (Space Multiplexed)

 Tiles semantics •  data tiles •  operation tiles

 Operation on tiles •  proximity •  spatial arrangements •  space-multiplexed

Page 80: CHI 2013 DARE Course

Billinghurst and Duh 80

Tangible AR: Time-multiplexed Interaction

 Use of natural physical object manipulations to control virtual objects

 VOMAR Demo
•  Catalog book: turn over the page
•  Paddle operation: push, shake, incline, hit, scoop

Page 81: CHI 2013 DARE Course

Billinghurst and Duh 81

Object Based Interaction: MagicCup

  Intuitive Virtual Object Manipulation on a Table-Top Workspace
•  Time multiplexed
•  Multiple Markers – Robust Tracking
•  Tangible User Interface – Intuitive Manipulation
•  Stereo Display – Good Presence

Page 82: CHI 2013 DARE Course

Billinghurst and Duh 82

Page 83: CHI 2013 DARE Course

Billinghurst and Duh 83

Tangible AR Design Principles

 Tangible AR Interfaces use TUI principles •  Physical controllers for moving virtual content •  Support for spatial 3D interaction techniques •  Time and space multiplexed interaction •  Support for multi-handed interaction •  Match object affordances to task requirements •  Support parallel activity with multiple objects •  Allow collaboration between multiple users

Page 84: CHI 2013 DARE Course

Billinghurst and Duh 84

Interaction with Handheld AR

 Embodied Interaction
•  Focuses on the device itself
•  Touch, gesture, orientation, etc

 Tangible Interaction
•  Direct manipulation of known objects
•  Tracking objects

 Egocentric vs. Exocentric Interaction
•  Egocentric – inside out (eg outdoor AR browsing)
•  Exocentric – outside in (eg marker based AR)

Page 85: CHI 2013 DARE Course

Billinghurst and Duh 85

Handheld AR Metaphors

(Diagram contrasting Handheld AR and Wearable AR in terms of Output: Display, Input, and combined Input & Output)

Page 86: CHI 2013 DARE Course

Billinghurst and Duh 86

Handheld Interface Metaphors

 Tangible AR Lens Viewing
•  Look through screen into AR scene
•  Interact with screen to interact with AR content – e.g. Invisible Train

 Tangible AR Lens Manipulation
•  Select AR object and attach to device
•  Use the motion of the device as input – e.g. AR Lego

Page 87: CHI 2013 DARE Course

Billinghurst and Duh 87

Case Study 1: 3D AR Lens

Goal: Develop a lens based AR interface

 MagicLenses •  Developed at Xerox PARC in 1993 •  View a region of the workspace differently to the rest •  Overlap MagicLenses to create composite effects

Page 88: CHI 2013 DARE Course

Billinghurst and Duh 88

3D MagicLenses

MagicLenses extended to 3D (Viega et al. 1996)   Volumetric and flat lenses

Page 89: CHI 2013 DARE Course

Billinghurst and Duh 89

AR Lens Design Principles

 Physical Components
•  Lens handle – virtual lens attached to real object

 Display Elements
•  Lens view – reveal layers in dataset

  Interaction Metaphor
•  Physically holding lens

Page 90: CHI 2013 DARE Course

Billinghurst and Duh 90

Case Study 2: LevelHead

 Physical Components •  Real blocks

 Display Elements •  Virtual person and rooms

  Interaction Metaphor •  Blocks are rooms

Page 91: CHI 2013 DARE Course

Billinghurst and Duh 91

AR Perceptual + Cognitive Issues

Design Guidelines

Page 92: CHI 2013 DARE Course

Billinghurst and Duh 92

AR and Perception

 Creating the illusion that virtual images are seamlessly part of the real world •  Must match real and virtual cues

•  Depth, occlusion, lighting, shadows..

Page 93: CHI 2013 DARE Course

Billinghurst and Duh 93

AR as Perception Problem

 Goal of AR to fool human senses – create illusion that real and virtual are merged

 Depth •  Size •  Occlusion •  Shadows •  Relative motion •  Etc..

Page 94: CHI 2013 DARE Course

Billinghurst and Duh 94

Central goal of AR systems is to fool the human perceptual system

 Display Modes •  Direct View •  Stereo Video •  Stereo graphics

 Multi-modal display •  Different objects with different display modes •  Potential for depth cue conflict

Perceptual Issues

D. Drascic and P. Milgram. Perceptual issues in augmented reality. In M. T. Bolas, S. S. Fisher, and J. O. Merritt, editors, SPIE Volume 2653: Stereoscopic Displays and Virtual Reality Systems III, pages 123-134, January/February 1996.

Page 95: CHI 2013 DARE Course

Billinghurst and Duh 95

Perceptual Issues

 Combining multiple display modes •  Direct View, Stereo Video View, Graphics View

 Conflict between display modes •  Mismatch between depth cues

Page 96: CHI 2013 DARE Course

Billinghurst and Duh 96

Perceptual Issues

 Static and Dynamic registration mismatch  Restricted Field of View  Mismatch of Resolution and Image clarity  Luminance mismatch  Contrast mismatch  Size and distance mismatch  Limited depth resolution  Vertical alignment mismatches  Viewpoint dependency mismatch

Page 97: CHI 2013 DARE Course

Billinghurst and Duh 97

Types of Perceptual Issues

 Environment: Issues related to the environment itself.
 Capturing: Issues related to digitizing the environment.
 Augmentation: Issues related to the design, layout, and registration of AR content.
 Display device: Technical issues associated with the display device.
 User: Issues associated with the user perceiving content.

E. Kruijff, J. E. Swan, and S. Feiner. Perceptual issues in augmented reality revisited. 9th IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2010, pp. 3--12.

Page 98: CHI 2013 DARE Course

Billinghurst and Duh 98

Depth Cues

 Pictorial: visual cues •  Occlusion, texture, relative brightness

 Kinetic: motion cues •  Relative motion parallax, motion perspective

 Physiological: oculomotor cues •  Convergence, accommodation

 Binocular disparity: two different eye images

Page 99: CHI 2013 DARE Course

Billinghurst and Duh 99

Page 100: CHI 2013 DARE Course

Billinghurst and Duh 100

Depth Perception

Page 101: CHI 2013 DARE Course

Billinghurst and Duh 101

Occlusion Handling

Page 102: CHI 2013 DARE Course

Billinghurst and Duh 102

Cognitive Issues in AR

 Three categories of issues
•  Information Presentation – displaying virtual information on the real world
•  Physical Interaction – content creation, manipulation and navigation in AR
•  Shared Experience – collaboration and supporting common experiences in AR

Li, Nai, and Henry Been-Lirn Duh. "Cognitive Issues in Mobile Augmented Reality: An Embodied Perspective." Human Factors in Augmented Reality Environments. Springer New York, 2013. 109-135.

Page 103: CHI 2013 DARE Course

Billinghurst and Duh 103

Information Presentation

 Information Presentation
•  Amount of information – clutter, complexity
•  Representation of information – navigation cues, POI representation
•  Placement of information – head, body, world stabilized
•  View combination – multiple views

Page 104: CHI 2013 DARE Course

Billinghurst and Duh 104

Twitter 360

 www.twitter-360.com   iPhone application  See geo-located tweets in real world  Twitter.com supports geo tagging

Page 105: CHI 2013 DARE Course

Billinghurst and Duh 105

Wikitude – www.mobilizy.com

(Screenshot: AR view cluttered with many overlapping placeholder POI labels)

Page 106: CHI 2013 DARE Course

Billinghurst and Duh 106

Information Filtering

Page 107: CHI 2013 DARE Course

Billinghurst and Duh 107

Information Filtering

Page 108: CHI 2013 DARE Course

Billinghurst and Duh 108

Physical Interaction

 Physical Interaction •  Navigation •  Direct Manipulation

•  Embodied vs. Tangible •  Multimodal interaction •  Content creation

Page 109: CHI 2013 DARE Course

Billinghurst and Duh 109

Outdoor AR: Limited FOV

Page 110: CHI 2013 DARE Course

Billinghurst and Duh 110

Possible solutions

 Overview + Detail •  spatial separation; two views

 Focus + Context •  merges both views into one view

 Zooming •  temporal separation

Page 111: CHI 2013 DARE Course

Billinghurst and Duh 111

 TU Graz – HIT Lab NZ - collaboration •  Zooming panorama •  Zooming Map

Zooming Views

Page 112: CHI 2013 DARE Course

Billinghurst and Duh 112

Gesture Based Interaction

 HMD-based AR frees the user's hands
•  Natural hand based interaction
•  Intuitive manipulation – low cognitive load

 Example
•  Tinmith-Hand: two-handed manipulation of 3D models

Page 113: CHI 2013 DARE Course

Billinghurst and Duh 113

Shared Experiences

 Shared Experience •  Social context •  Bodily configuration •  Artifact manipulation •  Display space

Page 114: CHI 2013 DARE Course

Billinghurst and Duh 114

TAT Augmented ID

Page 115: CHI 2013 DARE Course

Billinghurst and Duh 115

Page 116: CHI 2013 DARE Course

Billinghurst and Duh 116

Page 117: CHI 2013 DARE Course

Billinghurst and Duh 117

Designing for Children

 Development Psychology Factors •  Motor Abilities •  Spatial Abilities •  Logic Abilities •  Attention Abilities

Radu, Iulian, and Blair MacIntyre. "Using children's developmental psychology to guide augmented-reality design and usability." Mixed and Augmented Reality (ISMAR), 2012 IEEE International Symposium on. IEEE, 2012.

Page 118: CHI 2013 DARE Course

Billinghurst and Duh 118

Motor Abilities

Skill Type | Challenging AR Interaction
Multiple hand coordination | Holding phone in one hand and using another hand to move marker
Hand-eye coordination | Using a marker to intercept a moving object
Fine motor skills | Moving a marker on a specified path
Gross motor skills and endurance | Turning body around to look at a panorama

Page 119: CHI 2013 DARE Course

Billinghurst and Duh 119

Spatial Abilities

Skill Type | Challenging AR Interaction
Spatial memory | Remembering the configuration of a large virtual space while the handheld screen shows a limited view
Spatial perception | Understanding when a virtual item is on top of a physical item
Spatial visualization | Predicting when virtual objects are visible to other people or virtual characters

Page 120: CHI 2013 DARE Course

Billinghurst and Duh 120

Attention and Logic

Attention Abilities
Skill Type | Challenging AR Interaction
Divided attention | Playing an AR game and making sure to keep the marker in view so tracking is not lost
Selective and executive attention | Playing an AR game while moving outdoors

Logic and Memory
Skill Type | Challenging AR Interaction
Remembering and reversing | Remembering how to recover from tracking loss
Abstract over concrete thinking | Understanding that virtual objects are computer generated, and they do not need to obey physical laws

Page 121: CHI 2013 DARE Course

Billinghurst and Duh 121

AR Development Tools

Page 122: CHI 2013 DARE Course

Billinghurst and Duh 122

AR Authoring Tools

 Low Level Software Libraries •  osgART, Studierstube, MXRToolKit

 Plug-ins to existing software •  DART (Macromedia Director), mARx, Unity,

 Stand Alone •  AMIRE, BuildAR, Metaio Creator etc

 Rapid Prototyping Tools •  Flash, OpenFrameworks, Processing, Arduino, etc

 Next Generation •  iaTAR (Tangible AR)

Page 123: CHI 2013 DARE Course

Billinghurst and Duh 123

ARToolKit (Kato 1998)

 Open source – computer vision based AR tracking  http://artoolkit.sourceforge.net/

Page 124: CHI 2013 DARE Course

Billinghurst and Duh 124

ARToolKit Structure

 Three key libraries:
•  AR32.lib – ARToolKit image processing functions
•  ARgsub32.lib – ARToolKit graphics functions
•  ARvideo.lib – DirectShow video capture class

(Diagram: ARvideo.lib built on top of DirectShow)
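As a rough illustration of how these three roles (video capture, marker detection/pose, graphics overlay) fit together at run time, here is a hedged Python-style sketch; it is not ARToolKit's actual C API, and VideoSource, detect_markers and draw_virtual_content are hypothetical stand-ins.

class VideoSource:                      # plays the ARvideo.lib role: frame capture
    def get_frame(self):
        return "camera-frame"           # placeholder for real image data

def detect_markers(frame):              # plays the AR32.lib role: detection + pose
    # a real tracker would threshold the image, match marker patterns and
    # estimate a camera-to-marker transform for each detected marker
    return [{"id": 0, "z_offset": -0.5}]

def draw_virtual_content(frame, markers):   # plays the ARgsub32.lib role: overlay
    for m in markers:
        print("draw virtual object on marker", m["id"], "at z =", m["z_offset"])

def run_ar_loop(n_frames=3):
    video = VideoSource()
    for _ in range(n_frames):
        frame = video.get_frame()
        markers = detect_markers(frame)
        draw_virtual_content(frame, markers)

run_ar_loop()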

Page 125: CHI 2013 DARE Course

Billinghurst and Duh 125

Software

 Cross platform •  Windows, Mac, Linux, IRIX, Symbian, iPhone, etc

 Additional basic libraries •  Video capture library (Video4Linux, VisionSDK) •  OpenGL, GLUT

 Requires a rendering library •  Open VRML, Open Inventor, osgART, etc

Page 126: CHI 2013 DARE Course

Billinghurst and Duh 126

OSGART Programming Library

  Integration of ARToolKit with a High-Level Rendering Engine (OpenSceneGraph) OSGART= OpenSceneGraph + ARToolKit

 Supporting Geometric + Photometric Registration

Page 127: CHI 2013 DARE Course

Billinghurst and Duh 127

osgART Approach: AR Scene Graph

Scene graph nodes:
•  Root
•  Video Layer / Video Geode – full-screen quad with live texture updated from the video source, drawn with an orthographic projection
•  Virtual Camera – projection matrix from tracker calibration
•  Transform – transformation matrix updated from marker tracking in real time
•  3D Object – attached under the tracked Transform
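A minimal illustrative sketch of that structure in Python (hypothetical class names, not osgART's actual C++ API): the video background and the tracked transform are just nodes in the graph, and only the transform's matrix changes each frame as the tracker reports a new marker pose.

class Node:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

class Transform(Node):
    def __init__(self, name, children=None):
        super().__init__(name, children)
        # identity until the tracker reports a marker pose
        self.matrix = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]

# Root -> video layer (background quad) + virtual camera -> tracked transform -> 3D object
marker_transform = Transform("marker_transform", [Node("3d_object")])
virtual_camera = Node("virtual_camera", [marker_transform])   # projection from calibration
video_layer = Node("video_layer", [Node("video_geode")])      # full-screen textured quad
root = Node("root", [video_layer, virtual_camera])

def on_tracker_update(pose_4x4):
    # per frame, only the tracked transform changes; the rest of the graph is redrawn as is
    marker_transform.matrix = pose_4x4

on_tracker_update([[1, 0, 0, 0.1], [0, 1, 0, 0.0], [0, 0, 1, -0.4], [0, 0, 0, 1]])
print("marker translation:", [row[3] for row in marker_transform.matrix[:3]])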

Page 128: CHI 2013 DARE Course

Billinghurst and Duh 128

osgART:Features

 C++ (but also Python, Lua, etc.)
 Supports multiple video inputs:
•  Direct (FireWire/USB camera), files, network – via ARvideo, PtGrey, CVCam, VideoWrapper, etc.
 Benefits of OpenSceneGraph
•  Rendering engine, plug-ins, etc

Page 129: CHI 2013 DARE Course

Billinghurst and Duh 129

ARToolKit Family

•  ARToolKit
•  ARToolKit NFT
•  ARToolKit (Symbian)
•  NyARToolkit – Java, C#, Android, Windows Mobile
•  JARToolKit (Java)
•  FLARToolKit (Flash)
•  FLARManager (Flash)

Page 130: CHI 2013 DARE Course

Billinghurst and Duh 130

Why Browser Based AR?

 High impact •  High marketing value

 Large potential install base •  1.6 Billion web users

 Ease of development •  Lots of developers, mature tools

 Low cost of entry •  Browser, web camera

Page 131: CHI 2013 DARE Course

Billinghurst and Duh 131

AR Application Components: FLARToolkit (marker tracking) + Papervision 3D (3D rendering), built on Adobe Flash

Page 132: CHI 2013 DARE Course

Billinghurst and Duh 132

FLARToolKit Example

 Boffswana Living Sasquatch  In first month

•  100K unique visits •  500K page views •  6 minutes on page

Page 133: CHI 2013 DARE Course

Billinghurst and Duh 133

Low Level Mobile AR Tools

 Vuforia Tracking Library (Qualcomm) •  Vuforia.com •  iOS, Android •  Computer vision based tracking •  Marker tracking, 3D objects, frame markers

 Integration with Unity •  Interaction, model loading, game logic

Page 134: CHI 2013 DARE Course

Billinghurst and Duh 134

Junaio - www.junaio.com

Page 135: CHI 2013 DARE Course

Billinghurst and Duh 135

Junaio Key Features

 Content provided in information channels •  Over 2,000 channels available

 Two types of AR channels •  GLUE channels – visual tracking •  Location based channels – GPS, compass tracking

 Simple to use interface with multiple views •  List, map, AR (live) view

 Point of Interest (POI) based •  POIs are geo-located content

Page 136: CHI 2013 DARE Course

Billinghurst and Duh 136

Page 137: CHI 2013 DARE Course

Billinghurst and Duh 137

AREL

 Augmented Reality Environment Language
•  Overcomes limitations of XML by itself
•  Based on web technologies: XML, HTML5, JavaScript

 Core Components
1. AREL XML: static file, specifies scene content
2. AREL JavaScript: handles all interactions and animation; any user interaction sends an event to AREL JS
3. AREL HTML5: GUI elements (buttons, icons, etc)

 Advantages
•  Scripting on device, more functionality, GUI customization

Page 138: CHI 2013 DARE Course

Billinghurst and Duh 138

Page 139: CHI 2013 DARE Course

Billinghurst and Duh 139

Page 140: CHI 2013 DARE Course

Billinghurst and Duh 140

Page 141: CHI 2013 DARE Course

Billinghurst and Duh 141

Result

Page 142: CHI 2013 DARE Course

Billinghurst and Duh 142

BirdsView

 Location Based CMS •  Add content, publish to Layar or Junaio •  http://www.birdsview.de/

Page 143: CHI 2013 DARE Course

Billinghurst and Duh 143

BirdsView on Junaio

Page 144: CHI 2013 DARE Course

Billinghurst and Duh 144

BirdsView on Junaio

Page 145: CHI 2013 DARE Course

Billinghurst and Duh 145

BuildAR

 http://www.buildar.co.nz/  Stand alone application  Visual interface for AR model viewing application  Enables non-programmers to build AR scenes

Page 146: CHI 2013 DARE Course

Billinghurst and Duh 146

Metaio Creator

 Drag and drop Junaio authoring

Page 147: CHI 2013 DARE Course

Billinghurst and Duh 147

Total Immersion D’Fusion Studio

 Complete commercial authoring platform •  http://www.t-immersion.com/ •  Multi-platform •  Markerless tracking •  Scripting •  Face tracking •  Finger tracking •  Kinect support

Page 148: CHI 2013 DARE Course

Billinghurst and Duh 148

Others

 AR-Media •  http://www.inglobetechnologies.com/ •  Google sketch-up plug-in

 LinceoVR •  http://linceovr.seac02.it/ •  AR/VR authoring package

 Libraries •  JARToolKit, MXRToolKit, ARLib, Goblin XNA

Page 149: CHI 2013 DARE Course

Billinghurst and Duh 149

Research in AR Authoring

  iaTAR (Lee 2004) •  Immersive AR Authoring •  Using real objects to create AR applications

Page 150: CHI 2013 DARE Course

Billinghurst and Duh 150

Rapid Prototyping

 Speed development time by using quick hardware mockups
•  handheld device connected to PC
•  LCD screen
•  USB phone keypad
•  Camera

 Can use PC development tools for rapid application development

Page 151: CHI 2013 DARE Course

Billinghurst and Duh 151

Build Your Own Google Glass

 Rapid Prototype Glass-Like HMD  Myvu HMD + headphone + iOS Device + basic glue skills

•  $300 + less than 3 hours construction   http://www.instructables.com/id/DIY-Google-Glasses-AKA-the-Beady-i/

Page 152: CHI 2013 DARE Course

Billinghurst and Duh 152

Page 153: CHI 2013 DARE Course

Billinghurst and Duh 153

BUNRATTY FOLK PARK

 Irish visitor attraction run by Shannon Heritage

 19th century life is recreated

 Buildings from the mid-west have been relocated to the 26 acres of land surrounding Bunratty Castle

 30 buildings are set in a rural or village setting there.

Page 154: CHI 2013 DARE Course

Billinghurst and Duh 154

AUGMENTED REALITY

In Bunratty Folk Park:
 Allows the visitor to point a camera at an exhibit; the device recognises it by its location and layers digital information onto the display
 3-dimensional virtual objects can be positioned with real ones on display
 Leads to a dynamic combination of a live camera view and information

Page 155: CHI 2013 DARE Course

Billinghurst and Duh 155

ITERATIVE DESIGN PROCESS

Prototyping and User Testing  Low Fidelity Prototyping

• Sketches • Paper Prototyping • Post-It Prototyping • PowerPoint Prototyping

 High Fidelity Prototyping • Wikitude

Page 156: CHI 2013 DARE Course

Billinghurst and Duh 156

Storyboarding


Page 157: CHI 2013 DARE Course

Billinghurst and Duh 157

INITIAL SKETCHES

Pros:
•  Good for idea generation
•  Cheap
•  Concepts seem feasible

Cons:
•  Not great feedback gained
•  Photoshop not fast enough for making changes

Page 158: CHI 2013 DARE Course

Billinghurst and Duh 158

Post-it Note Prototyping: Camera View with 3D Annotation

•  Selection highlighted in blue
•  Home button added for easy navigation to main menu

Page 159: CHI 2013 DARE Course

Billinghurst and Duh 159

POWERPOINT PROTOTYPING

Benefits
•  Used for user testing
•  Interactive
•  Functionalities work
•  Quick
•  Easy arrangement of slides

User Testing
•  Participants found
•  15 minute sessions screen captured
•  'Talk Aloud' technique used
•  Notes taken
•  Post-interview

Page 160: CHI 2013 DARE Course

Billinghurst and Duh 160

WIKITUDE PROTOTYPE

User Testing
 Application well received
 Understandable
 Participants playful with the technology

Page 161: CHI 2013 DARE Course

Billinghurst and Duh 161

FINAL VIDEO PROTOTYPE

 Flexible tool for capturing the use of an interface
 Elaborate simulation of how the navigational aid will work
 Does not need to be realistic in every detail
 Gives a good idea of how the finished system will work

Page 162: CHI 2013 DARE Course

Billinghurst and Duh 162

AR Evaluation Methods

Page 163: CHI 2013 DARE Course

Billinghurst and Duh 163

The Interaction Design Process

Page 164: CHI 2013 DARE Course

Billinghurst and Duh 164

Why Evaluate AR Applications?

 To test and compare interfaces, new technologies, interaction techniques

 To validate the efficiency and effectiveness of the AR interface and system

 Test usability (learnability, efficiency, satisfaction, ...)
 Get user feedback
 Refine interface design
 Better understand your end users
 ...

Page 165: CHI 2013 DARE Course

Billinghurst and Duh 165

Survey of AR Papers
 Edward Swan (2005)
 Surveyed major conferences/journals (1992-2004)
– Presence, ISMAR, ISWC, IEEE VR

 Summary
•  1104 total papers
•  266 AR papers
•  38 AR HCI papers (Interaction)
•  21 AR user studies

 Only 21 of the 266 AR papers had a formal user study
•  Less than 8% of all AR papers

Page 166: CHI 2013 DARE Course

Billinghurst and Duh 166

HIT Lab NZ Usability Survey

  A Survey of Evaluation Techniques Used in Augmented Reality Studies
•  Andreas Dünser, Raphaël Grasset, Mark Billinghurst

 Reviewed publications from 1993 to 2007
•  Extracted 6071 papers which mentioned "Augmented Reality"
•  Searched to find 165 AR papers with user studies

Page 167: CHI 2013 DARE Course

Billinghurst and Duh 167

Page 168: CHI 2013 DARE Course

Billinghurst and Duh 168

Page 169: CHI 2013 DARE Course

Billinghurst and Duh 169

Types of Experimental Measures Used

 Types of Experimental Measures •  Objective measures •  Subjective measures •  Qualitative analysis •  Usability evaluation techniques •  Informal evaluations

Page 170: CHI 2013 DARE Course

Billinghurst and Duh 170

Types of Experimental Measures Used

Page 171: CHI 2013 DARE Course

Billinghurst and Duh 171

Types of Experiments and topics

  Sensation, Perception & Cognition •  How is virtual content perceived ? •  What perceptual cues are most important ? •  How to visualize augmented/overlay information on real environment? •  Visual search/attention/salience issues of human performance

  Interaction •  How can users interact with virtual content ? •  Which interaction techniques are most efficient in certain context ?

  Collaboration & Social issues •  How is collaboration in AR interface different ? •  Which collaborative cues can be conveyed best ? •  Privacy and security issues of AR interface

Page 172: CHI 2013 DARE Course

Billinghurst and Duh 172

Types of AR User Studies

Page 173: CHI 2013 DARE Course

Billinghurst and Duh 173

Summary

 Over the last 10 years
•  Most user studies focused on user performance
•  Fewest user studies on collaboration
– Mobile AR was not popular before 2009
•  Objective performance measures most used
•  Qualitative and usability measures least used

Page 174: CHI 2013 DARE Course

Billinghurst and Duh 174

Sample Size

 ... the more the better
•  for quantitative analysis:
•  rule of thumb: approx. 15-20 or more (for cognitive and lab types of experiment)
•  absolute minimum of 8-10 per cell

 Ideal sample size can be calculated - power analysis
•  Power (1 - beta) => the chance to reject the null hypothesis when the null hypothesis is false
•  Power is the probability of observing a difference when it really exists
•  Power increases with sample size
•  Power decreases with variance

 Large effects can be detected with smaller samples
•  e.g. to discriminate mean speed between turtles and rabbits
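The power analysis mentioned above can be done in a few lines; a hedged example using the statsmodels package (assumed to be available), with a made-up expected effect size of Cohen's d = 0.8:

import math
from statsmodels.stats.power import TTestIndPower

# Assumptions: large expected effect (d = 0.8), alpha = 0.05, desired power = 0.8,
# two-sided independent-samples t-test (between-subjects design).
n_per_group = TTestIndPower().solve_power(effect_size=0.8, alpha=0.05,
                                          power=0.8, alternative='two-sided')
print("participants needed per condition:", math.ceil(n_per_group))   # roughly 26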

Page 175: CHI 2013 DARE Course

Billinghurst and Duh 175

Data Collection and Analysis

 The choice of a method is dependent on the type of data that needs to be collected

  In order to test a hypothesis the data has to be analysed using a statistical method

 The choice of a statistical method depends on the type of collected data

 All the decisions about an experiment should be made before it is carried out

Page 176: CHI 2013 DARE Course

Billinghurst and Duh 176

Observe and Measure

 Observations are gathered… •  manually (human observers) •  automatically (computers, software, cameras, sensors, etc.)

 A measurement is a recorded observation  Objective metrics  Subjective metrics

Page 177: CHI 2013 DARE Course

Billinghurst and Duh 177

Typical objective metrics

 task completion time  errors (number, percent,…)  percent of task completed  ratio of successes to failures  number of repetitions  number of commands used  number of failed commands  physiological data (heart rate,…)  …

Page 178: CHI 2013 DARE Course

Billinghurst and Duh 178

Typical subjective metrics

 user satisfaction  subjective performance  ratings  ease of use   intuitiveness   judgments  …

Page 179: CHI 2013 DARE Course

Billinghurst and Duh 179

Data Types

 Subjective
•  Subjective survey – Likert scale, condition rankings
•  Observations – think aloud
•  Interview responses

 Objective
•  Performance measures – time, accuracy, errors
•  Process measures – video/audio analysis

Example Likert item: "How easy was the task?"  1 (Not very easy)  2  3  4  5 (Very easy)

Page 180: CHI 2013 DARE Course

Billinghurst and Duh 180

Experimental Measures

Measure | What does it tell us? | How is it measured?
Timings | Performance | Via a stopwatch, or automatically by the device
Errors | Performance, particular sticking points in a task | By success in completing the task correctly; through experimenter observation, examining the route walked
Perceived workload | Effort invested, user satisfaction | Through NASA TLX scales and other questionnaires
Distance traveled and route taken | Depending on the application, these can be used to pinpoint errors and to indicate performance | Using a pedometer, GPS or other location-sensing system; by experimenter observation
Percentage preferred walking speed | Performance | By finding average walking speed, which is compared with normal walking speed
Comfort | User satisfaction, device acceptability | Comfort Rating Scale and other questionnaires
User comments and preferences | User satisfaction and preferences; particular sticking points in a task | Through questionnaires, interviews and think-alouds
Experimenter observations | Different aspects, depending on the experimenter and on the observations | Through observation and note-taking

Page 181: CHI 2013 DARE Course

Billinghurst and Duh 181

Statistical Analysis

 Once data is collected, statistics can be used for analysis
 Typical Statistical Techniques
•  Comparing two results
–  Unpaired t-test (for between subjects – assumes normal distribution)
–  Paired t-test (for within subjects – assumes normal distribution)
–  Mann-Whitney U (independent samples)
•  Comparing more than two results
–  Analysis of Variance – ANOVA
–  Kruskal-Wallis (does not assume normal distribution)
–  Followed by post-hoc analysis – e.g. Bonferroni test
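A hedged sketch of how these tests are typically run in practice, using Python with scipy (assumed to be available); the task-completion times below are made up:

from scipy import stats

# Hypothetical task-completion times (seconds) for three interface conditions
cond_a = [34.1, 29.8, 31.5, 36.0, 30.2, 33.3]   # e.g. head-stabilized
cond_b = [27.9, 26.4, 30.1, 25.8, 28.7, 27.2]   # e.g. body-stabilized
cond_c = [29.5, 31.0, 28.8, 30.4, 27.9, 29.9]

# Two conditions, between subjects, assuming normality: unpaired t-test
t, p = stats.ttest_ind(cond_a, cond_b)
print("t-test A vs B: t=%.2f p=%.3f" % (t, p))

# Two conditions, no normality assumption: Mann-Whitney U
u, p = stats.mannwhitneyu(cond_a, cond_b)
print("Mann-Whitney A vs B: U=%.1f p=%.3f" % (u, p))

# More than two conditions: one-way ANOVA, or Kruskal-Wallis if non-normal,
# followed by post-hoc tests (e.g. Bonferroni-corrected pairwise comparisons)
f, p = stats.f_oneway(cond_a, cond_b, cond_c)
h, p_kw = stats.kruskal(cond_a, cond_b, cond_c)
print("ANOVA: F=%.2f p=%.3f  Kruskal-Wallis: H=%.2f p=%.3f" % (f, p, h, p_kw))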

Page 182: CHI 2013 DARE Course

Billinghurst and Duh 182

Case Study: A Wearable Information Space

Head Stabilized Body Stabilized

An AR interface provides spatial audio and visual cues. Does a spatial interface aid performance?

– Task time / accuracy

M. Billinghurst, J. Bowskill, Nick Dyer, Jason Morphett (1998). An Evaluation of Wearable Information Spaces. Proc. Virtual Reality Annual International Symposium.

Page 183: CHI 2013 DARE Course

Billinghurst and Duh 183

Task Performance

 Task •  find target icons on 8 pages •  remember information space

 Conditions A - head-stabilized pages B - cylindrical display with trackball C - cylindrical display with head tracking

 Subjects •  Within subjects (need fewer subjects) •  12 subjects used

Page 184: CHI 2013 DARE Course

Billinghurst and Duh 184

Experimental Measures

 Objective •  spatial ability (pre-test) •  time to perform task •  information recall •  workload (NASA TLX)

 Subjective •  Post Experiment Survey

–  rank conditions (forced choice) –  Likert Scale Questions

-  “How intuitive was the interface to use?”

Many Different Measures

Page 185: CHI 2013 DARE Course

Billinghurst and Duh 185

Post Experiment Survey

For each of these conditions please answer: 1) How easy was it to find the target? 1 2 3 4 5 6 7 1=not very easy 7=very easy

For the head stabilised condition (A): For the cylindrical condition with mouse input (B): For the head tracked condition (C):

Rank all the conditions in order on a scale of one to three 1) Which condition was easiest to find target (1 = easiest, 3 = hardest)

A: B: C:

Page 186: CHI 2013 DARE Course

Billinghurst and Duh 186

Results

 Body Stabilization Improved Performance •  search times significantly faster (One factor ANOVA)

 Head Tracking Improved Information recall •  no difference between trackball and stack case

 Head tracking involved more physical work

Page 187: CHI 2013 DARE Course

Billinghurst and Duh 187

Subjective Impressions

  Subjects Felt Spatialized Conditions (ANOVA): •  More enjoyable •  Easier to find target

Page 188: CHI 2013 DARE Course

Billinghurst and Duh 188

Subjective Impressions

  Subject Rankings (Kruskal-Wallis) •  Spatialized easier to use than head stabilized •  Body stabilized gave better understanding •  Head tracking most intuitive

Page 189: CHI 2013 DARE Course

Billinghurst and Duh 189

AR Evaluation  Field, Field, Field –

•  Field studies vs. Lab studies •  Contextual design and evaluation

 Combined methods (qualitative and quantitative studies) •  Weakness of each method should be considered

 New/modified evaluation methods may need to be developed

 Seek for more new evaluation case studies in AR

Page 190: CHI 2013 DARE Course

Billinghurst and Duh 190

AR Design Case Study

Page 191: CHI 2013 DARE Course

Billinghurst and Duh 191

“The Jackson Plan” An Educational Location-based Handheld AR Game

Learning while travelling

Mobile AR Entertainment for Children

Page 192: CHI 2013 DARE Course

Billinghurst and Duh 192

The Jackson Plan

 Overview

‘The Jackson Plan’ is an educational discovery Mobile Augmented Reality game that is set on the historical urban plan of the same name (also known as the “Plan of the Town of Singapore”)

Using multi-modality features on an Apple iPad 2, players collaboratively experience this location-based Mobile Augmented Reality game around several important historical sites and events that revolve around Sir Thomas Stamford Raffles and his founding of the island of Singapore in 1819.

The Jackson Plan 1822, is on display at the Singapore History Gallery, National Museum of Singapore

Page 193: CHI 2013 DARE Course

Billinghurst and Duh 193

 Learning Goals/Objectives

Unit: Jackson Plan

Learning objectives:
【Knowledge】 1. To acquire a better understanding of the key developments of Raffles's arrival, the early settlers and Raffles's town plan.
【Skills】 1. To explain the reasons for the founding of Singapore (1819). 2. To explain the importance of trade to Singapore. 3. To describe the contributions of key personalities and immigrants to the growth and development of Singapore.
【Values & Attitudes】 1. To develop an interest in the past. 2. To appreciate cultural heritage, as well as to instill a sense of courage, diligence and perseverance towards Singapore.

History Syllabus for Lower Secondary, Year of Implementation: 2006. ISBN 981-05-1669-X. Source: Curriculum Planning and Development Division, Ministry of Education, Singapore

Learning Content

Page 194: CHI 2013 DARE Course

Billinghurst and Duh 194

Consideration

 How can a new technology help create new learning experiences in cultural heritage?

 Interdisciplinary research (Design, Technology, Education and Learning)

 System building, a single application or  Recognition in each field  Real deployment in schools


Page 195: CHI 2013 DARE Course

Billinghurst and Duh 195

Theoretical Framework

“Situated cognition via scaffolding mechanisms

([Vygotsky, 1978])”

Distinct HAR technology pairings available in a game, (0=No, 1=Yes), resulting in four possible eHAR game types and play styles, each with an implementation process.

Y.-N. Chang, R. K. C. Koh, and H. B.-L. Duh, "Handheld AR games - A triarchic conceptual design framework," in Mixed and Augmented Reality - Arts, Media, and Humanities (ISMAR-AMH), 2011 IEEE International Symposium On, Basel, Switzerland, 2011, pp. 29-36.

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.

Page 196: CHI 2013 DARE Course

Billinghurst and Duh 196

 Triarchic conceptual design framework
•  GPS navigation: location-based implementation for Cultural & Historical (contextual) explorations
•  Overlaying options: 'Binoculars' metaphor (i.e., Panoramic Map)
•  Virtual properties (game inventory)
•  Geo-tagging / (diary)
•  Blended mini games (i.e. puzzles)
•  Tasks may exploit the platform's hardware features (GPS, Accelerometer)
•  Visual identification of past and present imagery
•  History comes to life by exploiting location-dependent contexts
•  Backend confirmation with server connectivity ('Wizard of Oz' possibility) for dynamic situational exchanges, i.e. messages, images, induced player-behaviors, etc
•  Promote Contextual Inquiry & Collaboration (Learning Strategies)

Theoretical Framework

Page 197: CHI 2013 DARE Course

Billinghurst and Duh 197

The Jackson Plan

Textbook: SINGAPORE: FROM SETTLEMENT TO NATION - PRE-1819 TO 1971 (Marshall Cavendish Education) Theme: Chapter 3 - What Part Did the Different Immigrant Communities Play in Singapore’s Development?

The Jackson Plan: prior knowledge map

Prior knowledge: the settlement of Singapore

•  Why Raffles chose Singapore: central location, excellent port, good supply of drinking water, and the Dutch had not occupied the island

•  Immigrants: why immigrants came; population groups by trading goods and countries of origin (Chinese: coolies, Samsui women; Indians: labourers, coolies; Malays: shipbuilders; Europeans: merchants; Arabs: traders)

•  Singapore’s town plan: drawn by Lieutenant Philip Jackson in 1822 to improve the haphazard building plan; it segregated the population groups

Page 198: CHI 2013 DARE Course

Billinghurst and Duh 198

•  Ideation: Use of Historical Illustrations / Images in Situated Augmented Views

•  Panoramic / Still Visual Imagery of the Past + GPS

Page 199: CHI 2013 DARE Course

Billinghurst and Duh 199

Game Features, Mechanisms & Platform - Ideations

•  ‘Civic District Trail’ - A tourist’s DIY exploration experience promoted by the Singapore Tourism Board

Page 200: CHI 2013 DARE Course

Billinghurst and Duh 200

Virtual & physical interaction

•  Manipulate knowledge: collect trading materials (i.e. spices)

•  Geo-tag the “right” locations

•  Take pictures (Wizard of Oz)

•  Blended casual mini-games with physical interaction and collaboration

Game Features, Mechanisms & Platform

Page 201: CHI 2013 DARE Course

Billinghurst and Duh 201

Game Features, Mechanisms & Platform

Page 202: CHI 2013 DARE Course

Billinghurst and Duh 202

Districts in the “Plan of the Town of Singapore” by Lieutenant Philip Jackson, 1822: Chinese (Chinatown), Indians, Europeans & rich Asians, Malays & Muslims, and Commercial Square

Game Features, Mechanisms & Platform

Page 203: CHI 2013 DARE Course

Billinghurst and Duh 203

The planned gaming activities follow four phases, each with learning objectives, learning task(s) and a time allocation:

Phase 1: Understanding of activities. Objective: to understand the gameplay and the manipulation of the iPad2 devices. Tasks: introduction to gameplay; game introduction / mission briefing. Time: 15 min

Phase 2: Constructing knowledge. Objective: to understand the background of the Singapore settlement. Tasks: information collection (know who these immigrants are). Time: 15 min

Phase 3: Mastering. Objective: to analyze how the immigrants contributed to Singapore as a trading centre. Tasks: experience the entrepot trade. Time: 20 min

Phase 4: Knowledge application. Objective: to make comparisons and organize information on the different contributions of the immigrants. Tasks: make an accusation supported by evidence (gain summative feedback). Time: 15 min

The Jackson Plan: Planned Gaming Activities

Page 204: CHI 2013 DARE Course

Billinghurst and Duh 204

The Trail

Page 205: CHI 2013 DARE Course

Billinghurst and Duh 205

Game Design

Page 206: CHI 2013 DARE Course

Billinghurst and Duh 206

Game Design

Page 207: CHI 2013 DARE Course

Billinghurst and Duh 207

The Jackson Plan - Features

Page 208: CHI 2013 DARE Course

Billinghurst and Duh 208

Evaluation

•  72 students (36 pairs) took part in the evaluation

•  Secondary One classes (~12-13 years old)

•  They were divided equally into two groups: a Digital Book version and a Location-based AR version

Digital Book version: Apple iPad2, collaborative, non-AR interaction, played indoors

Location-based AR version: Apple iPad2, collaborative, location-based AR interaction, played outdoors

Page 209: CHI 2013 DARE Course

Billinghurst and Duh 209

 The structure of knowledge

Evaluation

Page 210: CHI 2013 DARE Course

Billinghurst and Duh 210

The Jackson Plan

Page 211: CHI 2013 DARE Course

Billinghurst and Duh 211

The Jackson Plan

Page 212: CHI 2013 DARE Course

Billinghurst and Duh 212

The Jackson Plan

Page 213: CHI 2013 DARE Course

Billinghurst and Duh 213

The Jackson Plan

Page 214: CHI 2013 DARE Course

Billinghurst and Duh 214

The Jackson Plan

Page 215: CHI 2013 DARE Course

Billinghurst and Duh 215

The Jackson Plan

Page 216: CHI 2013 DARE Course

Theory into Practice: Domain-Centric Handheld Augmented Reality Game Design Study 3 - Co-creativity fusions in interdisciplinary AR game developments

Page 217: CHI 2013 DARE Course

Billinghurst and Duh 217

AR Research Directions

Page 218: CHI 2013 DARE Course

Billinghurst and Duh 218

The Vision of AR

Page 219: CHI 2013 DARE Course

Billinghurst and Duh 219

To Make the Vision Real..

 Hardware/software requirements • Contact lens displays •  Free space hand/body tracking •  Speech/gesture recognition •  Etc..

 Most importantly • Usability

Page 220: CHI 2013 DARE Course

Billinghurst and Duh 220

Natural Interaction

 Automatically detecting real environment •  Environmental awareness •  Physically based interaction

 Gesture Input •  Free-hand interaction

 Multimodal Input •  Speech and gesture interaction •  Implicit rather than Explicit interaction

Page 221: CHI 2013 DARE Course

Environmental Awareness

Page 222: CHI 2013 DARE Course

Billinghurst and Duh 222

AR MicroMachines

 AR experience with environment awareness and physically-based interaction •  Based on MS Kinect RGB-D sensor

 Augmented environment supports •  occlusion, shadows •  physically-based interaction between real and virtual objects

Page 223: CHI 2013 DARE Course

Billinghurst and Duh 223

Operating Environment

Page 224: CHI 2013 DARE Course

Billinghurst and Duh 224

Architecture

 Our framework uses five libraries:

•  OpenNI •  OpenCV •  OPIRA •  Bullet Physics •  OpenSceneGraph

Page 225: CHI 2013 DARE Course

Billinghurst and Duh 225

System Flow

 The system flow consists of three sections: •  Image Processing and Marker Tracking •  Physics Simulation •  Rendering
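As a rough illustration only, the three sections above can be organised as a simple per-frame loop; the names Frame, processFrame, stepPhysics and render below are hypothetical placeholders, not the actual AR MicroMachines code:

```cpp
// Minimal sketch of a capture -> physics -> render loop.
// Hypothetical structure only, not the AR MicroMachines implementation.
#include <chrono>

struct Frame { /* colour image, depth map and tracked camera pose would live here */ };

Frame processFrame() { return Frame{}; }          // 1. image processing and marker tracking (stub)
void  stepPhysics(const Frame&, double /*dt*/) {} // 2. physics simulation (stub)
void  render(const Frame&) {}                     // 3. rendering (stub)

int main() {
    using clock = std::chrono::steady_clock;
    auto last = clock::now();
    for (int i = 0; i < 1000; ++i) {              // bounded loop for this sketch
        auto now = clock::now();
        double dt = std::chrono::duration<double>(now - last).count();
        last = now;

        Frame f = processFrame();                 // grab sensor data, update tracked pose
        stepPhysics(f, dt);                       // advance virtual objects against the sensed world
        render(f);                                // draw video background plus virtual content
    }
    return 0;
}
```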

Page 226: CHI 2013 DARE Course

Billinghurst and Duh 226

Physics Simulation

 Create a virtual mesh over the real world

 Updated at 10 fps, so real objects can be moved

 Used by the physics engine for collision detection between virtual and real objects (see the sketch below)

 Used by OpenSceneGraph for occlusion and shadows
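A hedged sketch of how such a depth-derived mesh might be handed to Bullet as a static collision surface; btTriangleMesh, btBvhTriangleMeshShape and btRigidBody are standard Bullet classes, but the grid layout and the function itself are illustrative assumptions, not the course authors' code:

```cpp
#include <btBulletDynamicsCommon.h>
#include <vector>

// Build a static Bullet body from a W x H grid of depth-derived 3D points,
// so virtual objects can collide with the sensed real-world surface.
// Illustrative sketch only; ownership/cleanup is omitted for brevity.
btRigidBody* makeEnvironmentBody(const std::vector<btVector3>& pts, int W, int H) {
    btTriangleMesh* mesh = new btTriangleMesh();
    for (int y = 0; y + 1 < H; ++y) {
        for (int x = 0; x + 1 < W; ++x) {
            const btVector3& a = pts[y * W + x];
            const btVector3& b = pts[y * W + x + 1];
            const btVector3& c = pts[(y + 1) * W + x];
            const btVector3& d = pts[(y + 1) * W + x + 1];
            mesh->addTriangle(a, b, c);   // two triangles per grid cell
            mesh->addTriangle(b, d, c);
        }
    }
    btBvhTriangleMeshShape* shape = new btBvhTriangleMeshShape(mesh, true);
    // Mass 0 makes the body static; rebuilding it at ~10 fps would track moved real objects.
    btRigidBody::btRigidBodyConstructionInfo info(0.0f, new btDefaultMotionState(), shape);
    return new btRigidBody(info);
}
```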

Page 227: CHI 2013 DARE Course

Billinghurst and Duh 227

Rendering

Occlusion Shadows

Page 228: CHI 2013 DARE Course

Gesture Input

Page 229: CHI 2013 DARE Course

Billinghurst and Duh 229

Architecture

5. Gesture

• Static Gestures • Dynamic Gestures • Context based Gestures

4. Modeling

• Hand recognition/modeling • Rigid-body modeling

3. Classification/Tracking

2. Segmentation

1. Hardware Interface

HITLabNZ’s Gesture Library

Page 230: CHI 2013 DARE Course

Billinghurst and Duh 230

Architecture

5. Gesture

• Static Gestures • Dynamic Gestures • Context based Gestures

4. Modeling

• Hand recognition/modeling

• Rigid-body modeling

3. Classification/Tracking

2. Segmentation

1. Hardware Interface

HITLabNZ’s Gesture Library

o  Supports PCL, OpenNI, OpenCV, and Kinect SDK.

o  Provides access to depth, RGB, XYZRGB.

o  Usage: Capturing color images, depth images and concatenated point clouds from a single camera or multiple cameras (a minimal capture sketch follows the device examples below)

o  For example:

Kinect for Xbox 360

Kinect for Windows

Asus Xtion Pro Live
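For illustration, one way to pull colour and depth frames from such sensors is OpenCV's OpenNI-backed capture; this is a minimal, assumed sketch and not the Gesture Library's own interface:

```cpp
#include <opencv2/opencv.hpp>

int main() {
    // Open a Kinect / Xtion style sensor through OpenCV's OpenNI2 backend.
    cv::VideoCapture capture(cv::CAP_OPENNI2);
    if (!capture.isOpened()) return 1;

    cv::Mat depth, bgr;
    for (;;) {
        if (!capture.grab()) break;
        capture.retrieve(depth, cv::CAP_OPENNI_DEPTH_MAP);  // 16-bit depth (mm)
        capture.retrieve(bgr,   cv::CAP_OPENNI_BGR_IMAGE);  // colour image

        cv::imshow("depth", depth * 8);   // roughly scaled for display
        cv::imshow("colour", bgr);
        if (cv::waitKey(1) == 27) break;  // Esc quits
    }
    return 0;
}
```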

Page 231: CHI 2013 DARE Course

Billinghurst and Duh 231

Architecture 5. Gesture

• Static Gestures • Dynamic Gestures • Context based Gestures

4. Modeling

• Hand recognition/modeling

• Rigid-body modeling

3. Classification/Tracking

2. Segmentation

1. Hardware Interface

o  Segment images and point clouds based on color, depth and space.

o  Usage: Segmenting images or point clouds using color models, depth, or spatial properties such as location, shape and size (a combined color/depth sketch follows the examples below)

o  For example:

HITLabNZ’s Gesture Library

Skin color segmentation

Depth threshold
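A hedged OpenCV sketch of combining the two cues shown above; the YCrCb skin range and the 80 cm depth cut-off are illustrative assumptions, not the library's actual thresholds:

```cpp
#include <opencv2/opencv.hpp>

// Combine a simple skin-colour mask with a depth threshold to isolate a nearby hand.
// All threshold values here are illustrative only.
cv::Mat segmentHand(const cv::Mat& bgr, const cv::Mat& depthMM) {
    cv::Mat ycrcb, skinMask;
    cv::cvtColor(bgr, ycrcb, cv::COLOR_BGR2YCrCb);
    cv::inRange(ycrcb, cv::Scalar(0, 133, 77), cv::Scalar(255, 173, 127), skinMask);

    // Keep only valid depth pixels closer than ~80 cm to the sensor.
    cv::Mat nearMask = (depthMM > 0) & (depthMM < 800);

    cv::Mat hand;
    cv::bitwise_and(skinMask, nearMask, hand);
    cv::morphologyEx(hand, hand, cv::MORPH_OPEN,
                     cv::getStructuringElement(cv::MORPH_ELLIPSE, cv::Size(5, 5)));
    return hand;  // binary mask of the segmented hand
}
```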

Page 232: CHI 2013 DARE Course

Billinghurst and Duh 232

Architecture 5. Gesture

• Static Gestures • Dynamic Gestures • Context based Gestures

4. Modeling

• Hand recognition/modeling

• Rigid-body modeling

3. Classification/Tracking

2. Segmentation

1. Hardware Interface

o  Identify and track objects between frames based on XYZRGB.

o  Usage: Identifying current position/orientation of the tracked object in space.

o  For example:

HITLabNZ’s Gesture Library

Training set of hand poses, colors represent unique regions of the hand.

Raw output (without cleaning) classified on real hand input (depth image).

Page 233: CHI 2013 DARE Course

Billinghurst and Duh 233

Architecture

5. Gesture

• Static Gestures • Dynamic Gestures • Context based Gestures

4. Modeling

• Hand recognition/modeling

• Rigid-body modeling

3. Classification/Tracking

2. Segmentation

1. Hardware Interface

o  Hand Recognition/Modeling
   Skeleton based (for low-resolution approximation)
   Model based (for more accurate representation)

o  Object Modeling (identification and tracking of rigid-body objects)

o  Physical Modeling (physical interaction)
   Sphere Proxy
   Model based
   Mesh based

o  Usage: For general spatial interaction in AR/VR environments

HITLabNZ’s Gesture Library

Page 234: CHI 2013 DARE Course

Billinghurst and Duh 234

Method: Represent models as collections of spheres moving with the models in the Bullet physics engine
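A hedged sketch of such a sphere-proxy representation in Bullet: the tracked model is approximated by spheres gathered into a compound shape and driven kinematically from tracking. The Bullet classes are real; the function, sphere placement and radius are assumptions for illustration:

```cpp
#include <btBulletDynamicsCommon.h>
#include <vector>

// Approximate a tracked model (e.g. a hand) by spheres in a compound shape so it
// can push virtual objects. Sphere centres/radii would come from the tracked model;
// here they are placeholders.
btRigidBody* makeSphereProxyBody(const std::vector<btVector3>& centres, btScalar radius) {
    btCompoundShape* compound = new btCompoundShape();
    for (const btVector3& c : centres) {
        btTransform local;
        local.setIdentity();
        local.setOrigin(c);
        compound->addChildShape(local, new btSphereShape(radius));
    }
    // Kinematic body: its pose is set from tracking each frame, not simulated.
    btRigidBody::btRigidBodyConstructionInfo info(0.0f, new btDefaultMotionState(), compound);
    btRigidBody* body = new btRigidBody(info);
    body->setCollisionFlags(body->getCollisionFlags() | btCollisionObject::CF_KINEMATIC_OBJECT);
    body->setActivationState(DISABLE_DEACTIVATION);
    return body;
}
```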

Page 235: CHI 2013 DARE Course

Billinghurst and Duh 235

Method: Render the AR scene with OpenSceneGraph, using the depth map for occlusion

Shadows are yet to be implemented
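One common way to get this kind of depth-based occlusion in OpenSceneGraph is to draw the reconstructed real-world geometry into the depth buffer only, before the virtual content. The sketch below uses standard osg calls, but the scene organisation is an illustrative assumption rather than the system's actual renderer:

```cpp
#include <osg/ColorMask>
#include <osg/Group>
#include <osg/Node>

// Make the real-world mesh an invisible occluder: it writes depth but no colour
// and is drawn first, so nearer real geometry hides the virtual content behind it.
osg::ref_ptr<osg::Group> buildScene(osg::ref_ptr<osg::Node> realWorldMesh,
                                    osg::ref_ptr<osg::Node> virtualContent) {
    osg::ref_ptr<osg::Group> root = new osg::Group;

    osg::StateSet* occ = realWorldMesh->getOrCreateStateSet();
    occ->setAttribute(new osg::ColorMask(false, false, false, false)); // depth-only pass
    occ->setRenderBinDetails(1, "RenderBin");                          // draw first

    virtualContent->getOrCreateStateSet()->setRenderBinDetails(2, "RenderBin");

    root->addChild(realWorldMesh.get());
    root->addChild(virtualContent.get());
    return root;
}
```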

Page 236: CHI 2013 DARE Course

Billinghurst and Duh 236

Results

Page 237: CHI 2013 DARE Course

Billinghurst and Duh 237

Architecture

5. Gesture

• Static Gestures • Dynamic Gestures • Context based Gestures

4. Modeling

• Hand recognition/modeling

• Rigid-body modeling

3. Classification/Tracking

2. Segmentation

1. Hardware Interface

o  Static (hand pose recognition)
o  Dynamic (meaningful movement recognition)
o  Context-based gesture recognition (gestures with context, e.g. pointing)

o  Usage: Issuing commands, anticipating user intention, and high-level interaction.

HITLabNZ’s Gesture Library

Page 238: CHI 2013 DARE Course

Multimodal Interaction

Page 239: CHI 2013 DARE Course

Billinghurst and Duh 239

Multimodal Interaction

 Combined speech and gesture input  Gesture and speech are complementary

•  Speech – modal commands, quantities

•  Gesture –  selection, motion, qualities

 Previous work has found multimodal interfaces intuitive for 2D/3D graphics interaction (a simple fusion sketch follows)
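The split above (speech carrying the command, gesture carrying the selection) can be fused with a simple time window. This is a generic, hypothetical sketch with made-up event types, not the system's actual fusion module:

```cpp
#include <cmath>
#include <optional>
#include <string>

// Hypothetical events from separate speech and gesture recognisers.
struct SpeechEvent  { double t; std::string command; };   // e.g. "make it red"
struct GestureEvent { double t; int targetId; };          // e.g. a point at object 7

struct FusedAction { std::string command; int targetId; };

// Pair the spoken command with the gestured target if the two events are close in time.
std::optional<FusedAction> fuse(const SpeechEvent& s, const GestureEvent& g,
                                double windowSec = 1.5) {
    if (std::abs(s.t - g.t) > windowSec) return std::nullopt;  // too far apart to belong together
    return FusedAction{s.command, g.targetId};
}
```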

Page 240: CHI 2013 DARE Course

Billinghurst and Duh 240

Free Hand Multimodal Input

 Use the free hand to interact with AR content
 Recognize simple gestures
 No marker tracking

Gestures: Point, Move, Pick/Drop

Page 241: CHI 2013 DARE Course

Billinghurst and Duh 241

Multimodal Architecture

Page 242: CHI 2013 DARE Course

Billinghurst and Duh 242

Multimodal Fusion

Page 243: CHI 2013 DARE Course

Billinghurst and Duh 243

Hand Occlusion

Page 244: CHI 2013 DARE Course

Billinghurst and Duh 244

User Evaluation

 Change object shape, colour and position  Conditions

•  Speech only, gesture only, multimodal

 Measure •  performance time, error, subjective survey

Page 245: CHI 2013 DARE Course

Billinghurst and Duh 245

Experimental Setup

Change object shape and colour

Page 246: CHI 2013 DARE Course

Billinghurst and Duh 246

Results

 Average performance time (MMI, speech fastest) •  Gesture: 15.44s •  Speech: 12.38s •  Multimodal: 11.78s

 No difference in user errors  User subjective survey

•  Q1: How natural was it to manipulate the object? – MMI, speech significantly better

•  70% preferred MMI, 25% speech only, 5% gesture only

Page 247: CHI 2013 DARE Course

Future Directions

Page 248: CHI 2013 DARE Course

Billinghurst and Duh 248

Natural Gesture Interaction on Mobile

 Use mobile camera for hand tracking •  Fingertip detection
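For illustration, fingertip candidates can be estimated from a binary hand mask with OpenCV's contour and convex-hull functions; the distance threshold and the overall heuristic are assumptions for the sketch, not the system described above:

```cpp
#include <cmath>
#include <opencv2/opencv.hpp>
#include <vector>

// Estimate fingertip candidates: hull points of the largest contour that lie
// far from the rough palm centre. Thresholds are illustrative only.
std::vector<cv::Point> findFingertips(const cv::Mat& handMask) {
    std::vector<std::vector<cv::Point>> contours;
    cv::Mat work = handMask.clone();
    cv::findContours(work, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

    std::vector<cv::Point> tips;
    if (contours.empty()) return tips;

    size_t best = 0;                                   // assume the largest contour is the hand
    for (size_t i = 1; i < contours.size(); ++i)
        if (cv::contourArea(contours[i]) > cv::contourArea(contours[best])) best = i;

    cv::Moments m = cv::moments(contours[best]);
    if (m.m00 == 0) return tips;
    cv::Point centre(int(m.m10 / m.m00), int(m.m01 / m.m00));  // rough palm centre

    std::vector<int> hullIdx;
    cv::convexHull(contours[best], hullIdx, false, false);     // hull as contour indices

    for (int idx : hullIdx) {
        cv::Point p = contours[best][idx];
        double dx = p.x - centre.x, dy = p.y - centre.y;
        if (std::sqrt(dx * dx + dy * dy) > 60)                 // far enough to be a fingertip
            tips.push_back(p);
    }
    return tips;
}
```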

Page 249: CHI 2013 DARE Course

Billinghurst and Duh 249

Evaluation

 Gesture input was more than twice as slow as touch input  No difference in perceived naturalness

Page 250: CHI 2013 DARE Course

Billinghurst and Duh 250

Intelligent Interfaces

 Most AR systems are stupid •  Don’t recognize user behaviour •  Don’t provide feedback •  Don’t adapt to user

 Especially important for training •  Scaffolded learning •  Moving beyond check-lists of actions

Page 251: CHI 2013 DARE Course

Billinghurst and Duh 251

Intelligent Interfaces

 AR interface + intelligent tutoring system
•  ASPIRE constraint-based system (from the University of Canterbury)
•  Constraints: relevance condition, satisfaction condition, feedback
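A constraint in this style of tutor can be read as a relevance condition, a satisfaction condition and a feedback message. The sketch below is a generic illustration of that idea, with a hypothetical task state and constraint; it is not ASPIRE's actual API:

```cpp
#include <functional>
#include <iostream>
#include <string>
#include <vector>

// Hypothetical snapshot of the learner's current solution state.
struct TaskState { bool powerOff; bool panelRemoved; };

// If the relevance condition holds, the satisfaction condition must hold too;
// otherwise the feedback is shown. Generic sketch, not ASPIRE's API.
struct Constraint {
    std::function<bool(const TaskState&)> relevant;
    std::function<bool(const TaskState&)> satisfied;
    std::string feedback;
};

std::vector<std::string> evaluate(const TaskState& s, const std::vector<Constraint>& cs) {
    std::vector<std::string> messages;
    for (const auto& c : cs)
        if (c.relevant(s) && !c.satisfied(s))
            messages.push_back(c.feedback);   // corrective feedback for violated constraints
    return messages;
}

int main() {
    std::vector<Constraint> constraints = {
        { [](const TaskState& s) { return s.panelRemoved; },   // relevant once the panel is off
          [](const TaskState& s) { return s.powerOff; },       // the power should already be off
          "Switch the power off before removing the panel." }
    };
    for (const auto& msg : evaluate(TaskState{false, true}, constraints))
        std::cout << msg << "\n";
    return 0;
}
```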

Page 252: CHI 2013 DARE Course

Billinghurst and Duh 252

Domain Ontology

Page 253: CHI 2013 DARE Course

Billinghurst and Duh 253

Intelligent Feedback

 Actively monitors user behaviour •  Implicit vs. explicit interaction

 Provides corrective feedback

Page 254: CHI 2013 DARE Course

Billinghurst and Duh 254

Page 255: CHI 2013 DARE Course

Billinghurst and Duh 255

Evaluation Results

 16 subjects, with and without the ITS

 Improved task completion

  Improved learning

Page 256: CHI 2013 DARE Course

Billinghurst and Duh 256

Intelligent Agents

 AR characters •  Virtual embodiment of system •  Multimodal input/output

 Examples •  AR Lego, Welbo, etc •  Mr Virtuoso

–  AR character more real, more fun – On-screen 3D and AR similar in usefulness

Page 257: CHI 2013 DARE Course

Billinghurst and Duh 257

Contact Lens Display

 Babak Parviz •  University of Washington

 MEMS components •  Transparent elements •  Micro-sensors

 Challenges •  Miniaturization •  Assembly •  Eye-safe

Page 258: CHI 2013 DARE Course

Billinghurst and Duh 258

Contact Lens Prototype

Page 259: CHI 2013 DARE Course

Billinghurst and Duh 259

Conclusion

Page 260: CHI 2013 DARE Course

Billinghurst and Duh 260

Conclusion

 There is a need for better-designed AR experiences  Through

•  use of Interaction Design principles •  understanding of the technology •  use of rapid prototyping tools •  rigorous user evaluation

 There are a number of important areas for future research •  Natural interaction •  Multimodal interfaces •  Intelligent agents •  Novel displays

Page 261: CHI 2013 DARE Course

Billinghurst and Duh 261

More Information

•  Mark Billinghurst – [email protected]

•  Websites – www.hitlabnz.org

•  Henry Duh – [email protected]

Page 262: CHI 2013 DARE Course

Billinghurst and Duh 262

Resources

Page 263: CHI 2013 DARE Course

Billinghurst and Duh 263

Websites

 Meta List of AR SDKs •  http://www.icg.tugraz.at/Members/gerhard/augmented-reality-sdks

 ARToolKit Software Download •  http://artoolkit.sourceforge.net/

 ARToolKit Documentation •  http://www.hitl.washington.edu/artoolkit/

 ARToolKit Forum •  https://www.artoolworks.com/community/forum/

 ARToolworks Inc •  http://www.artoolworks.com/

Page 264: CHI 2013 DARE Course

Billinghurst and Duh 264

 ARToolKit Plus •  http://studierstube.icg.tu-graz.ac.at/handheld_ar/artoolkitplus.php

 osgART •  http://www.osgart.org/

 FLARToolKit •  http://www.libspark.org/wiki/saqoosha/FLARToolKit/

 FLARManager •  http://words.transmote.com/wp/flarmanager/

Page 265: CHI 2013 DARE Course
Page 266: CHI 2013 DARE Course

Billinghurst and Duh 266

AR Labs

 Europe •  TU Graz, Cambridge U, TU Munich, Fraunhofer IGD

 USA •  Columbia U, Georgia Tech, USC

 Asia •  KIST, KAIST •  AIST, Kyoto U, NAIST, U of Tsukuba •  NUS, UniSA, HITLab NZ

 Companies •  Qualcomm, Nokia, Layar, Wikitude, Metaio

Page 267: CHI 2013 DARE Course

Billinghurst and Duh 267

Books

  Interactive Environments with Open-Source Software: 3D Walkthroughs and Augmented Reality for Architects with Blender 2.43, DART 3.0 and ARToolKit 2.72 by Wolfgang Höhl

 A Hitchhiker's Guide to Virtual Reality by Karen McMenemy and Stuart Ferguson

 Bimber, Raskar. Spatial Augmented Reality (2005)

Page 268: CHI 2013 DARE Course

Billinghurst and Duh 268

 Books

Mobile Interaction Design, Matt Jones and Gary Marsden

Designing for Small Screens, Studio 7.5

Handheld Usability, Scott Weiss

Designing the Mobile User Experience, Barbara Ballard

Page 269: CHI 2013 DARE Course

Billinghurst and Duh 269

Publication venues   Conference

•  IEEE/ACM International Symposium on Mixed and Augmented Reality (IEEE/ACM ISMAR) (ismar.net)

•  IEEE Virtual Reality (IEEE VR) •  Korean-Japan Mixed Reality Workshop (KJMR)

  Journal •  IEEE Transaction on Visualization and Computer Graphics (IEEE) •  Computer & Graphics (Elsevier) •  PRESENCE (MIT Press)

  Papers •  Zhou, F., Duh, H. B. L., and Billinghurst, M. (2008). Trends in Augmented Reality Tracking, Interaction and Display: A Review of Ten Years of ISMAR. In IEEE International Symposium on Mixed and Augmented Reality (IEEE/ACM ISMAR), 193-202.

•  Azuma, R., Baillot, Y., Behringer, R., Feiner, S., Julier, S., MacIntyre, B. (2001). Recent Advances in Augmented Reality. IEEE Computer Graphics and Applications, 34-47.

Page 270: CHI 2013 DARE Course

Billinghurst and Duh 270

More Papers

  E. Kruijff, J. E. Swan, and S. Feiner. Perceptual issues in augmented reality revisited. 9th IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2010, pp. 3--12.

  D. Drascic and P. Milgram. Perceptual issues in augmented reality. In M. T. Bolas, S. S. Fisher, and J. O. Merritt, editors, SPIE Volume 2653: Stereoscopic Displays and Virtual Reality Systems III, pages 123-134, January/February 1996.


Page 271: CHI 2013 DARE Course

Billinghurst and Duh 271

Developer Guidelines

 Palm UI guidelines http://www.access-company.com/developers/documents/docs/ui/UI_Design.html

 Zen of Palm guidelines http://www.access-company.com/developers/documents/docs/zenofpalm.pdf

 Motorola http://developer.motorola.com/docstools/developerguides/

  iPhone Human Interface Guidelines http://developer.apple.com/documentation/iPhone/Conceptual/iPhoneHIG/

Page 272: CHI 2013 DARE Course

Billinghurst and Duh 272

Handheld HCI Design Websites

Do’s and Don’ts of PocketPC design http://www.pocketpcmag.com/_archives/Nov04/Commandements.aspx

Usability special interest group – handheld usability http://www.stcsig.org/usability/topics/handheld.html

Usable Mobile website http://www.smartgroups.com/groups/usablemobile

Mobile Coders Website http://www.mobilecoders.com/Articles/mc-01.asp

Univ of Waikato Handheld Group http://www.cs.waikato.ac.nz/hci/pdas.html

Mobile Interaction Website http://www.cs.waikato.ac.nz/~mattj/mwshop.html