Feb 02, 2016
UbiCom Book Slides
Ubiquitous computing: smart devices, environments and interaction
Chapter 5
Human Computer Interaction
Stefan Poslad
http://www.eecs.qmul.ac.uk/people/stefan/ubicom
HCI: Overview
This part (a) first discusses:
• What is Human Computer Interaction or Interfaces (HCI), and why do we need good HCI for human interactive systems?
• What is implicit HCI (iHCI), a sub-type of HCI; how is it differentiated from conventional explicit HCI (eHCI); and why do we need it to enhance pervasive computing?
• How to use eHCI in some common types of device
• How to use iHCI in (mobile and static) devices that are not permanently attached to humans
• How to use iHCI in (mobile and static) devices that accompany humans by being surface-mounted (wearable) or embedded (implants)
Chapter 5 Related Links
• iHCI is a type of context-awareness for the human environment (Chapter 7)
• Human behaviour models of intelligence (Chapter 8)
• Social & other consequences of making devices more human and more intelligent (Chapter 12)
HCI: Overview
The slides for this chapter are also expanded and split into several parts in the full pack:
Part A: eHCI use in some common smart device types
Part B: iHCI for accompanied smart devices
Part C: iHCI for wearable & implanted smart devices
Part D: Human Centred Design
Part E: User Models and iHCI Design
HCI: Overview
• HCI, eHCI & iHCI
• eHCI use in 4 Widely Used Devices
• iHCI use in accompanied smart devices
• iHCI use in wearable and implanted smart devices
• Human Centred Design (HCD)
• User Models: Acquisition & Representation
• iHCI Design
[Concept map of chapter topics: Smart Devices & Smart Services (mobile devices, games console, wearables & implants, clothes, telepresence); Basic Device eHCI (WIMPS, handheld WIMPS, touch-screen, input — softtap, multitap, T9, etc. — output, usability); iHCI with Devices (hidden iHCI, user context, user modelling, stereotypes, personalisation, goals versus situation, affective, (in)direct iHCI); interaction styles (multi-modal, tangible, gesture, organic, reflective, auditory, natural language, VR&AR, HUD, VRD, EyeTap video recorder, neural implants); design (HCD, interaction design, design patterns).]
HCI: Introduction
• The term HCI has been widely used since the onset of the Personal Computing era in the 1980s.
• However, the groundwork for the field of HCI started earlier, during the onset of the industrial revolution.
• Tasks became automated and power-assisted, which triggered an interest in studying human-machine interaction.
• Some tasks require little human interaction during operation, e.g., clothes-washing, dish-washing, etc.
• Other tasks are very interactive, e.g., face washing, playing the violin, etc.
H, C & I
Basic concepts of HCI are:
• Humans
• Computers / devices
• Interaction
HCI: Motivation
• Machines (systems) aid human performance, but systems that interact poorly with humans will be a poor human aid.
• Need design models & processes that are (user) interactive
• The motivation for HCI is clear: to support more effective use (Dix, 2004a) in three ways:
– Useful:
– Usable:
– Be used:
HCI: Usability vs. Usefulness
• Success of a product depends largely on ?
• Summarised as Heckel's law and Heckel's inverse law:
– Heckel's law:
– Heckel's inverse law:
• What does this law express?
Explicit HCI (eHCI)
• eHCI design: explicit interaction during a device’s normal operation.
• What are the dominant eHCI UIs?
Pure eHCI:
• Context-free
• Focus on H2C (Human-to-Computer) Interaction
eHCI versus Natural Interaction
• Natural interaction
• Natural interaction and familiarity and expertise
• Familiarity with use of tool is cultural and subjective
• Note also Natural Interaction linked to use of iHCI
iHCI
• Concept of implicit HCI (iHCI)
• Proposed by Schmidt (2000)
– Defined as “an action, performed by the user that is not primarily aimed to interact with a computerized system but which such a system understands as input”.
• Our definition of iHCI is a bit different:
– inputs with an implicit or implied context,
iHCI
• iHCI is more about C2H (Computer-to-Human) Interaction
• iHCI assumes C has a certain model of the H user
• The model of H is used as an additional input
• Need to share implicit context between human and system
• Implicit interaction naturally supports hidden device design.
eHCI + iHCI or iHCI vs eHCI
• E.g.??
• eHCI, usability design?
• Alternative iHCI design?
• The shift from eHCI design to also include iHCI design will be a key enabler for effective UbiCom systems
iHCI: Challenges
• Complex to accurately and reliably determine user context. Why?
Overview
• HCI, eHCI & iHCI
• eHCI use in 4 Widely Used Devices
• iHCI use in accompanied smart devices
• iHCI use in wearable and implanted smart devices
• Human Centred Design (HCD)
• User Models: Acquisition & Representation
• iHCI Design
How Device Interfaces & Interaction Varies
Devices can be characterised according to what?
UI and HCI Designs for 4 Common Devices
• PC
• Mobile Phone
• Games Console (but many sub-types)
• TV / Projectors
• How does the UI and HCI design differ between these?
UI Type: Personal Computer Interface
• ???
PC UI use in Mobiles
• Using a conventional PC UI approach won't be optimal for mobile computing & ubiquitous computing; a different approach is needed. Why?
UI Type: Mobile Device Interfaces
• PC / WIMPS models are not so suitable for mobile (one-handed) devices. Why not?
Mobile Device Interface: Limited I/P
How to support a mobile user and the small size of the input?
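One classic answer (the "multitap" and "T9" input methods noted earlier in the chapter) is multi-tap entry on the standard 12-key keypad: pressing a key repeatedly cycles through its letters. A minimal sketch; the letter layout follows the common phone keypad assignment, and the function names are illustrative:

```python
# Multi-tap text entry on a 12-key phone keypad.
KEYPAD = {
    '2': 'abc', '3': 'def', '4': 'ghi', '5': 'jkl',
    '6': 'mno', '7': 'pqrs', '8': 'tuv', '9': 'wxyz',
}

def multitap(key: str, presses: int) -> str:
    """Return the letter produced by pressing `key` `presses` times (wraps around)."""
    letters = KEYPAD[key]
    return letters[(presses - 1) % len(letters)]

def decode(sequence):
    """Decode a list of (key, presses) pairs into text."""
    return ''.join(multitap(k, n) for k, n in sequence)
```

For example, `decode([('4', 2), ('4', 3)])` produces `'hi'`; predictive schemes such as T9 reduce the press count by disambiguating one press per key against a dictionary.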
Mobile Device Interface: Limited O/P
How to overcome limited output?
• Haptic interface use, e.g., vibration to signal an incoming call
• Maximising use of a small screen: scrolling, switching screens
• Peephole displays
• Foldable displays
• Filter information so less information is received and displayed, e.g., using Personalisation (Chapter 7) and Personal Agents (Chapter 8)
UI Type: Games Console Interfaces
• Games consoles are an important driver and can contribute to UbiCom in a number of ways.
• Computer games have often acted as an incubator for many innovations driving computing. How?
• Many different types of games console interface
Games Console Interfaces: D-pad
• How does the D-pad controller work?
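In miniature: a D-pad is four contact switches (up, down, left, right) under a rocker; pressing it closes one switch, or two adjacent ones for a diagonal, and the console polls the switch state and maps it to one of eight directions. A hedged sketch; the bit layout is an assumption for illustration, not any specific console's protocol:

```python
# D-pad state as four switch bits; adjacent pairs give the diagonals.
UP, DOWN, LEFT, RIGHT = 1, 2, 4, 8

DIRECTIONS = {
    UP: 'N', DOWN: 'S', LEFT: 'W', RIGHT: 'E',
    UP | LEFT: 'NW', UP | RIGHT: 'NE',
    DOWN | LEFT: 'SW', DOWN | RIGHT: 'SE',
}

def read_dpad(state: int) -> str:
    """Map a 4-bit switch state to a compass direction ('' = centred or invalid)."""
    return DIRECTIONS.get(state, '')
```

Opposing combinations (e.g., up + down) are mechanically excluded by the rocker, so they fall through to the empty result.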
Games Console Interfaces: 3D Gesture-Based
• How does the 3D Gesture-Based controller work?
• Use of MEMS/ Sensors (Chapter 7)
• Use of gesture recognition (see later)
UI Type: Control (Panel) Interfaces
• Different types of remote controller, depending on how remote the controller is:
• User approximately co-located with the device being controlled
• User not co-located with the device being controlled
UI Type: Localised Remote Control Interfaces
Characteristics
• Input controller and device separation
• Input device interfaces
• Wireless link between the input control device and the device
UI Type: Localised Remote Control Interfaces
• But there is a profusion of remote control devices with overlapping features
• Is it necessary to have a specialised controller per consumer device?
• Problems?
• How to solve this?
[Figure: universal remote control mock-up — a 12-key keypad (1, ABC2, DEF3, GHI4, JKL5, MNO6, PQR7, STU8, WXYZ9, *, 0+, #), a clock (17:38), source tabs (Radio, TV, DVD-W) and controls such as Play channel, Record channel, Channel 4 and Channel 5.]
Localised Remote Control Interface Design
• Instructors can add more detail about the discussion and design of universal controller here or delete this slide.
• (Section 5.2.5)
Overview
• HCI, eHCI & iHCI
• eHCI use in 4 Widely Used Devices
• iHCI use in accompanied smart devices
• iHCI use in wearable and implanted smart devices
• Human Centred Design (HCD)
• User Models: Acquisition & Representation
• iHCI Design
iHCI use in Accompanied Smart Devices: Topics
• Single vs. Multi-Modal Visual Interfaces
• Gesture Interfaces
• Reflective versus Active Displays
• Combining Input and Output User Interfaces
– ???
• Auditory Interfaces
• Natural Language Interfaces
Single vs. Multi-Modal Visual Interfaces
• A mode of human interaction uses human senses. Which ones?
• Interactive ICT systems have modalities that mimic human senses. Which ones?
Computer input & output modalities
[Figure: computer input & output modalities — the human interface exchanges input with the computer via computer sensors, and receives computer output.]
Single vs. Multi-Modal Visual Interfaces
Many interactive ICT systems use a single, visual mode of output interaction. Problems?
Solutions?
Multi-Modal Interaction Design: challenges
Integrating multiple modes is complex. Why?
Multi-Modal Interaction: Design
Two main approaches:
• Data for each modality can be processed separately, then combined at the end (late fusion).
• Data for each modality can be processed & combined concurrently (early fusion).
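The first approach, where each modality is processed separately and only the results are combined, can be sketched as follows: each recogniser produces per-command confidence scores independently, and the scores are merged at the decision stage. A toy illustration with made-up scores, not a real recogniser:

```python
def fuse_late(modality_scores):
    """Combine per-modality confidence scores by summing, then pick the best command."""
    combined = {}
    for scores in modality_scores:          # one score dict per modality
        for command, confidence in scores.items():
            combined[command] = combined.get(command, 0.0) + confidence
    return max(combined, key=combined.get)

# Hypothetical outputs of a speech and a gesture recogniser for the same instant:
speech  = {'zoom_in': 0.6, 'zoom_out': 0.3}
gesture = {'zoom_in': 0.7, 'rotate': 0.2}
best = fuse_late([speech, gesture])   # 'zoom_in': 0.6 + 0.7 beats every other command
```

The concurrent (early-fusion) alternative would instead feed the raw or low-level features of both modalities into a single joint recogniser.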
Gesture Interfaces
What are Gestures?
• Expressive, meaningful body motions
• Involving physical movements. Which?
• With the intent of conveying meaningful information about interacting with the environment.
Gesture Interfaces
• What are the main types of human gestures?
• How can gestures be sensed?
Gesture Interfaces: Classification
Gestures can also be classified into:
• 2D versus 3D
• Contact-based versus contactless
• Directly sensed versus indirectly sensed
Gesture Interfaces: Applications
• What were the first basic contact-based gesture interfaces?
• From the mid-2000s, contactless gestures have been used in several types of games consoles, mobile phones, cameras, etc.
Gesture Interfaces: Applications
• ?????
Gesture: rotate or flip hand → Action: rotate or flip image
Gesture Interfaces: Applications
Gesture: tilt display away → Action: menu selection moves up
Gesture Interfaces: Applications
[Figure: a "Navigation Options" menu (Find cinemas, Find Restaurants, Find Cafes, Find Newsagent, Find Bookshop) shown before and after tilting the display.]
Gesture: two-finger stretch → Action: stretch image
Gesture Interfaces: Applications
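The two-finger stretch gesture reduces to simple geometry: the image scale factor is the ratio of the current to the initial distance between the two touch points. A minimal sketch:

```python
import math

def pinch_scale(p1_start, p2_start, p1_now, p2_now):
    """Scale factor implied by a two-finger stretch/pinch gesture."""
    d0 = math.dist(p1_start, p2_start)   # initial finger separation
    d1 = math.dist(p1_now, p2_now)       # current finger separation
    return d1 / d0                        # >1 stretch (zoom in), <1 pinch (zoom out)
```

For example, fingers that start 100 px apart and end 150 px apart give a 1.5x zoom.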
Gesture Interfaces: HCI->HPI->HHI->HCI
Human to virtual device interaction
Human to physical artefact interaction
Human to human physical interaction
Human to human physical interaction triggers machine to machine interactions
Gesture Design: Challenges
• ???
Reflective vs Active Displays
• Which is more pervasive today, and which will be more pervasive in the future: paper or active display devices?
• What are the inherent characteristics of paper versus active displays, and how do these affect their ability to become truly pervasive?
Reflective versus Active Displays
• Can we produce ICT displays that support more of the properties of physical paper?
• Display designs that mimic paper
• E-paper display design differs from actual paper
ElectroPhoretic Displays or EPDs
[Figure: an electrophoretic display micro-capsule — positively charged white particles and negatively charged black particles suspended in a clear fluid between a transparent electrode (viewing side) and a charged electrode; the applied polarity determines which particles move to the viewing surface.]
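The micro-capsule behaviour can be captured in a toy model: with the white particles positively charged and the black ones negatively charged, as in the figure, a negative voltage on the viewing electrode attracts the white particles to the surface and the pixel shows white; reversing the polarity shows black. This is a sketch of the principle only, not of any specific display driver:

```python
def epd_pixel(viewing_electrode_polarity: str) -> str:
    """Colour shown by one micro-capsule given the viewing-side electrode polarity.

    Opposite charges attract: a negative viewing electrode pulls the positively
    charged white particles to the surface, and vice versa.
    """
    if viewing_electrode_polarity == '-':
        return 'white'      # + white particles drawn to the - electrode
    elif viewing_electrode_polarity == '+':
        return 'black'      # - black particles drawn to the + electrode
    return 'unchanged'      # no field applied: the bistable display keeps its state
```

The `'unchanged'` branch reflects why e-paper is so power-efficient: it draws power only when a pixel changes.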
Combining Input and Output User Interfaces
• In the UIs discussed so far, input devices are separated from the output devices
• The state of the input is available as a visual cue only.
• How can we combine / link input and output better?
Touchscreen
What are touchscreens?
• Displays where the position of contact with the screen is detected
• Via pointed physical objects such as pens, fingers, etc.
• Events can then be generated for an associated visual object at that position, and
• Associated actions can then be triggered.
Touchscreen
• A touchscreen behaves as a 2D, planar smart skin.
• Wherever it is touched, a virtual object can be activated.
• Types of touchscreens?
– Resistive
– Capacitive
– Surface acoustic wave, etc.
• A touchscreen can behave as a:
– soft control panel and user interface
– that is reprogrammable
– and can be customised to suit a range of applications and users
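The "soft control panel" idea is essentially hit-testing: the touch coordinates are checked against the rectangles of the on-screen widgets, and the matching widget's action fires. A minimal, framework-free sketch; the button layout is hypothetical:

```python
# Each soft button: (x, y, width, height, action name). Reprogrammable at runtime.
buttons = [
    (0,   0, 100, 50, 'play'),
    (100, 0, 100, 50, 'record'),
]

def on_touch(x, y, panel):
    """Return the action of the first button containing the touch point, else None."""
    for bx, by, bw, bh, action in panel:
        if bx <= x < bx + bw and by <= y < by + bh:
            return action
    return None
```

Reprogrammability comes for free: replacing the `buttons` list re-skins the panel for a different application or user without any hardware change.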
Touchscreen: Benefits
What are the benefits?
These characteristics make them ideal for many workplaces and public spaces.
Touchscreen: Applications
• Touchscreens are used routinely in many applications & devices
– ??
• To ease the use of pointing
• To ease the use of gestures
• Single versus multiple finger gestures
Tangible User Interface (TUI)
• A TUI is a UI that augments the real physical world by coupling digital information to everyday physical objects and environments.
• Tangible user interfaces are also referred to as:
– passive real-world props
– graspable user interfaces
– manipulative user interfaces
– embodied user interfaces
Tangible User Interface (TUI)
How do Tangible Interfaces work?
• Attach micro sensors and actuators (Section 6.4) to physical objects
• These are used as input devices, so that their manipulation generates data streams in an output device or a virtual view in a related virtual environment (Section 6.2).
Tangible User Interface (TUI)
• A taxonomy of TUIs is based upon embodiment and metaphors
• Four types of embodiment can be differentiated:
– Full embodiment, e.g., ??
– Nearby embodiment, e.g., ??
– Environmental embodiment, e.g., ???
– Distant embodiment, e.g., ???
Tangible Bits Project
• Instructors can explain in more detail how this works or delete this slide
DataTiles Project
• Allows users to manipulate data in the form of tangible "tiles"
• Combinations of data streams and functions make it possible to create new applications
DataTiles Project
• Instructors can explain in more detail how this works or delete this slide
Organic Interfaces
• Similar to Tangible Interfaces
• 3 characteristics define organic UIs
• Typically use Organic Light-Emitting Diode (OLED) type materials
Organic Interfaces
• Instructors can add more detail about this or delete this slide
Auditory Interfaces
What are the Benefits?
Design challenges?
•
Auditory Interfaces: Non-Speech Based
2 basic auditory interfaces:
• Speech-based
• Non-speech-based
Non-speech auditory interfaces:
• ?????
Auditory Interfaces: Speech Based
• ????.
Natural Language Interfaces
• Natural language interaction with machines can occur in a variety of forms. Which?
Natural Language Interfaces
• Generally, interaction can be more easily processed and understood if it is defined using an expressive language that has a well-defined syntax or grammar and semantics
– but this requires that users already know the syntax.
Benefits of using NL in HCI?
Natural Language Interfaces: Challenges
• What are the challenges in using NL Interfaces (NLI)?
Overview
• HCI, eHCI & iHCI
• eHCI use in 4 Widely Used Devices
• iHCI use in accompanied smart devices
• iHCI use in wearable and implanted smart devices
• Human Centred Design (HCD)
• User Models: Acquisition & Representation
• iHCI Design
Hidden UI via Wearable and Implanted Devices
• In the Posthuman model, technology can be used to extend a person's normal conscious experience and sense of presence, across space and time.
There are 3 types of post-human technology:
• Accompanied, e.g., ???
• Wearable, e.g., ???
• Implants, e.g., ???
Wearable computers
• Wearable interfaces include a combination of ICT devices & modalities
• Wearable computers are especially useful when?
• Focus is on multi-modal interaction which includes visual interaction.
Wearable computers
• Visual modal systems are divided according to how humans interact with the system:
– ??
• Visual interaction can be classified into:
– command interfaces
– non-command interfaces
• Non-command vision-based (human motion) analysis systems generally have four stages:
– motion segmentation
– object classification
– tracking
– interpretation
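The four stages form a sequential pipeline: segment moving regions from each frame, classify each region, track the objects over time, then interpret the tracks as gestures. A skeleton with stub stages; all the names and the toy "wave" rule are illustrative, and a real system would use computer-vision code at each step:

```python
def motion_segmentation(frame):
    """Stub: extract moving regions (here a frame is already a list of region dicts)."""
    return [r for r in frame if r.get('moving')]

def object_classification(regions):
    """Stub: label small regions as hands, larger ones as bodies."""
    return [dict(r, label='hand' if r['size'] < 50 else 'body') for r in regions]

def tracking(objects, history):
    """Stub: a real tracker would match objects across frames; here we just append."""
    history.append(objects)
    return history

def interpretation(history):
    """Stub: a hand present in every tracked frame is read as a 'wave' gesture."""
    if history and all(any(o['label'] == 'hand' for o in frame) for frame in history):
        return 'wave'
    return None

def analyse(frames):
    history = []
    for frame in frames:
        tracking(object_classification(motion_segmentation(frame)), history)
    return interpretation(history)
```

The value of the staged structure is that each stage can be improved or swapped (e.g., a better tracker) without touching the others.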
Wearable Computer: WearComp and WearCam
• Many researchers contributed to the advancement of wearable computing
• Perhaps the most important pioneer of wearable computing is Steve Mann
• His first main application focussed on recording personal visual memories that could be shared with others via the Internet.
Wearable Computer: WearComp and WearCam
(Photo courtesy of Wikimedia Commons, http://en.wikipedia.org/wiki/Wearable_computing)
Wearable computing: Mann’s definition
Mann (1997): 3 criteria to define wearable computing:
• Eudaemonic criterion
• Existential criterion
• Ephemeral criterion
Wearable computing: Types
• What are some different types of wearable computer?
• N.B. Not all of these meet Mann's criteria
Head(s)-Up Display or HUD:
• Presents data without blocking the user's view
• Pioneered for military aviation; now used in commercial aviation and cars
• 2 types of HUD:
– Fixed HUD
– Head-mounted HUD
EyeTap & Virtual Retinal Display
• Instructors can add more detail about these here or delete this slide.
Brain Computer Interface (BCI) or Brain Machine Interfaces (BMI)
• HCI focuses on indirect interfaces from the human brain via human actuators
• BCIs are direct functional interfaces between brains and machines
• BCI represents the ultimate natural interface
• Would you choose to make use of one when they become available in the future?
Brain Computer Interface (BCI) or Brain Machine Interfaces (BMI)
• Direct vs. indirect coupling design choices??
– See also BANs in Chapter 11
• Brain versus nerve direct coupling design choices??
Computer Implants
• The opposite of wearing computers outside the body is to have them more directly interfaced to the body.
• Many people routinely use implants
– ????
• Of specific interest is developing devices that can adapt to signals in the human nervous system.
• By connecting electronic circuitry directly to the human nervous system,
– ???
Cyborg 2
An electrode array, surgically implanted into Warwick's left arm and linked into median nerve fibres, is monitored.
BCI
• Instructors can add more detail about experiments here or delete this slide
PostHuman Model
• Use of alternative technology-mediated realities
• A feeling of presence in the experience provides feedback to a person about the status of his or her activity.
• The subject perceives any variation in the feeling of presence and tunes his or her activity accordingly.
PostHuman Model and Reality
• People can experience alternative realities depending on:
– the type of environment people are situated in
– their perception of the environment
• Reality can be:
– Technology mediated, e.g., ???
– Chemically mediated, e.g., ???
– Psychologically mediated, e.g., ???
Realities: VR, AR and MR
• (Revision of Section 1.2.3.3)
Virtual Reality (VR)
• VR seeks to immerse a physical user in a virtual 3D world
• VR uses a computer simulation of a subset of the world and immerses the user in it using UIs based upon:
– ??
• VR seeks to enable humans to interact using the more natural interaction that humans use in the real world
– ??
Augmented Reality (AR)
• Electronic images are projected over the real world so that images of the real and virtual world are combined.
• Is VR considered a subset of AR?
• Early example: the head-mounted display by Sutherland (1968).
• Similar systems are in use today in types of military aircraft.
Telepresence & Telecontrol
• Telepresence allows a person in one local environment to:
– ??
• Telecontrol refers to the ability of a person in one place to:
– ???
Overview
• HCI, eHCI & iHCI
• eHCI use in 4 Widely Used Devices
• iHCI use in accompanied smart devices
• iHCI use in wearable and implanted smart devices
• Human Centred Design (HCD)
• User Models: Acquisition & Representation
• iHCI Design
Conventional Design versus HCD
Conventional Functional System Design
[Figure: conventional functional system design — a linear flow from new product need → requirements analysis → design → implement → validate → final product.]
Human Centred Design (HCD)
• Focus on the types of UbiCom system & environments
• Need to make the type of user explicit: human users
• In contrast: automatic / autonomous systems
Human Centred Design (HCD)
ISO standard human centred design life-cycle involves 4 main sets of activities:
1. Define context of use
2. Specify stake-holder and organisational requirements
3. Multiple alternative (UI) designs need to be built.
4. Designs need to be validated against user requirements.
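These four activities are iterated rather than run once: alternative designs are produced and validated against the user requirements, and failed validations feed back into the earlier activities. A schematic sketch of that loop; everything here is illustrative structure, not the ISO text, and the context/requirements shapes are invented:

```python
def human_centred_design(use_context, produce_designs, validate, max_iterations=5):
    """Iterate HCD activities 3-4 until some design satisfies the requirements."""
    requirements = use_context['requirements']        # activities 1-2: context -> requirements
    for _ in range(max_iterations):
        for design in produce_designs(requirements):  # activity 3: multiple alternatives
            if validate(design, requirements):        # activity 4: validate vs requirements
                return design
    return None                                       # no satisfying design found
```

For example, with a requirement of at most 2 buttons, a candidate list `[{'buttons': 5}, {'buttons': 1}]` yields the one-button design.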
Human Centred Design (HCD)
[Figure: the human-centred design cycle — identify the need for interactive design → understand & specify the use context → identify stakeholder & organisational requirements → produce design solutions → validate designs against requirements; iterate until the system satisfies the requirements.]
A Fuller Range of System & User Requirements / Use Contexts
• HCD system & user requirements
→ wider requirements than back-end functional requirements
• HCD methodologies are a powerful way to capture the wide range of environment requirements & use contexts for UbiCom systems
• What is the fuller range of UbiCom / HCD requirements?
A Fuller Range of System & User Requirements / Use Contexts
• System
– ???
• Physical Environment
– ???
• Users
– Types
– Tasks & goals
– User interface
• Social
– ???
• Usability & user experience:
– Usability:
– User experiences:
HCD: Use Context / Requirements
[Figure: HCD use contexts / environment requirements surrounding a UbiCom system — human (user, usability, social), physical (physical operating context, e.g., dark versus light conditions), and ICT/virtual (functional requirements ↔ tasks; non-functional requirements ↔ system spec; internal: storage, display, etc.; external: storage, QoS, network, etc.).]
HCD: Usability as a User Requirement
• Usability is defined as ??
• Usability is not a single, one-dimensional property of a user interface.
• Usability is a combination of factors. ISO 9241-11 explicitly mentions a number of factors
– ???
• These usability factors can often be expanded into further sub-properties.
HCD: Stake-Holders
• The end-user is the obvious stake-holder in HCD design
• Who are the other stake-holders in the personal memory scenario?
• Are there additional stake-holder requirements?
HCD: Acquiring User Context/User Requirements
Several dimensions for gathering user requirements during the HCD life-cycle:
• In controlled conditions (lab) vs. in the field
• Direct user involvement (e.g., interview, questionnaire) vs. indirect (e.g., observations)
• Individual users vs. user groups; HCI / domain experts vs. predictive user models (no users)
HCD: Methods to Acquire User Requirements
Which methods?
• ?????
Analysis of the data gathered depends on:
• Amount of time, level of detail, uncertainty, etc.
• Knowledge the analysis requires
Usability Requirements & Use Contexts Examples
• For each of the scenarios in chapter 1, e.g., the personal video memories, define the use context and usability requirements.
HCI / HCD versus User Context Awareness
• Are these the same or similar concepts?
See User context awareness (Chapter 7)
See HCI / HCD (Chapter 5)
HCD: System Model for Users vs. Users’ Model of System
• What model of the system does it project to the user?
• What model does the user have of the system?
• What if these models differ?
HCD Design: Conceptual Models & Mental Models
• There is an amazing number of everyday things & objects
– ????
• It is very challenging for people to learn to operate and understand many devices of varying degrees of complexity if the interaction with each of them is unique.
• The complexity of interacting with new machines can be reduced. How?
HCD Design: Conceptual Models
• Discuss some example conceptual models
HCD Design: Affordances
• The complexity of interacting with new systems is reduced if:
– they have parts that provide strong clues on how to operate them.
• These are referred to as affordances
• What are examples of physical UI affordances?
HCD Design: Virtual Affordances
• Many analogue physical objects are being replaced by virtual computer UIs
• Virtual UI affordances are becoming increasingly important.
• How to design virtual UI affordances?
• Can link virtual objects or widgets to related & familiar physical-world objects
• Challenges in linking widgets to familiar physical objects?
HCD: Multiple Prototype Designs
• Example: consider the PVM scenario (Chapter 1)
• What type of design?
• Is there only one type of design for recording / playing / transmitting multimedia?
– ?
• Consider the requirements:
HCD: Evaluation
• Summative versus formative evaluation
• Summative
– Conventional
– To verify the design
• Formative
– HCD
HCD: System Evaluation Methods
• Can use similar techniques to gathering user requirements.
Overview
• HCI, eHCI & iHCI
• eHCI use in 4 Widely Used Devices
• iHCI use in accompanied smart devices
• iHCI use in wearable and implanted smart devices
• Human Centred Design (HCD)
• User Models: Acquisition & Representation
• iHCI Design
User Modelling: Design Choices
• Implicit vs. explicit models
• User instance (individual) modelling versus user (stereo)type modelling
• Static versus dynamic user models
• Generic versus application-specific models
• Content-based versus collaborative user models
User Modelling Design: Implicit vs. Explicit models
• Systems can use either
  – Explicit feedback
  – Implicit feedback
• Often these can be combined. How?
• Some specific techniques for acquiring a user model are described in more detail elsewhere (Section 5).
• Hybrid user models may also be used.
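A hybrid user model can be sketched as a simple weighted blend of both feedback sources. The signal names and the 0.7 weighting below are illustrative assumptions, not from the book.

```python
# Sketch of a hybrid user model: an explicit rating is combined with an
# implicit feedback signal (e.g. normalised dwell time) into a single
# preference score per item. Weights and signal names are assumptions.

def hybrid_preference(explicit_rating, implicit_signal, w_explicit=0.7):
    """Blend an explicit rating and an implicit signal, both in [0, 1]."""
    if explicit_rating is None:        # no rating given: rely on implicit feedback
        return implicit_signal
    return w_explicit * explicit_rating + (1 - w_explicit) * implicit_signal

# A user rated item 'a' highly but barely viewed 'b'; 'c' has only implicit data.
profile = {
    "a": hybrid_preference(0.9, 0.8),
    "b": hybrid_preference(0.2, 0.1),
    "c": hybrid_preference(None, 0.6),
}
print(profile)
```

Falling back to the implicit signal when no rating exists is one simple way to combine the two sources; real systems typically learn the weighting from data.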
Ubiquitous computing: smart devices, environments and interaction 118
Indirect User Input and Modelling
Benefits?
Methods?
• See previous slides
• Accuracy & precision?
• Handling inaccuracy & imprecision
Ubiquitous computing: smart devices, environments and interaction 119
Direct User Input and Modelling
Benefits versus Challenges?
User requirements & user model built using:
• Single-shot versus multi-shot user input
• Static versus dynamic input
Also need to consider user model maintenance
Ubiquitous computing: smart devices, environments and interaction 120
User Stereotypes
• Challenge in bootstrapping a user model / behaviour leads to use of group behaviour
• A stereotype infers a user model from a small number of facts about the user, using a larger set of facts from a group user model
• Used by collaborative-type user models, e.g., recommender systems
• Challenges?
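Stereotype-based bootstrapping can be sketched as matching a few known facts about a new user against group profiles whose defaults then seed the individual model. The stereotypes and attributes below are invented for illustration.

```python
# Sketch of stereotype bootstrapping: a new user is matched to a group
# stereotype via a few known facts, and the group's defaults seed the
# individual user model. Stereotype names and attributes are invented.

STEREOTYPES = {
    "novice_photographer": {
        "facts": {"owns_dslr": False},
        "defaults": {"ui_mode": "auto", "show_hints": True},
    },
    "expert_photographer": {
        "facts": {"owns_dslr": True},
        "defaults": {"ui_mode": "manual", "show_hints": False},
    },
}

def bootstrap_user_model(known_facts):
    """Return the defaults of the first stereotype whose facts all match."""
    for name, stereotype in STEREOTYPES.items():
        if all(known_facts.get(k) == v for k, v in stereotype["facts"].items()):
            return dict(stereotype["defaults"], stereotype=name)
    return {"stereotype": None}   # no match: start with an empty model

print(bootstrap_user_model({"owns_dslr": True}))
```

The challenge noted above shows up here directly: a wrong or missing fact matches the wrong stereotype (or none), so the bootstrapped model must be refined as individual behaviour is observed.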
Ubiquitous computing: smart devices, environments and interaction 121
Modelling Users’ Planned Tasks and Goals
• Users often interact purposely with a system in a task-driven way, to achieve a particular goal.
• Several ways to analyse and model user tasks:
  – Hierarchical Task Analysis or HTA
  – Etc.
• Consider each scenario in Chapter 1, e.g., PVM scenario, give a user task / goal model (next slide)
Ubiquitous computing: smart devices, environments and interaction 122
HCD: Functional Requirements
Ubiquitous computing: smart devices, environments and interaction 123
0: Record a physical world scene
  1: Switch on camera
  2: Set camera task mode
  3: Select scene
    3.1: Move towards scene
    3.2: View scene
    3.3: Fix scene
  4: Configure camera shot of scene
    4.1: Set zoom
    4.2: Set lighting correction
      4.2.1: Set flash
      4.2.2: Set under-exposure
      4.2.3: Set over-exposure
  5: Compose scene
  6: Record scene
  7: Check recording
  8: Switch off camera
Plan 0: Do 1..2; repeat 3..7 until no more recordings needed or no more power, then do 8
Plan 1: Repeat 3.1, 3.2 until satisfied, then do 3.3
Plan 2: Repeat 4.1 until satisfied, then do 4.2
Plan 3: Do 4.2.1 or 4.2.2 or 4.2.3
For the Photographer in the PVM scenario (Chapter 1)
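HTA plans are essentially control flow, so the top-level plan can be sketched as a simple loop. The step function names below are hypothetical stand-ins for the numbered sub-tasks.

```python
# Sketch of HTA Plan 0 for the camera task model: do tasks 1..2, repeat
# 3..7 until no more recordings are needed or power runs out, then do 8.
# The task names are illustrative stand-ins for the numbered sub-tasks.

def run_plan_0(recordings_needed, power):
    log = ["switch_on", "set_task_mode"]          # tasks 1..2
    while recordings_needed > 0 and power > 0:    # repeat tasks 3..7
        log += ["select_scene", "configure_shot", "compose",
                "record", "check_recording"]
        recordings_needed -= 1
        power -= 1
    log.append("switch_off")                      # task 8
    return log

print(run_plan_0(recordings_needed=2, power=5))
```

Plans 1–3 would similarly become inner loops or conditionals inside the `select_scene` and `configure_shot` steps.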
Multiple User Tasks and Activity Based Computing
• Use tasks as part of activities that require access to services across multiple devices
• Devices can be used by different types of people
• Users are engaged in multiple concurrent activities
• Users are engaged in activities which may occur across multiple physical environments
• Activities may be shared between participants
• Activities on occasion need to be suspended and resumed
• (See Chapter 12)
Ubiquitous computing: smart devices, environments and interaction 124
Situation Action versus Planned Action Models
• 2 basic approaches to task design
• Planned actions:
  – ????
• Situated action:
  – ???
Ubiquitous computing: smart devices, environments and interaction 125
Models of Human Users: HCI vs. AI
• Field of HCI proposes models of humans that focus on supporting high-level usability criteria and heuristics
  – Focus is less on explicit computational models of how humans think and act
• Field of AI proposes models of humans that make explicit computational models to simulate how humans think, act and interact
  – (Chapters 8 and 9)
Ubiquitous computing: smart devices, environments and interaction 126
Overview
• HCI, eHCI & iHCI
• eHCI use in 4 Widely Used Devices
• iHCI use in accompanied smart devices
• iHCI use in wearable and implanted smart devices
• Human Centred Design (HCD)
• User Models: Acquisition & Representation
• iHCI Design
Ubiquitous computing: smart devices, environments and interaction 127
iHCI
• iHCI Model Characteristics
• User Context Awareness
• Intuitive and Customised Interaction
• Personalisation
• Affective Computing
• iHCI Design Heuristics and Patterns
Ubiquitous computing: smart devices, environments and interaction 128
Types of User Model
• Several related terms & kinds of user model are differentiated
• User Models
• Personal Profiles
• User Contexts
• Application / User Requirements
• System Models
• Mental Models
• Conceptual Models
Ubiquitous computing: smart devices, environments and interaction 129
User Context Awareness
• User context awareness can be exploited to beneficially lessen the degree of explicit HCI needed.
• User context-awareness is a sub-type of general context-awareness (Chapter 7)
User context-awareness can include:
• Social environment context
• Users’ physical characteristics and capabilities for HCI
• User presence in a locality or detected activity
• User identity (Section 12.3.4)
• User planned tasks and goals (Section 5.6.4)
• Users’ situated tasks (Sections 5.6.5, 5.6.6)
• User emotional state (Section 5.7.5)
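How user context can lessen explicit HCI can be sketched as a small rule set that picks an interaction modality from detected context instead of asking the user. The context fields and rules below are illustrative assumptions.

```python
# Sketch of context-aware adaptation: the UI selects an output modality
# from detected user context rather than via explicit user input.
# Context keys, thresholds and rules are illustrative assumptions.

def choose_modality(context):
    """Map a user-context dict to an output modality."""
    if context.get("activity") == "driving":
        return "audio"                    # hands and eyes are busy
    if context.get("ambient_noise_db", 0) > 80:
        return "visual"                   # speech output would be drowned out
    return "visual+audio"                 # default: multimodal

print(choose_modality({"activity": "driving"}))
print(choose_modality({"activity": "walking", "ambient_noise_db": 90}))
```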
Ubiquitous computing: smart devices, environments and interaction 130
Intuitive and Customised Interaction
Are current computer systems, dominated by MTOS-based devices & the desktop UI metaphor, intuitive?
• E.g., ??
• E.g., ??
• E.g., etc.
Ubiquitous computing: smart devices, environments and interaction 131
Intuitive and Customised Interaction
Moran & Zhai propose 7 principles to evolve desktop model into more intuitive model for UbiCom
• From Office Container to Personal Information Cloud
• From desktop to a diverse set of visual representations
• From interaction with 1 device to interaction with many
• From mouse & keyboard to many interactions & modalities
• Functions may move from applications to services
• From personal to interpersonal to group to social
• From low-level tasks to higher level activities
Ubiquitous computing: smart devices, environments and interaction 132
Personalisation
• Personalisation: tailoring applications & services specifically to an individual’s needs, interests, preferences
• Adaptation of a consumer product, electronic or written medium, based on a personal profile
• Applications of personalisation
  – Targeted marketing
  – Product & service customisation, including information filtering
  – CRM
Ubiquitous computing: smart devices, environments and interaction 133
Personalisation: Benefits
• ???
Ubiquitous computing: smart devices, environments and interaction 134
Personalisation: Challenges (Cons)
• ???
Ubiquitous computing: smart devices, environments and interaction 135
Personalisation
• Personalisation: a more complete model of user context that is more reusable and persists:
  – ????
• 2 key issues:
  – Design of the model so that it can be distributed and shared
  – Dynamic vs. static task-driven user preference contexts
Ubiquitous computing: smart devices, environments and interaction 136
Personalisation: Mechanisms
• Instructors can add more slides here about how personalisation mechanisms, e.g., recommender systems, work, or delete this slide
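One such personalisation mechanism, user-based collaborative filtering, can be sketched in a few lines: find the most similar other user, then suggest what they liked. The ratings, user names and similarity measure below are invented for illustration.

```python
# Minimal user-based collaborative filtering sketch, one possible
# personalisation mechanism. Ratings, names and the similarity measure
# (inverse mean absolute difference on shared items) are illustrative.

ratings = {
    "alice": {"film1": 5, "film2": 3, "film3": 4},
    "bob":   {"film1": 5, "film2": 3, "film4": 5},
    "carol": {"film1": 1, "film2": 5, "film3": 2},
}

def similarity(u, v):
    """Similarity from mean absolute rating difference on shared items."""
    shared = set(ratings[u]) & set(ratings[v])
    if not shared:
        return 0.0
    diff = sum(abs(ratings[u][i] - ratings[v][i]) for i in shared) / len(shared)
    return 1.0 / (1.0 + diff)

def recommend(user):
    """Suggest unseen items rated by the most similar other user."""
    others = [v for v in ratings if v != user]
    nearest = max(others, key=lambda v: similarity(user, v))
    return [i for i in ratings[nearest] if i not in ratings[user]]

print(recommend("alice"))
```

Here alice's ratings agree exactly with bob's on their shared films, so bob's unseen item is recommended; this also illustrates the stereotype bootstrapping challenge, since a user with no ratings has no nearest neighbour.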
Ubiquitous computing: smart devices, environments and interaction 137
Affective Computing: Interactions using Users’ Emotional Context
• Affective computing: computing that relates to, arises from, or influences emotions
• Applications include:
  – ???
• Design challenges for affective computing overlap with those for:
  – Determining the user context
  – Developing more complex human-like intelligence models
Ubiquitous computing: smart devices, environments and interaction 138
Affective Computing
Picard (2003) identified six design challenges:
• Range & modalities of emotion expression is broad
• People’s expression of emotion is idiosyncratic & variable
• Cognitive models for human emotions are incomplete
• The sine qua non of emotion expression is the physical body, but computers are not embodied in the same way
• Emotions are ultimately personal and private
• No need to contaminate purely logical computers with emotional reactiveness
Ubiquitous computing: smart devices, environments and interaction 139
iHCI: Design Heuristics and Patterns
• Many different higher-level HCI design usability / user experience criteria have been proposed by different HCI designers to promote good design of HCI interaction.
• Many different HCI heuristics (rules of thumb derived from experience) have been proposed to support HCI criteria
• Specific guidance is needed to engineer UIs to comply with these usability & user experience HCI principles.
• UI design patterns can support HCI usability principles and then be mapped into lower-level more concrete design patterns
Ubiquitous computing: smart devices, environments and interaction 140
iHCI: Design Heuristics and Patterns
Example iHCI patterns include:
Ubiquitous computing: smart devices, environments and interaction 141
iHCI: Design Patterns & Heuristics
• Instructors can propose many more examples here or delete this slide.
Ubiquitous computing: smart devices, environments and interaction 142
iHCI: Engineering iHCI Design Patterns
• Can propose simplified design models along 2 interlinked dimensions:
  – Organisation / structural models versus time-driven interaction models
  – Front-end / presentation (UI) interaction versus back-end system actions that support this interaction
• Need to organise UI widgets or objects at the UI
• Need to organise and link presentation to actions
• Need to design interaction with these widgets
• (See next slide for an example)
Ubiquitous computing: smart devices, environments and interaction 143
iHCI: Engineering iHCI Design Patterns – Image Search Example
Ubiquitous computing: smart devices, environments and interaction 144
[Figure: MVC pattern applied to a desktop image search app. Front-end (UI): a UI presentation model (text field “Enter description here”, Start and Advanced Search buttons, following a “clear entry points” pattern), a UI task model (get input, query), GUI event handlers as the Control, and a View defined in the GUI toolkit API. Back-end: object and interaction models (the Model) that take the query string, then locate, access and get the image.]
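The figure's front-end/back-end split can be sketched as a minimal MVC example: a Controller handles the UI event, asks the Model to run the query, and the View renders the result. Class and method names are illustrative, not from a specific GUI toolkit.

```python
# Minimal MVC sketch mirroring the image search example: the Controller
# handles a UI event, queries the Model, and the View renders the
# result. Names and the stubbed image store are illustrative.

class Model:
    def search(self, query):
        # Back-end: locate and access a matching image (stubbed here)
        images = {"sunset": "sunset.jpg", "cat": "cat.jpg"}
        return images.get(query, "not found")

class View:
    def render(self, result):
        # Front-end: present the result to the user
        return f"Result: {result}"

class Controller:
    def __init__(self, model, view):
        self.model, self.view = model, view

    def on_start_button(self, text_field_value):
        # GUI event handler: get input, query the model, update the view
        return self.view.render(self.model.search(text_field_value))

app = Controller(Model(), View())
print(app.on_start_button("sunset"))
```

Keeping the query logic in the Model lets the same back-end serve a different front-end (e.g. a voice UI) by swapping only the View and event handlers.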
Overview
• HCI, eHCI & iHCI
• eHCI use in 4 Widely Used Devices
• iHCI use in accompanied smart devices
• iHCI use in wearable and implanted smart devices
• Human Centred Design (HCD)
• User Models: Acquisition & Representation
• iHCI Design
Ubiquitous computing: smart devices, environments and interaction 145
Summary
• A human centred design process for interactive systems specifies four principles of design:
  – the active involvement of users and a clear understanding of user and task requirements
  – an appropriate allocation of function between users and technology, based upon the relative competence of the technology and humans
  – iteration, which is inevitable because designers hardly ever get it right the first time
  – a multi-disciplinary approach to the design
• The human centred design life-cycle involves user participation throughout four main sets of activities: defining user tasks and the (physical, ICT) environment context; defining user and organisational requirements; iterative design prototyping; and validation against the requirements.
Ubiquitous computing: smart devices, environments and interaction 146
Summary
• To enable humans to effectively interact with devices to perform tasks and to support human activities, systems need to be designed to support good models of user interfaces and processes of human computer interaction.
• Users can be modelled directly and indirectly. User task models can be modelled as task plans or as situated actions. iHCI design adds three further concerns: support for natural (human computer) interaction; user models, including models of emotions, which can be used to anticipate user behaviour; and user context awareness, including personalisation.
• Some design patterns and heuristics oriented towards iHCI are described.
Ubiquitous computing: smart devices, environments and interaction 147
Summary & Revision
For each chapter:
• See book web-site for chapter summaries, references, resources etc.
• Identify new terms & concepts
• Apply new terms and concepts: define, use in old and new situations & problems
• Debate problems, challenges and solutions
• See chapter exercises on web-site
Ubiquitous computing: smart devices, environments and interaction 148
Exercises: Define New Concepts
• Touchscreen, etc
Ubiquitous computing: smart devices, environments and interaction 149
Exercise: Applying New Concepts
• What is the difference between a touchscreen and a normal display?
Ubiquitous computing: smart devices, environments and interaction 150