Page 1
V11: User Interaction in AmI
Dr.-Ing. Reiner Wichert
Fraunhofer-Institut für Graphische Datenverarbeitung IGD
Holger Graf
Fraunhofer-Institut für Graphische Datenverarbeitung IGD
Mohammad-Reza (Saied) Tazari
Fraunhofer-Institut für Graphische Datenverarbeitung IGD
Ambient Intelligence
WS 10/11
Page 2
Outline
Scope of the Topic
An Analysis
UI Description Languages
Example Models & Approaches
The UI Framework of PERSONA
Page 3
RECAP FROM LECTURE #1 + SCOPE OF THE TOPIC
User Interaction in Smart Environments
Page 4
Methodological Approach
Changing the Interaction
1. User goal
2. Strategy
3. Execution
Page 5
Ambient Intelligence bundles all areas of computer science:
(1) Natural-language interaction becomes possible. Multimodal interaction becomes possible.
(2) Software architecture for the coordination of devices and applications.
(3) Interaction models, mental models, inference mechanisms, inductive reasoning.
(4) Devices / sensors
Page 6
Vision
“Heller!” (“Brighter!”)
Source: EMBASSI
Page 7
Implicit versus Explicit Interaction (I)
• Implicit: observation of the user's actions, even when they are not meant as direct interaction with the environment
• Capturing the actions and sharing them in the system as contextual events
• Analyzing the resulting situations and deriving possible user goals
• Deriving possible reactions by the environment
Partially covered in lectures #4 & #7
Not the subject of this lecture
Page 8
Implicit versus Explicit Interaction (II)
• Explicit: the user consciously interacts with the environment
• Either as a direct instruction, e.g.
• User: “Close the window in the bedroom!”
• Or as a reaction to queries from the environment, e.g.
• System: “Too much smoke in the kitchen; should the window be opened?”
• User: “Yes!”
The subject of this lecture
Page 9
The Importance of Explicit User Interaction
Explicit UI over I/O channels has long stood in the shadow of “implicit interaction” over sensing channels in AmI
Advances that help explicit UI become more important:
proliferation of (multi-)touch sensing, HD displays, & displays embedded in all possible devices
new interaction forms supported by special devices (e.g., WiiMote)
qualitative progress in
▫ speech recognition
▫ natural language processing
▫ gesture recognition (e.g., Kinect)
socio-political pressure towards “accessibility for all”
Page 10
AN ANALYSIS
User Interaction in Smart Environments
Page 11
The Pipe-Lines Model by Nigay & Coutaz (1997)
Source: citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.144.7554
Page 12
Intelligent Environments with Networked Devices
Page 13
Multiplicity in Intelligent Environments
Source: dissertation of Marco Blumendorf
http://opus.kobv.de/tuberlin/volltexte/2009/2325/pdf/blumendorf_marco.pdf
Page 14
living room TV
sleeping room TV
a display in the entrance
a display integrated in the fridge door
mirrors capable of becoming displays
microphone arrays installed in all rooms
loudspeakers installed in all rooms
phones providing displays, microphones, (loud)speakers
hi-fi providing loudspeakers
…
An infrastructure of available I/O channels
I/O Devices in emerging Smart Homes
Page 15
Smart Environments as Open Distributed Systems
Page 16
The Consequence
I/O Infrastructure
Open Distributed System
Page 17
Terminology
1. According to Blumendorf
▫ Source: opus.kobv.de/tuberlin/volltexte/2009/2325/pdf/blumendorf_marco.pdf
2. According to Tazari
▫ Source: www.gris.tu-darmstadt.de/teaching/courses/ws1011/ambient/slides/PERSONA_Architektur_Manual.pdf
Page 18
Terminology According to Blumendorf
1. Interaction Resource (IR)
An atomic (one-way, single-modality) I/O channel exploitable by a user for executing a task, e.g., keyboards, mice, screens, speakers, microphones, or cameras.
2. Interaction Device (ID)
A computing system that handles the input of, or sends output to, the individual IRs connected to it. Hence, an ID is a collection of IRs together with the computing unit. It comprises the hardware used for the interaction (e.g., screen, keyboard, touch-pad) as well as a software platform for communication and presentation tasks.
Page 19
Terminology According to Tazari
Channel: Smart environments need to bridge between the physical world and the virtual realm with the help of certain devices. Channel denotes the bridging passage provided by such devices between the physical world and the virtual realm. Depending on the kind of channel opened, a channel might be called a sensing channel (provided by sensors), an acting channel (provided by actuators), an input channel (provided by microphones, keyboards, etc.), or an output channel (provided by displays, loudspeakers, etc.). The latter two types of channels might be referred to as I/O channels.
I/O Device: An abbreviation for input and/or output device: a device that provides an input and/or output channel for facilitating explicit interaction between a smart environment and its human users. Input devices, such as a microphone, a keyboard, or a mouse, can capture an instruction or response that is provided by a human user and represent it in terms of data in the virtual realm. Upon receipt of data within the virtual realm that is intended to be presented to human users, output devices, such as displays and loudspeakers, can make it perceivable to the addressed humans.
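Tazari's channel taxonomy can be sketched as types. The following is a minimal, hypothetical illustration; the class and enum names are assumptions for this sketch, not the actual PERSONA API:

```java
// Hypothetical sketch of Tazari's channel taxonomy; all names are
// illustrative, not taken from the real PERSONA code base.
public class ChannelTaxonomy {
    enum ChannelKind { SENSING, ACTING, INPUT, OUTPUT }

    /** A device opens a channel between the physical world and the virtual realm. */
    record Device(String name, ChannelKind kind) {
        /** Input and output channels together form the I/O channels. */
        boolean providesIOChannel() {
            return kind == ChannelKind.INPUT || kind == ChannelKind.OUTPUT;
        }
    }

    public static void main(String[] args) {
        Device mic = new Device("microphone", ChannelKind.INPUT);
        Device thermometer = new Device("thermometer", ChannelKind.SENSING);
        System.out.println(mic.providesIOChannel());          // prints: true
        System.out.println(thermometer.providesIOChannel());  // prints: false
    }
}
```

The point of the taxonomy is that only input and output channels participate in explicit interaction; sensing and acting channels belong to the implicit side.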
Page 20
Recall Concept Maps from V3
Page 21
Requirements
1. According to Blumendorf
▫ Source: opus.kobv.de/tuberlin/volltexte/2009/2325/pdf/blumendorf_marco.pdf
2. According to Tazari
▫ Source: www.springerlink.com/content/5l3685543k4v8524/
Page 22
Requirements According to Blumendorf
• Shapeability to address different layouts for users, device capabilities, and usage contexts,
• distribution across multiple interaction devices,
• multimodality to support various input and output modalities,
• shareability between multiple users,
• mergability and interoperability of different applications.
Essentially, these are all different aspects of adaptability.
Page 23
Adaptability Requirements According to Blumendorf: Shapeability
Layout change depending on the user's distance to the screen
Page 24
Adaptability Requirements According to Blumendorf: Distribution
The user interface can be distributed across multiple interaction devices and kept continuously synchronized
Page 25
Adaptability Requirements According to Blumendorf: Multimodality
The user is able to utilize multiple interaction resources and modalities, including voice, touch, and gesture, simultaneously
Page 26
Adaptability Requirements According to Blumendorf: Shareability
Two users sharing applications
Page 27
Adaptability Requirements According to Blumendorf: Mergability
UI of a cooking assistant embedded in a meta-UI controlling interaction parameters
Page 28
Requirements According to Tazari
Separating I/O channel management from applications
Modality- / layout-neutral dialogs
Brokerage mechanisms
Support for adaptive dialogs
Task division between layers
Availability of user context, capabilities, and preferences to all layers
Handling input & output
Modality fusion & fission
Context-free input
Page 29
UI DESCRIPTION LANGUAGES
User Interaction in Smart Environments
Page 30
Need for Declarative Languages
• A direct consequence of separating the application layer from the presentation layer
[Diagram: a client, e.g. Firefox, talks to a server, e.g. www.google.com; language = HTML, protocol = HTTP]
Page 31
The Problem with HTML
• Not really modality-neutral
• Sometimes imposes a certain layout
More abstract and neutral languages have been investigated for more than 10 years:
UIML
TERESA XML
UsiXML
SMIL
EMMA
XISL
XForms
Page 32
XForms – Separation of Values from Controls
There are two parts to the essence of XForms. The first is to separate what is being returned from how the values are filled in:
• The model specifies the values being collected (the instance) and their related logic
• Types, restrictions
• Initial values, relations between values
• The body of the document then binds form controls to values in the instance
Source: www.w3.org/2006/Talks/05-26-steven-XForms/
Page 33
XForms – Intent-based Controls
Source: www.w3.org/2006/Talks/05-26-steven-XForms/
Page 34
EXAMPLE MODELS & APPROACHES
User Interaction in Smart Environments
Page 35
The W3C Multimodal Interaction Framework – Overview
Source (also for the next 2 slides): www.w3.org/TR/mmi-framework/
Page 36
The W3C Multimodal Interaction Framework – Input Side
Page 37
The W3C Multimodal Interaction Framework – Output Side
Page 38
The Approach of Sottet et al.
Source: http://www.springerlink.com/content/t441q8wk3n48307p/
Page 39
A Runtime Architecture According to Clerckx et al.
Source: books.google.de/books?id=WktQJSBKY50C&pg=PA339&lpg=PA339
Page 40
The MASP Architecture According to Blumendorf
Source: opus.kobv.de/tuberlin/volltexte/2009/2325/pdf/blumendorf_marco.pdf
Page 41
THE UI FRAMEWORK OF PERSONA
User Interaction in Smart Environments
Page 42
Dialog Descriptions
Goal: modality- & layout-neutral
The PERSONA solution is inspired by XForms
Apparently the most advanced form-based solution
Separating the form's UI description from the form data
Define a “dialog package” based on XForms UI controls
Use PERSONA's own RDF-based data model instead of introducing additional complexity
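The XForms-inspired split described above, a modality-neutral UI description bound to an RDF-based data model, can be sketched roughly as follows. All class names and the property URI are illustrative assumptions, not PERSONA's real API:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of separating the form's UI description from the form
// data; the Resource and InputField classes are stand-ins, not the real API.
public class DialogPackageSketch {
    /** Minimal stand-in for an RDF resource: property URI -> value. */
    static class Resource {
        final Map<String, Object> props = new HashMap<>();
        void set(String uri, Object v) { props.put(uri, v); }
        Object get(String uri) { return props.get(uri); }
    }

    /** A modality-neutral control references data only via a property URI. */
    record InputField(String label, String propertyURI) {}

    public static void main(String[] args) {
        Resource data = new Resource();                       // the form data
        data.set("urn:ex:room", "bedroom");                   // initial value
        InputField f = new InputField("Room", "urn:ex:room"); // the UI description
        // An I/O handler renders the control in its own modality (GUI list,
        // spoken prompt, ...) and writes the user's choice back into the data:
        data.set(f.propertyURI(), "kitchen");
        System.out.println(data.get("urn:ex:room"));          // prints: kitchen
    }
}
```

Because the control carries only a label and a data reference, each I/O handler is free to decide how the field is presented, which is exactly what makes the dialog description modality- and layout-neutral.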
Page 43
The Dialog Package
Page 44
The Brokerage / Adaptation
Cornerstone: the I/O buses
Capabilities of the I/O handlers:
appropriateness for certain access impairments
supported languages and modalities
locations where output can be presented
modality-specific tuning capabilities
Dialog ID
Page 45
Supporting the Output Bus in Adaptation
Parameters provided by the app:
content language & privacy level
addressed user
Parameters added by the UI Framework:
the presentation location and modality
access impairments to be considered
modality-specific recommendations
[Diagram: applications publish output events on the output bus, which the I/O handlers handle; the Dialog Manager updates the adaptation parameters, drawing on Profiling and on a Situation Reasoner whose SPARQL engine fetches facts and rules from the Context History Entrepôt.]
Page 46
More on the Dialog Manager
Coherent representation of the whole system
Management of dialogs
▫ Per-user & priority-based management of dialog queues
▫ Suspending dialogs and continuing them later
Providing the system's main menu
Handling context-free input
Page 47
Input
[Diagram: input handlers, e.g. speech recognition and gesture recognition, publish input events on the Input Bus; Application 1 and Application 2 subscribe to them.]
Page 48
Input Fusion
[Diagram: the speech recognizer captures “Switch on!” and the gesture recognizer identifies the TV set as the addressed device; a fusion component merges both into the single input “Switch on TV set”, which is published on the Input Bus and delivered to Application 1.]
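The fusion step in this scenario can be sketched as merging two partial inputs that arrive close together in time. The names and the time window below are assumptions for illustration, not the real fusion component:

```java
import java.util.Optional;

// Illustrative sketch of modality fusion: a spoken command and a pointing
// gesture arriving within a short time window are merged into one input.
public class InputFusion {
    record PartialInput(String modality, String content, long timestampMs) {}
    record FusedInput(String action, String target) {}

    static final long WINDOW_MS = 2000; // only fuse inputs within 2 seconds

    /** Merge a speech command with the target identified by a gesture. */
    public static Optional<FusedInput> fuse(PartialInput speech, PartialInput gesture) {
        if (Math.abs(speech.timestampMs() - gesture.timestampMs()) > WINDOW_MS)
            return Optional.empty(); // too far apart: treat as separate inputs
        return Optional.of(new FusedInput(speech.content(), gesture.content()));
    }

    public static void main(String[] args) {
        PartialInput s = new PartialInput("speech", "switch on", 1000);
        PartialInput g = new PartialInput("gesture", "TV set", 1500);
        fuse(s, g).ifPresent(f ->
                System.out.println(f.action() + " " + f.target())); // prints: switch on TV set
    }
}
```

Real fusion engines additionally resolve conflicts and handle more than two modalities, but the temporal-window idea is the same.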
Page 49
Output
[Diagram: Application 1 and Application 2 publish output events on the Output Bus; output handlers, e.g. Text-to-Speech, subscribe to them.]
Page 50
Context Awareness
[Diagram: Application 1 publishes the output “Take your Prozac™!” on the Output Bus; privacy awareness determines whether the Text-to-Speech handler may render it.]
Page 51
Context Awareness: Dynamic
[Diagram: Application 1 publishing on the Output Bus]
Page 52
Input Bus
[Class diagram, shared base: AbstractBus implements the «interface» Bus and uses a BusStrategy; InputBusImpl implements the «interface» InputBus and uses an InputBusStrategy.]
Page 53
Input Bus Members
[Class diagram, shared base: the «interface» BusMember with the sub-interfaces «interface» Publisher and «interface» Subscriber, implemented by two abstract classes:
InputPublisher:
#InputPublisher(BundleContext)
+publish(InputEvent)
InputSubscriber:
#InputSubscriber(BundleContext)
#addNewRegParams(String)
+dialogAborted(String)
+handleInputEvent(InputEvent)]
Page 54
Functional Model of Input Events
InputEvent:
#InputEvent(PResource, Location, String)
#InputEvent(PResource, Location, Submit)
+getDialogID() : String
+getInputLocation() : Location
+getInputSentence() : String
+getParentDialogURI() : String
+getSubmissionID() : String
+getSubmittedData() : PResource
+getUser() : PResource
+getUserInput(String[]) : Object
+hasDialogInput() : boolean
+isServiceSearch() : boolean
+isSubdialogCall() : boolean
+isSubdialogSubmission() : boolean
Page 55
Output Bus
[Class diagram, shared base: AbstractBus implements the «interface» Bus and uses a BusStrategy; OutputBusImpl implements the «interface» OutputBus and uses an OutputBusStrategy, which in turn uses the «interface» DialogManager:
+checkNewDialog(in oe : OutputEvent) : boolean
+dialogFinished(in dialogID : String)
+getSuspendedDialog(in dialogID : String) : OutputEvent
+suspendDialog(in dialogID : String)]
Page 56
Output Bus Members
[Class diagram, shared base: the «interface» BusMember with the sub-interfaces «interface» Publisher and «interface» Subscriber, implemented by two abstract classes:
OutputPublisher:
#OutputPublisher(BundleContext)
+abortDialog(String)
+adaptationParametersChanged(OutputEvent, String)
+dialogSuspended(String)
+publish(OutputEvent)
+resumeDialog(String, PResource)
OutputSubscriber:
#OutputSubscriber(BundleContext, OutputEventPattern)
+adaptationParametersChanged(String, String, Object)
#addNewRegParams(OutputEventPattern)
+cutDialog(String) : PResource
+dialogAborted(String)
+dialogFinished(Submit, boolean)
+handleOutputEvent(OutputEvent)
#removeMatchingRegParams(OutputEventPattern)]
Page 57
Output Event Properties & Their Providers
addressedUser (app)
contentPrivacyLevel (app)
channelPrivacyLevel (DM)
dialogForm (app)
dialogPriority (app)
hasAccessImpairment (DM)
outputLanguage (app)
outputModality (DM)
altOutputModality (DM)
presentationLocation (DM)
Examples of modality-specific parameters (DM):
screenResolutionMaxX, screenResolutionMaxY
screenResolutionMinX, screenResolutionMinY
voiceGender, voiceLevel
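The split between app-provided and DM-provided properties can be illustrated with a small sketch. The property names follow this slide; the generic OutputEvent class and the example values are assumptions, not the real PERSONA types:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: the application sets content-related properties, then the UI
// framework / Dialog Manager (DM) adds adaptation parameters before the
// event reaches an output handler. OutputEvent here is a stand-in class.
public class OutputEventSketch {
    static class OutputEvent {
        final Map<String, Object> props = new HashMap<>();
        OutputEvent set(String key, Object value) { props.put(key, value); return this; }
        Object get(String key) { return props.get(key); }
    }

    public static void main(String[] args) {
        // provided by the application:
        OutputEvent e = new OutputEvent()
                .set("addressedUser", "urn:ex:saied")   // hypothetical user URI
                .set("contentPrivacyLevel", "personal")
                .set("outputLanguage", "de")
                .set("dialogPriority", 5);
        // added by the UI framework / DM during brokerage & adaptation:
        e.set("outputModality", "voice")
         .set("presentationLocation", "urn:ex:kitchen") // hypothetical location URI
         .set("voiceLevel", 70);                        // modality-specific tuning
        System.out.println(e.get("outputModality"));    // prints: voice
    }
}
```

The design point is that applications never decide on modality or location themselves; those slots stay empty until the brokerage fills them in based on the current situation.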
Page 58
Modelling of Access Impairments
impairmentLevel: {none, low, middle, high, full}
[Class hierarchy: AccessImpairment with subclasses HearingImpairment, SightImpairment, PhysicalImpairment, and SpeakingImpairment; SightImpairment with subclasses ColorBlindness, NearSightedness, FarSightedness, Astigmatism, and LightSensitivity.]
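The impairment model above maps naturally onto a type hierarchy. The level enum and the class names follow the slide; the adaptation rule at the end is an illustrative assumption of how the Dialog Manager could use such a model:

```java
// Sketch of the access-impairment model as Java types; the rule at the end
// is a hypothetical example of its use, not part of the slide's model.
public class Impairments {
    enum Level { NONE, LOW, MIDDLE, HIGH, FULL }

    static class AccessImpairment {
        final Level impairmentLevel;
        AccessImpairment(Level level) { this.impairmentLevel = level; }
    }
    static class HearingImpairment extends AccessImpairment {
        HearingImpairment(Level l) { super(l); }
    }
    static class SightImpairment extends AccessImpairment {
        SightImpairment(Level l) { super(l); }
    }
    static class ColorBlindness extends SightImpairment {
        ColorBlindness(Level l) { super(l); }
    }

    /** Hypothetical adaptation rule: with full sight impairment, avoid visual output. */
    static boolean visualOutputAppropriate(AccessImpairment i) {
        return !(i instanceof SightImpairment && i.impairmentLevel == Level.FULL);
    }

    public static void main(String[] args) {
        System.out.println(visualOutputAppropriate(new ColorBlindness(Level.LOW)));   // prints: true
        System.out.println(visualOutputAppropriate(new SightImpairment(Level.FULL))); // prints: false
    }
}
```

Because ColorBlindness is a SightImpairment, rules written against the superclass automatically cover all its specializations.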
Page 59
Thank you for your attention
&
see you at the next lecture