Exploring Tangible User Interfaces For
Ableton Live
Martin Crowley
17157366
M.Sc. Interactive Media
University of Limerick
Supervisor: Dr Nicholas Ward
Submitted to the University of Limerick, September, 2018
Declaration
This thesis is presented in partial fulfilment of the requirements for the degree of Master of
Science in Interactive Media. It is entirely my own work and has not been submitted to any
other university or higher education institution, or for any other academic award in this
university. Where use has been made of the work of other people, it has been fully
acknowledged and fully referenced.
Signature
___________________________
Martin Crowley
Date
___________________________
Acknowledgements
I would like to thank a number of individuals without whom this project would not have been
possible. Firstly, I would like to thank my supervisor, Dr Nicholas Ward for his patience and
guidance throughout. I would also like to thank several members of staff in CSIS whose continual
feedback was of enormous benefit to this project, namely Dr Cristiano Storni, Dr Mikael
Fernström, Dr Gabriela Avram and Ed Devane.
I must also extend a large thank you to my former mentor Dr Brian Carty for introducing me
to music focused computer programming.
Finally, I would like to thank my classmates from whom I learned so much during this past
year.
Dedication
I would like to dedicate this work to my mother, Frances, for her continued support and
encouragement, and for generally hounding me to finally finish college. Better late than never!
Also a special dedication to the memory of my father, William.
Abstract
This thesis documents research into a live performance focused tangible user interface (TUI)
for Ableton Live. This is motivated by shortcomings in traditional MIDI controllers, which
inhibit expressive movement and obscure performance gestures from the audience. Theoretical
and empirical research form the basis for a sustained exploration of the possibilities for tangible
interaction with Ableton through a ‘sketching in hardware’ design methodology. Subsequent
user evaluation reveals that a TUI approach to controlling digital audio workstations (DAWs)
such as Ableton both liberates bodily movement and enhances audience spectacle. This
demonstrates considerable potential for future research in the area of embodied interaction with
DAWs, while it also contributes more broadly to knowledge on the application of TUIs in
music performance.
Table Of Contents
EXPLORING TANGIBLE USER INTERFACES FOR ABLETON LIVE ..................................................... I
DECLARATION..................................................................................................................... II
ACKNOWLEDGEMENTS ................................................................................................. III
DEDICATION....................................................................................................................... IV
ABSTRACT ........................................................................................................................... V
LIST OF TABLES ............................................................................................................... XII
LIST OF FIGURES ........................................................................................................... XIII
1. CHAPTER 1: INTRODUCTION ................................................................................. 1
1.1. THE BODY AND ELECTRONIC MUSIC .......................................................................... 2
1.2. THE SPECTACLE OF LIVE ELECTRONIC MUSIC PERFORMANCE ................................... 3
1.3. THE EMERGENCE OF TANGIBLE USER INTERFACES FOR MUSIC ................................. 4
1.4. PRIMARY CONTRIBUTION ........................................................................................... 4
1.5. ROADMAP .................................................................................................................. 6
2. CHAPTER 2: CONTROL OVER ABLETON LIVE IN CONTEXT ...................... 1
2.1. ABLETON FUNCTIONALITY ......................................................................................... 1
2.1.1. Arrangement View ................................................................................................. 1
2.1.2. Session View........................................................................................................... 2
2.2. REVIEWING ABLETON LIVE CONTROLLERS ................................................................ 3
2.2.1. The Gold Standard: Generic MIDI Controllers .................................................... 3
2.2.2. The Emergence of Ableton Specific Controllers: Novation Launchpad ................ 4
2.2.3. Departure from the GUI: Ableton Push Controller ............................................... 5
2.2.4. Identifying a Gap for a TUI System ....................................................................... 5
3. CHAPTER 3: BACKGROUND RESEARCH ............................................................ 7
3.1. INTRODUCTION ........................................................................................................... 7
3.2. TUI RESEARCH .......................................................................................................... 7
3.2.1. Foundations of Tangible User Interfaces .............................................................. 7
3.2.2. TUIs: From Practical Use to Music ...................................................................... 9
3.3. ADVANTAGES & DISADVANTAGES OF TUIS FOR MUSIC PERFORMANCE ................. 11
3.3.1. Advantages ........................................................................................................... 11
3.3.1.1. Confounding the Privileging of Cognition over Action ............................................................................. 11
3.3.1.2. TUIs for the Enhancement of Audience Spectacle .................................................................................... 12
3.3.1.3. TUIs Make the World Itself Your Interface ............................................................................................... 12
3.3.2. Limitations of TUIs .............................................................................................. 14
3.3.2.1. Inability to Save State ................................................................................................................................. 14
3.3.2.2. Subjectivity ................................................................................................................................................. 14
3.4. MUSIC TUIS - TECHNOLOGICAL CONSIDERATIONS .................................................. 15
3.5. RELATED PROJECTS.................................................................................................. 16
3.5.1.1. Skål ............................................................................................................................................................. 16
3.5.1.2. Collaborative: IDEO's C60 Music Player ................................................................................................... 17
3.5.1.3. Musical Trinkets ......................................................................................................................................... 18
3.5.1.4. Music Blocks .............................................................................................................................................. 19
3.6. EMPIRICAL RESEARCH ............................................................................................. 19
3.6.1. Interviews ............................................................................................................. 19
3.6.1.1. Extreme Use Case #1 .................................................................................................................................. 20
3.6.1.1.1. Case #1 Summary ................................................................................................................................ 20
3.6.1.1.2. Case #1 Analysis .................................................................................................. 21
3.6.1.2. Extreme Use Case #2 .................................................................................................... 21
3.6.1.2.1. Case #2 Summary ................................................................................................................................ 21
3.6.1.2.2. Case #2 Analysis .................................................................................................. 22
3.6.1.3. Overall Analysis of Interviews ................................................................................................................... 22
3.6.2. Ableton User Survey ............................................................................................ 23
3.6.2.1. Survey Findings .......................................................................................................................................... 23
3.6.2.2. Survey Limitations ...................................................................................................................................... 24
3.7. BACKGROUND CONCLUSION .................................................................................... 24
4. CHAPTER 4: METHODOLOGY.............................................................................. 26
4.1. SKETCHING IN HARDWARE ....................................................................................... 26
4.2. APPROACH ............................................................................................................... 27
4.3. DATA COLLECTION METHODS ................................................................................. 27
4.3.1. Detailing Chosen Data Collection Methods ........................................................ 29
4.3.1.1. Online Survey ............................................................................................................................................. 29
4.3.1.2. Telephone Interviews .................................................................................................................................. 30
4.3.1.3. Crowdsourcing ............................................................................................................................................ 31
4.3.1.4. Informal Demonstrations ............................................................................................................................ 31
4.3.1.5. Group Evaluation ........................................................................................................................................ 32
4.3.1.6. Expert Reviews ........................................................................................................................................... 33
4.4. DATA ANALYSIS ...................................................................................................... 34
4.4.1. Discourse Analysis ............................................................................................... 34
4.4.2. Video Analysis ...................................................................................................... 34
4.5. RELIABILITY, VALIDITY, GENERALIZABILITY AND LIMITATIONS ............................. 35
5. CHAPTER 5: EXPLORATORY DESIGN ............................................................... 36
5.1. INTRODUCTION ......................................................................................................... 36
5.2. IDEATION ................................................................................................................. 36
5.2.1. Criteria to Guide Design ..................................................................................... 36
5.2.2. Brainstorming & Bodystorming ........................................................................... 37
5.2.3. Mind Mapping ...................................................................................................... 38
5.2.4. Sketching .............................................................................................................. 39
5.2.5. Storyboarding ...................................................................................................... 39
5.3. PARALLEL PROTOTYPING STAGE.............................................................................. 40
5.3.1. Introduction.......................................................................................................... 41
5.3.2. Parallel Prototype #1: Exploring Tech Focused Design ..................................... 41
5.3.2.1. Motivation ................................................................................................................................................... 42
5.3.2.2. Description .................................................................................................................................................. 42
5.3.2.2.1. Constructing the Tangibles .................................................................................................................. 43
5.3.2.2.2. Proof of Concept: Max MSP Communication ..................................................................................... 44
5.3.2.2.3. Potential use with Ableton Live (Via Max MSP) ................................................................................ 45
5.3.2.2.4. Technical Implementation of Sketch ................................................................................................... 46
5.3.2.3. Evaluation ................................................................................................................................................... 47
5.3.2.4. Conclusion .................................................................................................................................................. 47
5.3.3. Playful Task Focused Design............................................................................... 48
5.3.3.1. Parallel Prototype #2: Bringing Tempo Under Tangible Control............................................................... 48
5.3.3.1.1. Motivation ............................................................................................................................................ 48
5.3.3.1.2. Description ........................................................................................................................................... 49
5.3.3.1.2.1. Evaluation ..................................................................................................................................... 50
5.3.3.1.2.2. Conclusion .................................................................................................................................... 50
5.3.3.2. Parallel Prototype #3: Sample Selection by Weight ................................................................................... 51
5.3.3.2.1. Motivation ............................................................................................................................................ 51
5.3.3.2.2. Description ........................................................................................................................................... 52
5.3.3.2.3. Evaluation ............................................................................................................................................ 54
5.3.3.2.4. Conclusion ........................................................................................................................................... 55
5.3.3.3. Parallel Prototype #4: Tangible Ableton Presets ........................................................................................ 55
5.3.3.3.1. Motivation ............................................................................................................................................ 55
5.3.3.3.2. Description ........................................................................................................................................... 56
5.3.3.3.3. Evaluation ............................................................................................................................................ 58
5.3.3.3.4. Conclusion ........................................................................................................................................... 59
5.3.3.4. Conclusion of Task-Specific Design Phase ................................................................................................ 59
5.4. EXPLORATORY DESIGN: ANALYSIS & CONCLUSION ................................................ 59
6. CHAPTER 6: ITERATIVE DESIGN ........................................................................ 61
6.1. INTRODUCTION ......................................................................................................... 61
6.2. TASK-FOCUSED DESIGN ........................................................................................... 62
6.2.1. Iterative Prototype #1: Tangible Clip Selection .................................................. 63
6.2.1.1. Motivation ................................................................................................................................................... 63
6.2.1.2. Description .................................................................................................................................................. 63
6.2.1.2.1. Concept ................................................................................................................................................ 63
6.2.1.2.2. Technical Implementation ................................................................................................................... 64
6.2.1.3. Tentative Evaluation ................................................................................................................................... 65
6.2.2. Iterative Prototype #2: Tangible Sound Selection ............................................... 66
6.2.2.1. Introduction ................................................................................................................................................. 66
6.2.2.2. Description .................................................................................................................................................. 67
6.2.2.2.1. Associating Sounds with Objects ........................................................................................................ 67
6.2.2.3. Tentative Evaluation ................................................................................................................................... 69
6.2.3. User Evaluation of Polyrhythmic Dice and Tangible Sound Selection Prototypes ...... 70
6.2.3.1. Method and Directive ................................................................................................................................. 70
6.2.3.2. Required Data ............................................................................................................................................. 72
6.2.3.3. Findings ...................................................................................................................................................... 72
6.2.3.3.1. Personal Observations.......................................................................................................................... 73
6.3. CONCLUSION OF TASK FOCUSED DESIGN ................................................................. 73
6.4. PERFORMANCE FOCUSED DESIGN ............................................................................ 73
6.4.1. Introduction.......................................................................................................... 73
6.4.2. Performance Prototype #1: Tangible Live Set..................................................... 74
6.4.2.1. Motivation ................................................................................................................................................... 74
6.4.2.2. Description .................................................................................................................................................. 75
6.4.2.2.1. Tangible Mixing .................................................................................................................................. 75
6.4.2.2.2. Multi-Sample Player Per Track ........................................................................................................... 76
6.4.2.2.3. Variation in Dice Control .................................................................................................................... 77
6.4.3. Performance Prototype #2: Tangible DJing ....................................................... 78
6.4.3.1. Motivation ................................................................................................................................................... 78
6.4.3.2. Description .................................................................................................................................................. 78
6.4.3.3. Tentative Evaluation ................................................................................................................................... 79
6.4.4. Evaluation of Performance Focused Prototypes ................................................. 80
6.4.4.1. Candidates ................................................................................................................................................... 80
6.4.4.2. Technical Setup........................................................................................................................................... 81
6.4.4.3. Directive...................................................................................................................................................... 81
6.4.4.4. Discussion of Results .................................................................................................................................. 83
6.4.4.4.1. Tangible Aspects: Embodiment ........................................................................................................... 83
6.4.4.4.2. Expressiveness and responsiveness ..................................................................................................... 83
6.4.4.4.3. Changing Samples with Objects .......................................................................... 84
6.4.4.4.4. Dice ...................................................................................................................................................... 84
6.4.4.4.5. Reliance on the Screen ......................................................................................................................... 84
6.4.4.4.6. Technical Shortcomings ...................................................................................................................... 84
6.4.4.4.7. User Impressions of Overall Concept ................................................................... 85
6.4.5. Designing the Final System ................................................................................ 85
6.4.5.1. Reviewing Feedback From User Testing ..................................................................... 85
6.4.5.2. Exploring a Wireless Solution ................................................................................................................... 86
6.4.5.3. Design using 3D Modelling ........................................................................................................................ 88
6.4.5.4. Fabrication Of Final System ....................................................................................................................... 89
6.4.5.4.1. Constructing Mobile Enclosures .......................................................................................................... 89
6.4.5.4.2. Constructing Stationary Dice Enclosure .............................................................................................. 90
6.4.5.4.3. Aesthetic Considerations ..................................................................................................................... 91
6.4.5.5. Finalised System for Display at D.A.W.N. Exhibition ............................................................................... 92
6.4.5.5.1. Final Evaluation ................................................................................................................................... 94
6.4.5.5.1.1. Audience Spectacle Related Findings .......................................................................................... 94
6.4.5.5.1.2. Performance Related Findings ..................................................................................................... 95
6.4.5.5.1.3. Target Demographic Reconsidered .............................................................................................. 96
7. CHAPTER 7: CONCLUSION & FUTURE WORK ................................................ 98
7.1. DISCUSSION .............................................................................................................. 98
7.1.1. Outline of Project ................................................................................................. 98
7.1.2. Discussion of Findings ......................................................................................... 98
7.2. CONTRIBUTION......................................................................................................... 99
7.3. FUTURE WORK ....................................................................................................... 100
7.3.1. The Attachment of Meaning ............................................................................... 100
7.3.2. Group Collaboration ......................................................................................... 101
8. BIBLIOGRAPHY ...................................................................................................... 102
9. APPENDICES ............................................................................................................ 112
9.1. APPENDIX A: SURVEY ............................................................................................ 112
9.1.1. Questions and Responses Outlined .................................................................... 112
9.1.2. Survey Posts on Social Media ............................................................................ 114
9.2. APPENDIX B: RELEVANT CODE .............................................................................. 115
9.2.1. Code for Early Prototypes ................................................................................. 115
9.2.1.1. iPad Tangibles........................................................................................................................................... 115
9.2.1.1.1. iPad Tangibles Max Patch ................................................................................................................. 115
9.2.1.1.2. iPad Tangibles openFrameworks Code ............................................................. 115
9.2.1.2. Tangible Metronome................................................................................................................................. 116
9.2.1.2.1. Tangible Metronome Arduino Code .................................................................................................. 116
9.2.1.2.2. Tangible Metronome Max Patch ....................................................................................................... 117
9.2.1.3. Sample Selection by Weight ..................................................................................................................... 118
9.2.1.3.1. Sample Selection by Weight Arduino Code ...................................................................................... 118
9.2.1.3.2. Sample Selection by Weight Max Patch............................................................................................ 119
9.2.1.4. NFC Preset Changer ................................................................................................................................. 120
9.2.1.4.1. NFC Preset Changer Arduino Code................................................................................................... 120
9.2.1.4.2. NFC Preset Changer Max Patch ........................................................................................................ 121
9.2.1.5. Tangible Clip Selection with Dice Arduino Code ....................................................... 122
9.2.1.6. Tangible Sound Selection Arduino ........................................................................................................... 124
9.2.2. Code For Final Prototype: Mobile Enclosures ................................................. 126
9.2.3. Code For Final Prototype: 4x Dice Enclosure .................................................. 131
List of Tables
TABLE 1: PRIMARY FINDINGS OF BACKGROUND RESEARCH ...................................................... 24
TABLE 2: DATA COLLECTION STRATEGIES FOR RESEARCH PHASE ............................................. 28
TABLE 3: DATA COLLECTION STRATEGIES FOR SKETCHING IN HARDWARE PHASE ..................... 28
TABLE 4: FEEDBACK FROM USER TESTING CONSIDERED............................................................ 86
List of Figures
2.1: SCREENSHOT OF THE ARRANGEMENT VIEW IN ABLETON LIVE ............................................ 2
2.2: SCREENSHOT OF THE SESSION VIEW IN ABLETON LIVE ....................................................... 3
2.3 THE AUTHOR’S KORG NANOKONTROL 2 GENERIC MIDI CONTROLLER ................................. 4
2.4: NOVATION LAUNCHPAD (NOVATION 2018) ........................................................................ 4
2.5: THE LCD GUI OF THE ABLETON PUSH (ABLETON 2018) .................................................... 5
2.6: GRAPH PLOTTING EXTENT OF BODILY ENGAGEMENT AND SPECTACLE ENABLED BY STANDARD CONTROLLERS .................................................... 6
3.1: BACKGROUND PHASE OF RESEARCH ..................................................................................... 7
3.2: THE MCRPD MODEL (ULLMER AND ISHII 2000) .................................................................. 9
3.3: REACTABLE (REACTABLE 2018) ........................................................................................ 10
3.4: SKÅL (CREATIVE APPLICATIONS NETWORK 2009) ............................................................ 17
3.5: C60 MUSIC PLAYER (DESIGNBOOM 2010) ......................................................................... 18
3.6: MUSICAL TRINKETS (PARADISO ET AL. 2000) .................................................................... 18
3.7: MUSIC BLOCKS (FATBRAINTOYS 2018) ............................................................................. 19
3.8: PRE-GIG PHOTO DEPICTING THE DUO'S LAPTOP BENEATH THE TABLE (PHOTO COURTESY OF THE ARTISTS) ...................................................... 20
3.9: INTERVIEWEE PERFORMING WITH CONTROLLER TILTED TOWARDS THE AUDIENCE (PHOTO
COURTESY OF THE ARTIST) ................................................................................................. 21
5.1: PHASE 1 OF DESIGN............................................................................................................ 36
5.2: BRAINSTORMING SETUP: USING TRADITIONAL CONTROLS AS A BASIS FOR IDEATION ......... 37
5.3: BRAINSTORMING DOCUMENTATION ................................................................................... 38
5.4: BODYSTORMING WITH VARIOUS GRASPABLE SHAPES ......................................................... 38
5.5: MINDMAP ........................................................................................................................... 39
5.6: STORYBOARD DEPICTING THE POTENTIAL ATTACHMENT OF SOUNDS TO TANGIBLE OBJECTS
........................................................................................................................................... 40
5.7: STORYBOARD DEPICTING TWO SEPARATE PERFORMANCE SCENARIOS: CLOSED LAPTOP VS
THE OPEN PLAN TUI ........................................................................................................... 40
5.8: PARALLEL AND ITERATIVE PHASES .................................................................................... 41
5.9: APP IS CODED ON MACBOOK AND TRANSFERRED OVER WIFI TO BE HOSTED ON IPAD ....... 43
5.10: THE PRINTOUT THAT WAS SUPPLIED AS A GUIDE FOR TOUCH POINT PLACEMENT .............. 43
5.11: PHASES OF CONSTRUCTION OF SMALLER CARDBOARD TANGIBLES ................................... 44
5.12: LARGER TANGIBLES WERE MORE GRASPABLE AND STABLE .............................................. 44
5.13: IMAGE OF THE APP INTERACTING WITH MAX (HOSTED ON THE COMPUTER). THE BLUE
TANGIBLE CONTROLS THE PITCH OF THE NOTE OCCURRING ON EACH STEP. TURNING THE
TANGIBLE CLOCKWISE INCREASES THE PITCH. .................................................................... 45
5.14: SKETCH DEPICTING HOW TWO TANGIBLES CAN CONTROL TWO SEPARATE ABLETON EFFECTS
........................................................................................................................................... 46
5.15: TIN CAN AND LEGO CATHEDRAL TANGIBLES (WITH FIDUCIAL MARKERS ATTACHED) ON
IPAD SCREEN ...................................................................................................................... 46
5.16: CONTROL FLOW FROM IPAD TO MAX MSP (LEFT) AND MAX MSP PATCH (RIGHT).......... 47
5.17: SKETCH DEPICTING A BRAINSTORMING TANGIBLE CONTROL OVER ABLETON. (THE
METRONOME IS CIRCLED) ................................................................................................... 49
5.18: METRONOME: HALL SENSOR ATTACHED TO THE BODY AND MAGNET ATTACHED TO THE
PENDULUM ......................................................................................................................... 50
5.19: SKETCH DEPICTING USE OF WEIGHTS TO CONTROL ABLETON ........................................... 51
5.20: LOAD CELL, AMPLIFIER AND ARDUINO ............................................................................ 52
5.21: 500ML BOTTLE OF WATER USED TO CALIBRATE THE LOAD CELL ...................................... 52
5.22: TWO TAPE ROLLS USED AS WEIGHTS ................................................................................ 53
5.23: SETTING SEPARATE SELECTION RANGES FOR BOTH KICKS. THE RANGE FROM 0-20 (CIRCLED)
IS WHERE NO SAMPLE WOULD BE TRIGGERED WHEN THERE IS NO WEIGHT APPLIED TO THE
LOAD CELL ......................................................................................................................... 53
5.24: 100G WEIGHT TRIGGERING THE LIGHTER KICK DRUM AND 500G TRIGGERING THE HEAVIER
ONE .................................................................................................................................... 54
5.25: USER EVALUATES ABLETON WEIGHT CONTROLLER ......................................................... 55
5.26: SKETCH EXPLORING PRESET SELECTION BASED ON TOUCH AND SIZE................................ 56
5.27: PN532 NFC MODULE ....................................................................................................... 57
5.28: DEPICTION OF INFORMATION FLOW IN OVERLY CONVOLUTED SETUP ............................... 57
5.29: TANGIBLE OBJECTS ARE STORED SYMBOLICALLY ON A SHELF TO DEMONSTRATE A REAL
WORLD ALTERNATE TO DIGITALLY STORING PRESETS IN FOLDERS. .................................... 58
5.30: PLACING A CAN ATOP THE PN532 SWITCHES TO THE ‘SINGING CAN’ ABLETON REVERB
PRESET ............................................................................................................................... 58
5.31: ANNOTATED PORTFOLIO OF EXPLORATORY PROTOTYPING PHASE ................................... 60
6.1: MIND MAP WITH THREE KEY ABLETON LIVE PERFORMANCE TASKS CIRCLED ..................... 61
6.2: ARDUINO PRO MICRO AND RC522 READER WITH ASSORTED RFID TAGS (FOBS AND CARDS)
........................................................................................................................................... 62
6.3: DEPICTION OF HOW MIDI TRANSMITTED OVER SERIAL NEGATES THE NEED FOR MAX MSP
........................................................................................................................................... 62
6.4: PLAYING WITH DICE TO GET A FEEL FOR THEIR CAPABILITIES (LEFT) AND SKETCHING IDEAS
(RIGHT)............................................................................................................................... 63
6.5: RFID TAGS ATTACHED TO INDIVIDUAL SIDES OF THE DICE ................................................ 64
6.6: EACH DIE WOULD TRIGGER ONE OF THREE CLIPS ON IT ASSIGNED TRACK, DEPENDING ON
WHAT SIDE WAS BEING READ .............................................................................................. 64
6.7: TWO RFID READERS WERE REQUIRED TO READ EACH DICE. EACH PAIRED READER AND DICE
WAS GIVEN A SEPARATE TRACK TO ACT UPON WITHIN ABLETON. ....................................... 65
6.8: FACEBOOK VIDEO POST #1 WITH SOME EXAMPLE FEEDBACK VIA COMMENTS .................... 66
6.9: SKETCH THAT EXPLORES ASSIGNING SAMPLES TO ASSOCIATED OBJECTS ........................... 67
6.10: RECORDING THE STRIKING OF EACH OBJECT WITH A SPOON USING A PORTABLE RECORDER
........................................................................................................................................... 67
6.11: ARDUINO CODE SHOWING UNIQUE MIDI CC VALUES BEING TRANSMITTED FOR EACH OF THE
THREE RFID CARDS ........................................................................................................... 68
6.12: MOD WHEEL MAPPED TO SAMPLE SELECTION (LEFT) AND MIDI TRIGGER RANGE FOR EACH
SAMPLE SET (RIGHT) ........................................................................................................... 68
6.13: ATTACHING RFID CARDS TO BASE OF OBJECTS (IN THE ABSENCE OF STICKERS) .............. 69
6.14: SWAPPING THE OBJECTS IN AND OUT OVER THE SENSOR TRIGGERS THE DIFFERENT SOUNDS
........................................................................................................................................... 69
6.15: FACEBOOK VIDEO POST #2 WITH FEEDBACK COMMENTS .................................................. 70
6.16: SAMPLE TRIGGERING OBJECTS (CUP, GLASS & TIN) CIRCLED RED AND POLYRHYTHMIC DICE
CIRCLED BLUE .................................................................................................................... 71
6.17: USER A INTERACTING WITH THE TANGIBLE SAMPLE SELECTION CONTROLLER ................ 71
6.18: USERS B, C AND D INTERACT AS A GROUP WITH THE SYSTEM PAYING LITTLE ATTENTION TO
LAPTOP SCREEN .................................................................................................................. 72
6.19: CIRCUIT WITH PING SENSOR NOW ADDED ....................................................................... 75
6.20: CALIBRATING THE DISTANCE SENSOR USING A RULER ...................................................... 75
6.21: DISTANCE DICTATES SOUND VOLUME OF EACH ABLETON TRACK .................................... 76
6.22: TANGIBLE TOY INSTRUMENTS WITH AFFIXED RFID TAGS ARE USED TO TRIGGER SOUNDS
........................................................................................................................................... 76
6.23: EACH READER GIVEN CONTROL OVER A TRACK WITHIN ABLETON. .................................. 77
6.24: DICE GIVEN EXTRA FUNCTIONALITY: SOLO ("S") AND MUTE ("M").................................. 77
6.25: EACH READER CAN NOW READ BOTH THE TANGIBLE INSTRUMENTS AND THE DICE .......... 78
6.26 TANGIBLE OBJECTS WITH RFID STICKERS ATTACHED USED TO TRIGGER SONGS ............... 79
6.27: MATRYOSHKA DOLLS (LEFT). TINFOIL PREVENTS INNER DOLLS’ TAGS FROM BEING READ
(RIGHT)............................................................................................................................... 79
6.28: FACEBOOK VIDEO POST #3 ............................................................................................... 80
6.29: TECHNICAL SETUP FOR USER EVALUATIONS ..................................................................... 81
6.30: SCREEN CAPTURES OF USER #1 (LEFT) AND USER #2 (RIGHT) PERFORMING ..................... 82
6.31: USER #1 PERFORMING ALONGSIDE A LAPTOP RUNNING ABLETON .................................... 83
6.32: TOP DOWN DEPICTION OF STATIONARY DICE READERS AND MOBILE INSTRUMENT READERS
........................................................................................................................................... 86
6.33: ADAFRUIT BLUEFRUIT FEATHER LE MICROCONTROLLER ................................................ 87
6.34: ADAFRUIT ESP8266 MICROCONTROLLER POWERED INDEPENDENT OF USB VIA EXTERNAL
BATTERY PACK ................................................................................................................... 87
6.35: USING TUPPERWARE TO TEST IDEAL ENCLOSURE DIMENSIONS......................................... 88
6.36: SKETCHES OF SYSTEM CONFIGURATION ........................................................................... 88
6.37: 3D MODEL OF SYSTEM ...................................................................................................... 89
6.38: PERFORMERS VIEW OF 3D MODEL ..................................................................................... 89
6.39: ENCLOSURES WERE LASER CUT FROM MDF ..................................................................... 90
6.40: CONSTRUCTING THE DICE READER ENCLOSURE ................................................................ 90
6.41: ABLETON GUI AND PACKAGING WITH GREY AESTHETIC (LELONG.COM N.D.) .................. 91
6.42: ABLETON LOGO (LEFT) (SONARPLUSD.COM N.D.) AND GRASPABLETON LOGO (RIGHT) ... 91
6.43: STENCIL LOGO DESIGN WAS SCRAPPED IN FAVOUR OF A MUCH CLEANER ADHESIVE LABEL.
........................................................................................................................................... 92
6.44: AUDIENCE PERSPECTIVE OF FINAL SYSTEM (MINUS TANGIBLES) ...................................... 92
6.45: 3D PRINTED FIXTURES ...................................................................................................... 93
6.46: FULLY WIRED AND SOLDERED SYSTEM............................................................................. 93
6.47: FINAL SYSTEM (COMPLETE WITH TANGIBLES) AS DISPLAYED DURING D.A.W.N. ............ 93
6.48: COLOURED DICE (LEFT) CORRESPONDING TO COLOURED ABLETON CLIPS (RIGHT) .......... 94
6.49: EVEN WITH RELATIVELY SMALL AUDIENCES, VISIBILITY OF SYSTEM IS AFFECTED FOR THOSE
AT REAR ............................................................................................................................. 95
6.50: ONE USER DISCOVERS A NEW DJING TECHNIQUE ............................................................. 96
7.1: SOME OF THE UNPROMPTED COLLABORATIONS THAT TOOK PLACE DURING D.A.W.N. .... 101
9.1 MAX MSP PATCH RECEIVING OSC MESSAGES FROM IPAD AND CONVERTING THESE TO MIDI
FOR ABLETON................................................................................................................... 115
1. CHAPTER 1: INTRODUCTION
This thesis reports on the development of a tangible interface for the control of digital audio
workstations (DAWs); one that facilitates more direct bodily engagement than is afforded by
standard controllers. Expressive bodily movement is fundamental to music performance
(Davidson 2011). However, the current paradigm of controllers for computer-based
performance fails to appreciate this. Even though platforms such as Ableton Live1 have
extended the role of DAWs from studio-based recording to stage performances, the MIDI
controllers used to control them persist with traditional studio oriented controls such as faders,
knobs and buttons (Berndt et al. 2016). These controls lack the tangibility and tactility that is
necessary for live performance (Hayes 2012). Another often overlooked aspect of live music
is an audience’s appreciation of how an instrument is being played. Modern devices
such as laptops and MIDI controllers have relatively hidden interfaces, which can be confusing
to an audience who struggle to relate to what the performer is actually doing (Paradiso 1999).
Advances in research into tangible computing have brought a greater awareness of the
importance of engaging with the digital realm through physical interactions. Tangible user
interfaces (TUIs) are user interfaces where moveable, graspable physical objects can be used
as the means of control in lieu of buttons, knobs and faders, and as such, they may possess
considerable potential to redress shortcomings in control and representation within DAW based
music performance.
1 https://www.ableton.com/en/
1.1. The Body and Electronic Music
Before the advent of machines that could automate sophisticated processes, there was no
performance without the body (Ostertag 2002, p.11)
With the birth of digital audio workstations (DAWs) such as Pro Tools2 in the 1990s, many
physical staples of the computer musician’s studio (synthesisers, effect units and the mixing
console) began to find their way to the computer screen as virtualised versions of themselves
(Whitcomb and Daly n.d.). The computer was becoming an instrument in its own right;
however, this coincided with the introduction of the mouse and cursor as a means of control.
As DAWs such as Ableton Live3 have become ubiquitous, more and more musicians are
beginning to see the physical limitations of such input devices. As Ostertag (2002) laments,
computer-mediated music performance can severely inhibit movement:
With the emergence of the laptop as instrument, the physical aspect of the performance has
been further reduced to sitting on stage and moving a cursor by dragging one's finger across a
trackpad in millimetre increments. (ibid, p.12)
This minimisation of control has spawned a counter-culture of analogue modular synthesiser
enthusiasts who value the physicality of plugging and unplugging cables and twisting dials,
over the more convenient ‘at your fingertips’ software approach (Mok 2016). While this may
be a romantic alternative, the DAW endures in music performance for good reasons. It is highly
portable, it can memorise and recall settings instantly, and it harnesses the power of a whole
hardware music studio in a small, convenient package (Yamaha 2018). The only flaw, from the
point of view of this thesis, is how DAWs are controlled and represented. As such, this research
suggests exploring ways to augment the DAW’s undoubted power in such a way that is more
usable for live performance.
Ableton is a DAW that has unique live performance functionality. This has quickly led to its
widespread adoption by live electronic music acts (Ableton 2012), so much so that it has
become ingrained in the culture of music performance itself. Its creators Robert Henke and
Gerhard Behles have observed that, on "any festival stage…almost 90 per cent of all laptops
2 https://www.avid.com/pro-tools
3 https://www.ableton.com/en/
are running our software.” (Slater 2016). This ubiquity renders Ableton an ideal entry point
for research into a fresh control outlook that may benefit DAWs overall.
1.2. The Spectacle of Live Electronic Music Performance
Research suggests that people’s perception of a performance also depends heavily on visual
information, and not just the auditory experience (Tsay 2013). The previous section already
underlined that there is a lack of bodily movement in the performance of electronic music;
however, this inhibition of movement also affects the audience, not just the performer. If a
performer’s movement is restricted, the viewer will find that movement correspondingly
difficult to perceive. The limited expressiveness of mechanisms such as
knobs and faders forces artists to exaggerate their movements by directing "exceptionally
intense expressivity toward a small, technical component associated with sound engineering"
(Butler 2014). An alternative system of control that makes the performer’s actions more overt
would surely benefit both the audience and performer.
Another barrier to audience enjoyment of computer performance is embodied in
the laptop itself. As Ableton has brought DAWs from the studio to the stage, so too has it
brought the laptop. The sight of a laptop can be visually uninspiring to an audience, as it is
more synonymous with the mundane than with any musical instrument (Cascone 2002). It can
also distract the performer in the form of screen gazing, resulting in a transfixed facial
expression known as ‘Serato Face’ (White 2013), stemming from an earlier DJing program
called Serato4. Furthermore, studies have also shown that laptop performances can be
perceived as pre-programmed and inauthentic (Pedersen and Hornbaek 2007). Lending credence
to this are a number of cases where prominent electronic music artists were understood to be
miming their digital controller based performances (Michaels 2013; Tregoning 2012).
With an awareness of these potential negative perceptions towards laptop performance, certain
Ableton performers have made concerted efforts to hide the computer entirely so that they can
instead focus more on actually performing for the crowd. The artist Panther Panther
summarises his personal reasons for closing his laptop to focus on his MIDI controller:
4 https://serato.com/
For a long time I have considered the screen to be detrimental to electronic music performances,
acting as a barrier between the artist and the audience. In this way, I was hoping to demystify
the performance by enabling audiences to see exactly how the sound was being manipulated
through specific hardware elements. (Villierezz n.d.)
Clearly there are benefits to a system that is more focused towards immersion in the
performance space that is shared with the audience, rather than immersion in a private virtual
world beyond the screen.
1.3. The Emergence of Tangible User Interfaces for Music
Tangible user interfaces (TUIs) were initially devised as an alternative to traditional computer
input and output devices such as the mouse, keyboard and the graphical user interface (GUI).
Where the GUI represents digital information via pixels painted to the screen, TUIs both
represent and provide the means to control digital information. In other words, TUIs can be
considered as user interfaces (UIs) that augment the real world by associating digital
information with everyday objects and environments (Ishii and Ullmer 1997).
While much early research was laboratory-based, TUIs soon found practical applications
within everyday life in areas such as the workplace (see URP p. 7) and in education (Marshall
et al. 2003). As TUIs deeply involve bodily movement, they have also been found to be
advantageous in more expressive domains such as music performance. In particular, they have
found success as tabletop music performance systems such as Reactable (Jordà et al. 2006).
Tabletop TUIs (TTUIs) are of particular interest to this research for two reasons: the first being
that they afford a richly expressive means of control, and the second being that their interface
is plainly visible to both the performer and to the audience. For these reasons, TTUIs offer rich
ground for exploration into remedying the aforementioned issues in control and audience
spectacle.
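The prototypes described later in this thesis reduce this association to its simplest form: a physical object carries a machine-readable identity (for example an RFID/NFC tag), and software maps that identity onto digital information such as an effect preset. The sketch below illustrates only that lookup; the tag UIDs and preset names are invented for illustration and do not describe any real hardware.

```python
# Hypothetical RFID tag UIDs mapped to Ableton preset names.
# Both UIDs and names are illustrative placeholders only.
TAG_TO_PRESET = {
    "04:a2:5f:1c": "singing can reverb",
    "04:9b:33:7e": "heavy kick drum",
}

def on_tag_read(uid):
    """Return the action associated with a scanned object, if any."""
    preset = TAG_TO_PRESET.get(uid)
    if preset is None:
        return "unknown object: no preset change"
    return f"loading preset '{preset}'"

print(on_tag_read("04:a2:5f:1c"))  # loading preset 'singing can reverb'
print(on_tag_read("de:ad:be:ef"))  # unknown object: no preset change
```

The point of the sketch is that the object itself becomes the handle for the digital state: storing the tagged can on a shelf is, in effect, filing the preset.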
1.4. Primary Contribution
The primary contribution of this thesis is to propose a paradigm shift in how musicians interact
with DAWs, whereby the expressive bodily movement of the performer is brought to the fore
and the performance made more explicit to the audience. While there is much research to
suggest that TUIs are highly suited to music performance, much of this has hinged on their
application as standalone instruments (Jordà et al. 2006; Newton-Dunn et al. 2003; Alonso and
Keyson 2005). There is a noted lack of research into their application specifically for use with
DAWs, a gap that this thesis focuses on addressing.
The primary research question of this thesis is considered broadly:
• Can a TUI-based approach to control over DAWs be more suitable for music
performance?
Music performance is not a solitary activity but instead incorporates the points of view of
various stakeholders. Therefore, the research question is addressed more specifically by the
following sub-questions:
• Can a TUI for Ableton Live, with a control system consisting of graspable, moveable
physical objects, promote a more bodily engaging means of control when compared to
traditional MIDI devices?
• By making the interface and the performer's actions overt through the use of physical
objects, can a TUI communicate a more open and legible performance to an audience?
These questions are explored from a number of angles. Firstly, a literature review reflects on
the failings of standard music controllers in juxtaposition with perceived benefits that were
learned in prior approaches to TUI design. Theories derived from this research are merged with
those gathered from empirical data to form the basis for a series of design excursions where all
aspects of a performer’s interaction with Ableton are attacked. A broad variety of tangible
interfaces are prototyped and evaluated, culminating in the design of an alternative hybrid
Ableton controller called GraspAbleton; one that is equal parts MIDI controller and TUI.
The primary contributions of this research are outlined as follows:
1. This research demonstrates that there exists an abundance of creative opportunities for
the use of TUIs with DAWs.
2. A design process that incorporates a brute force attack on design possibilities is shown
to have value in provoking exciting new directions for TUI design.
3. Broadly speaking, this research shows that there is fertile ground for fresh design
thinking in respect to interaction with DAWs in general.
1.5. Roadmap
The following chapters of this thesis are divided as follows. Chapter 2 provides background on
the accepted paradigm of control over Ableton, which serves to highlight areas that can be
improved upon through the application of a TUI. Chapter 3 offers a review of literature related
to the area of tangible interaction, indicating that, while TUIs have been widely applied in
music, there is little knowledge on their benefits for use with DAWs. This will also include a
technical examination of notable projects and controllers, followed by an outline of initial
empirical research undertaken prior to design. In chapter 4, the design methodology is outlined.
A research strategy that is largely based on a sketching in hardware approach is justified, and
chosen methods of data collection and analysis are subsequently discussed. Chapter 5 focuses
on detailing the design process, where an initial parallel phase of prototyping leads to a more
iterative and refining stage. This chapter also details the various chosen user evaluation
techniques. Finally, a concluding chapter will offer an overview of the findings of this
research before identifying and discussing promising areas for future design directions.
2. CHAPTER 2: CONTROL OVER ABLETON LIVE IN
CONTEXT
To understand certain sections of this thesis, it is essential to underline the core functionality
of Ableton. As such, this chapter first offers a brief overview of Ableton's live performance
functionality. Secondly, as this project calls for a rethinking of the accepted means of control
over Ableton, a state-of-the-art review of traditional controllers is outlined. This serves to highlight
where traditional control mechanisms fall short and how these can be augmented or supplanted
by a TUI.
2.1. Ableton Functionality
Ableton has two contrasting user viewpoints: Arrangement View and Session View. A brief
examination of these two modes of interaction will demonstrate why its strength lies in live
performance.
2.1.1. Arrangement View
This view facilitates the arrangement of sections of music into a coherent overall composition,
mimicking the functionality of most other DAWs, where musical events occur along a
traditional horizontal measured timeline. A song pointer moves from left to right across the
arrangement of clips, triggering musical events as soon as it collides with them (2.1).
2.1: Screenshot of the Arrangement View in Ableton Live
2.1.2. Session View
Session View breaks away from the limiting fixed timeline of Arrangement View by permitting
audio files to be mapped onto a grid-like structure as infinitely looping clips (2.2). Each vertical
track can play one clip at a time; however, multiple tracks, each holding several clips, can
play simultaneously. These clips can be triggered on the fly in whatever sequence
the performer wishes, lending itself well to dynamic live improvisation. This is coupled with the
added facility for live mixing and the ability to control effects. It is this Session View mode
that the research of this thesis focuses on augmenting.
2.2: Screenshot of the Session View in Ableton Live
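The clip-grid behaviour described above can be captured in a toy model. The sketch below is purely illustrative and assumes nothing about Ableton’s internals; it encodes only the rule that each track plays at most one clip, so launching a clip replaces whatever that track was playing while the other tracks carry on.

```python
class SessionView:
    """Toy model of Ableton's Session View clip grid (illustrative only)."""

    def __init__(self, tracks, scenes):
        # grid[track][scene] holds a clip name, or None for an empty slot
        self.grid = [[None] * scenes for _ in range(tracks)]
        self.playing = [None] * tracks  # scene index currently looping per track

    def place_clip(self, track, scene, name):
        self.grid[track][scene] = name

    def launch(self, track, scene):
        # A track plays at most one clip: launching replaces the current one
        if self.grid[track][scene] is not None:
            self.playing[track] = scene

    def now_playing(self):
        return [self.grid[t][s] if s is not None else None
                for t, s in enumerate(self.playing)]

session = SessionView(tracks=2, scenes=3)
session.place_clip(0, 0, "drums A")
session.place_clip(0, 1, "drums B")
session.place_clip(1, 0, "bassline")
session.launch(0, 0)   # start drums A
session.launch(1, 0)   # bassline joins on its own track
session.launch(0, 1)   # drums B replaces drums A; bassline keeps playing
print(session.now_playing())  # ['drums B', 'bassline']
```

It is exactly this per-track, non-linear triggering that the tangible prototypes of later chapters map onto physical objects.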
2.2. Reviewing Ableton Live Controllers
This section critiques several prominent Ableton controllers, paying particular attention to their
means of control and the extent to which they can be used autonomously without the need to
reference a GUI. In keeping with the scope of this paper, an emphasis is placed on controllers
that are in some way designed for live performance. Having either owned or made creative use
of many of the below controllers at one time or another, the author will also interject with
personal ethnographic observations.
2.2.1. The Gold Standard: Generic MIDI Controllers
The ubiquitous generic MIDI controller (2.3) is for many DAW users the first progression
beyond mouse and keyboard control. These generally contain inexpressive controls such as
knobs, sliders and buttons, which are freely assignable and can therefore be quickly mapped to
any parameter within Ableton. The majority of models include LEDs on the facia, providing
constant visual feedback of the state of the controls, resulting in decreased reliance on the GUI.
2.3 The author’s Korg Nanokontrol 2 generic MIDI controller
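The ‘freely assignable’ nature of these controllers rests on plain MIDI Control Change messages: each knob or fader transmits a fixed controller number with a 7-bit value, and Ableton’s MIDI-map mode binds that number to any on-screen parameter. A brief sketch of what one fader movement looks like on the wire (controller number 21 here is an arbitrary example, not a Nanokontrol default):

```python
def control_change(channel, controller, value):
    """Build the 3-byte MIDI Control Change message a knob or fader emits."""
    if not (0 <= channel < 16 and 0 <= controller < 128 and 0 <= value < 128):
        raise ValueError("channel 0-15, controller and value 0-127")
    status = 0xB0 | channel          # 0xB0-0xBF: Control Change, channels 1-16
    return bytes([status, controller, value])

# A fader assigned to CC 21 on MIDI channel 1, pushed to three positions:
for position in (0, 64, 127):
    print(control_change(0, 21, position).hex())  # b01500, b01540, b0157f
```

Because the mapping lives entirely on Ableton’s side, the same three bytes can drive a volume fader one day and a filter cutoff the next, which is precisely what makes these devices generic.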
2.2.2. The Emergence of Ableton Specific Controllers: Novation
Launchpad
The Novation Launchpad6 offers even more expressive hands-on control in a live scenario.
This controller is geared primarily towards the live triggering of clips within Ableton’s session
view (2.2), and thus it functions quite unlike any previously designed controller. An array of
soft button pads illuminate different colours in accordance with the behaviour of each clip on
Ableton’s GUI, which can go some way to enhancing the audience spectacle (if its facia is
made visible). Though its UI consists of only buttons, these have secondary functionality and
can therefore also act as pressure sensitive drum pads or keyboard keys. This facilitates a
certain degree of expression in live performance.
2.4: Novation Launchpad (Novation 2018)
5 https://www.korg.com/us/products/computergear/nanokontrol2/
6 https://global.novationmusic.com/launch/launchpad#
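Under the hood, each pad press on a grid controller of this kind reaches Ableton as an ordinary MIDI note, with the note number encoding the pad’s position. The mapping below follows the commonly documented original-Launchpad session layout (note = 16 × row + column, with column 8 reserved for scene launch); later models use different numbering, so this should be read as an illustrative assumption rather than a specification.

```python
def pad_note(row, col):
    """Map an 8x8 session-grid pad to its MIDI note number.

    Assumes the original Launchpad's documented X-Y layout
    (note = 16*row + col); other models differ, so treat as illustrative.
    """
    if not (0 <= row < 8 and 0 <= col < 8):
        raise ValueError("row and col must be 0-7")
    return 16 * row + col

def scene_note(row):
    """Round scene-launch buttons occupy column 8 of each row."""
    return 16 * row + 8

# Pressing the top-left pad sends Note On for note 0; the pad below it, note 16.
print(pad_note(0, 0), pad_note(1, 0), scene_note(0))  # 0 16 8
```

The same scheme runs in reverse for the LEDs: the host lights a pad by sending a note message back to it, which is how the facia mirrors the state of the clips in Live’s GUI.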
2.2.3. Departure from the GUI: Ableton Push Controller
The Ableton Push7 controller offers an unprecedented level of control. Much like the
Launchpad, it provides an extensive array of touchpad buttons with secondary functionality.
However, with a large number of encoders and buttons, combined with a detailed visual
display, Push has access to all of Live’s functionality, so there is no longer any need to use a
mouse/keyboard or refer to Live’s GUI on the computer. In fact, as the company states, one of
the core goals of its design is just that:
It’s a powerful, expressive instrument that gives you hands-on control of an unlimited palette
of sounds, without needing to look at a computer. (Ableton 2018)
Even though, in reality, the model of interaction is merely a shift from a computer GUI to
another, smaller LCD-based GUI, the potential to eliminate the laptop and focus purely on the
instrument at hand is of particular interest to this research.
2.5: The LCD GUI of the Ableton Push (Ableton 2018)
2.2.4. Identifying a Gap for a TUI System
The above examples have demonstrated that there has been an overall progression in design
thinking towards more performance capable controllers for Ableton. The generic controller
was shown to consist of limiting buttons and knobs, while the Launchpad incorporates more
expressive pressure pads. Push was shown to make further advances by eliminating the need to
reference the screen, thereby permitting more engagement between audience and performer.
However, all these MIDI controllers still in some way either impede movement or hide
movement from the audience. This research proposes that a TUI that is composed of large
visible objects can serve to rectify this. It is proposed that the movement of these objects will
7 https://www.ableton.com/en/push/
transcend the capabilities of standard controllers in a way that is more bodily liberating to
the performer and visually engaging to the observer (2.6).
2.6: Graph plotting extent of bodily engagement and spectacle enabled by standard controllers
3. CHAPTER 3: BACKGROUND RESEARCH
3.1: Background phase of research
3.1. Introduction
As outlined in chapter one, the research of this thesis explores the design of a tangible user
interface for Ableton Live. In order to address this, the following chapter will examine
literature relating to the potential benefits of TUIs for music. This examination will begin with
an overview of their historical application, from the broader conceptualisation of a TUI as
an alternative to the GUI to studies that underline the potential for TUIs in music performance.
This chapter will also discuss how TUIs offer both expressive control to the performer and a
more visually entertaining spectacle to the viewer when compared to laptop-based systems.
This is balanced with an ensuing discussion of literature relating to the perceived limitations
of TUIs, while finally, a review of contrasting technologies that have been applied in past TUIs
provides a basis on which to look towards technology as a guiding principle for design.
3.2. TUI Research
3.2.1. Foundations of Tangible User Interfaces
By drawing comparisons with the fields of augmented reality and ubiquitous computing,
Fitzmaurice et al. (1995) are widely credited with the first comprehensive exploration of the
possibilities of tangible interaction. Using a working prototype called ‘GraspDraw’, the
authors’ work outlines how two graspable ‘bricks’ serve as physical handles that are tightly
coupled to digital information. These bricks are used to manipulate, move and resize virtual
2D objects on a tabletop graphical display. This work is of particular inspiration to this thesis
as it is the first to suggest a total departure from the tradition of mouse and keyboard user
input. The notion that the ‘bricks’ in question have the ability to eliminate the metaphor of the
mouse pointer altogether is a powerful one. Particularly powerful too is their ability to permit
multiple simultaneous actions, while at the same time permitting the use of both hands. Though
perhaps limited by the technology of the time, this bold outlook was the catalyst for an emerging
stream of alternative thinking on how we can interact more naturally with digital information.
‘Bricks’ was further expanded upon by the seminal work in the field of tangible interaction,
‘Tangible bits: towards seamless interfaces between people, bits, and atoms’ (Ishii and Ullmer
1997). This work first introduced the term ‘Tangible User Interface’ (TUI) in reference to
system where real-world objects offer the means to both represent and control digital
information. This work laments that computer interactions have developed in such a way that
fails to appreciate the natural physical “affordances” (Norman 1988) of touching real-world
artefacts. Along with physical interactions, there is a simultaneous preoccupation with the role
of "ambient media" to create a transition between background awareness and foreground
activity. Though ‘Tangible Bits' is not music focused per se, this notion has particular
advantages for musical applications, as TUIs permit a constant awareness of peripheral
elements of the system, even while focusing on foreground interactions.
While ‘Tangible Bits’ served as a proof of concept of their considerable potential, TUIs soon
began to find more practical uses. A system for urban planning called URP (Underkoffler and
Ishii 1999) can perhaps be considered the first TUI that was designed for practical purposes.
The system consisted of a virtual tabletop map that could be manipulated by moving tangible
representations of buildings around it. These buildings cast digital shadows, and they also
affected lines of simulated wind flow (through animated projections on the tabletop). The
system was conceptually intended to help urban planners with zonal development planning.
Leading on from early research such as ‘Bricks’ (Fitzmaurice et al. 1995) and ‘Bits and Atoms’
(Ishii and Ullmer 1997), scholars began attempts to further classify the individual elements of
TUIs so that they might be more effectively applied by other designers. Holmquist (1999)
suggested that TUIs could be better understood and therefore designed if their elements were
deconstructed and separated into distinct categories. The TUI system was broken down into
elements that Holmquist refers to as tokens, containers and tools. He outlined that containers
are objects that are tied to digital information which serve to move information freely around
the system, while tokens are classified as physical objects that somehow resemble the
information they represent and are used as a means of accessing that information. Finally, tools
are described as a means by which the user can manipulate this digital information, aided by a
computer.
Ullmer and Ishii (2000) attempted to distinguish TUIs from GUIs through the introduction of
the Model Control Representation Physical and Digital (MCRpd) model (later renamed
MCRit); a modification on the Model View Control (MVC) model for GUIs. The MCRpd
model set GUIs and TUIs apart once and for all by outlining that GUIs represent information
via intangible pixels on a screen, controlled by input devices such as a mouse, while TUIs
make information readily graspable and manipulable via haptic feedback, thus giving physical
representation to digital information (3.2). TUIs not only represent information, but they also
allow control over that information, and therefore, interaction with TUIs is much more akin to
interaction with real-world objects.
3.2: The MCRpd Model (Ullmer and Ishii 2000)
3.2.2. TUIs: From Practical Use to Music
Aside from early practical applications such as URP, evidence was also emerging suggesting
that TUIs could find a place in more artistic endeavours. Music Bottles (Ishii et al. 2001) is the
first application of a TUI to music. The setup consists quite simply of a table onto which three
bottles are placed. When any of these bottles is uncorked, one of three sampled instrument
riffs plays out - that of either a cello, a violin or a piano. The idea is that each bottle releases
the digital content within it, and so the notion of a bottle as a container is playfully explored.
While Music Bottles can be considered to be a theatrical artefact more so than a performance
system, its concept is essential to the research of this thesis as it demonstrates that TUIs can
insert digital meaning into everyday physical objects while maintaining the original conceptual
model of that object. This notion of associating meaning to digital media through tangibility is
also explored in later projects such as Skal (Arnall and Martinussen 2010) and Music Trinkets
(Paradiso et al. 2000), which are further detailed in section 3.5 of this chapter.
ReacTable (Jorda et al. 2006) is the most prominent example of a music performance-focused
TUI. The setup consists of physical markers that are placed atop a transparent screen, while a
camera beneath the screen tracks the movement and orientation of the markers. Each marker
represents one of several typical building blocks of audio synthesis such as a sound source (e.g.
an oscillator) or an effect (e.g. a filter). The spatial distribution, rotation and proximity of the
markers to one another govern the resultant shifts in sound, which is all done in real time. For
visual feedback, a projector visualises animations of both the current state of each object and
the connections between different objects. The overt display of both the digital information and
tangible physical control mechanisms means that both the performer's actions and the overall
state of the interface are plainly visible to the audience (3.3).
3.3: Reactable (Reactable 2018)
3.3. Advantages & Disadvantages of TUIs for Music Performance
3.3.1. Advantages
3.3.1.1. Confounding the Privileging of Cognition over Action
As discussed in chapter 1, interaction with electronic music instruments has shifted from direct
and expressive to remote and restrictive. Djajadiningrat et al. (2007) argue that this shift is also
evident in the broader context of design for consumer products in general. The authors
summarise that, as consumer products increasingly feature inbuilt computer technology,
accordingly, interactions begin to involve cognitive processes more so than expressive bodily
movement:
An overview of the historical development of commercial products in the 20th century shows
how products have increasingly neglected our perceptual-motor skills, have burdened our
cognitive abilities, and have lost their physical expressivity. (ibid, p. 2)
The authors go on to observe that products are trending towards designs that prioritise ease of
use over the joy of use, and as a result movement is no longer included as a meaningful
component of interaction:
In interaction design, only the movement to operate the controls is considered functional, with
the movements in between forming time-consuming necessities. (ibid, p. 11).
Their overall argument is that more emphasis should be placed on the latter in order to permit
more meaningful and joyful interactions.
The research of this thesis considers that the observed trends above are also evident in design
for electronic music performance controllers, as the majority still maintain the buttons, knobs
and sliders paradigm and as such are not focused towards expressive use. This trend can be
subverted through the application of TUIs. As Jordà (2008) has identified, one of the core
benefits of TUIs is that they promote bodily engagement through expressive, skilled and
explorative interactions; characteristics that make them ideally suited to music performance.
Therefore, it is argued that this rich bodily involvement which privileges direct engagement
above cognition can guide the design of a more usable and ultimately a more enjoyable music
controller.
3.3.1.2. TUIs for the Enhancement of Audience Spectacle
While Marshall et al. (2007) indicate that further empirical research is needed before the
benefits of TUIs can be truly understood, one area in which TUIs have proven advantageous is
group interaction (Jorda et al. 2006). With respect to group-led musical applications, through
a sustained study of group interaction with the Reactable (Xambó et al. 2013), TUIs were found
to promote group coordination in a situated context while the musical interaction was
understood to foster learning. Hornecker (2002) underlines that amongst the most critical
enabling factors for group use of TUIs are "haptic direct manipulation" and "constant
visibility". Constant visibility of the shared interaction space is essential to gestural
communication between users, whereas haptic direct manipulation in bodily shared space
makes user interactions with tangible objects visible and understandable to the rest of the group.
Several of the reasons for which TUIs are of benefit to groups also highlight how TUIs can
benefit the audience. What Hornecker (2002) refers to as haptic direct manipulation means that
that the performer's interactions with a TUI are not just made evident to other performers, but
also to the audience. In this way, it can be considered that performer interactions with TUIs are
much more open and engaging to behold than a typical laptop-based performance, which can
be perceived as dull and uninspiring by an audience (Cascone 2014). Furthermore, Pedersen and
Hornbæk (2009) found that laptop interactions can suffer from a lack of legibility and
understandability from an audience perspective, which can give the impression that the
performer is either being passive or worse still, faking the performance entirely. To extend the
notion of audience involvement even further, TUIs, in specific settings, can also be used to
invite the audience to take an active part in the performance. Shaer (2009, p.42) refers to the
potential “enticingness” of TUIs, which, through visibility of interaction and low entry
threshold, can invite the audience to interact themselves (if permitted).
3.3.1.3. TUIs Make the World Itself Your Interface
Unlike inflexible input methods such as the mouse or MIDI controllers that have hardcoded
controls, TUIs have considerable potential for personal modification. As demonstrated in
Music Bottles (Ishii et al. 2001), TUIs can be constructed from everyday objects. They do not
need to adhere to a pattern of rigidly predefined tokens that permanently represent the same
functions each and every time the system is used. After all, our living environment is full of
potential tangibles that could be used as an integral component of a TUI. Several approaches
to TUI design have exploited this fact by facilitating a much looser method of associating
tangibles to computational functions. For example, Dalton et al. (2012) introduce a system that
employs a broad range of re-appropriated objects for use as improvised tangibles.
Carvey et al. (2006) took the concept of freely associating meaning to everyday objects and
coupled this with relatively simple technology to form the basis of an open source system that
would allow everyday users to build bespoke TUIs themselves. With a set of cheap hacked
weighing scales and a computer program named Amphibian, objects could be linked to
customisable computer tasks, ranging from simple shortcut operations to more complex macro
commands. In this way, for example, a frisbee, when placed upon the scales and its weight
recognised, could be used to open vacation photos. This research demonstrates that
technological limitations are no barrier to assigning meaning to everyday objects.
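The weight-matching principle described above can be captured in a short sketch. This is an illustrative reconstruction rather than code from the Amphibian project; the object names, weights and linked actions are invented for the example.

```python
# Illustrative sketch of weight-based object recognition in the spirit of
# Amphibian (Carvey et al. 2006): an object placed on the scales is identified
# by matching its measured weight against a registry of known objects, within
# a tolerance. All names, weights and actions here are invented examples.

def identify(weight_g, registry, tolerance_g=5.0):
    """Return (name, action) of the registered object closest in weight, or None."""
    best = None
    for name, (obj_weight, action) in registry.items():
        diff = abs(weight_g - obj_weight)
        if diff <= tolerance_g and (best is None or diff < best[0]):
            best = (diff, name, action)
    return (best[1], best[2]) if best else None

# Example registry: a frisbee opens the vacation photos, echoing the paper's example.
registry = {
    "frisbee": (175.0, "open_vacation_photos"),
    "mug": (310.0, "launch_email"),
}
```

The tolerance accounts for scale noise; in practice, distinguishing objects of similar weight would require either wider spacing of registered weights or a secondary sensing channel.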
Mugellini et al. (2007) suggest a framework to allow personal objects to be used flexibly in a
TUI system for the purpose of memory recollection as well as for sharing. They suggest that
personal objects, through their familiarity to the user, can make this proposed user interface
much more understandable and more accessible to use:
Use of personal objects as tangible interfaces will be even more straightforward since users
already have a mental model associated to the physical objects thus facilitating the
comprehension and usage modalities of that objects. (Mugellini et al. 2007, p. 231)
The authors put forward a system of augmented personal objects called ‘Memodules’ as a
platform to both collect memories and to share them. What is of particular interest to this thesis,
and what makes the Memodules system unique, is that the authors are also aware that
meaningful associations between individuals and objects are prone to change over time. For
this reason, their system supports the reconfiguring of new associations as memories fade or
change.
The above body of literature demonstrates that one does not have to accept a TUI system of
tokens that were designed by a third party. By instead attaching representation to objects based
on one's own experience of the world, the surrounding environment can be transformed into a
truly unique personalised interface.
3.3.2. Limitations of TUIs
Though this section has so far discussed only the positive aspects of TUIs, an examination of
their potential limitations will serve to give an understanding of how they might be more
effectively applied.
3.3.2.1. Inability to Save State
A primary limiting factor of TUIs is the fact that they exclude certain conveniences that people
have become accustomed to through the use of computer-based technologies. One such pitfall
is that they suffer from an inherent inability to load or recall workspace configurations. For
many, this is a fundamental feature of DAWs, where templates for multiple individual live sets
can be pre-configured and saved, to be subsequently recalled and reproduced identically at the
time that a set is due to be performed. However, a body of research has explored solutions to
this shortcoming. Projects such as Pico (Patten and Ishii 2007) and Zooids (Le Goc et al. 2016)
navigate it by employing mechanically actuated tangibles. These tangibles can move themselves and return to
a predetermined state without the need for human intervention.
3.3.2.2. Subjectivity
Another factor that merits consideration is that, while TUIs have certain advantages, many
performers still find a great deal of joy in using laptops for musical applications. For example,
Fiebrink et al. (2007) have celebrated the ease of use of the laptop as an instrument and outlined
that surprisingly expressive music can be performed using only the laptop's native inputs - the
trackpad and keyboard - as a means of control. As further evidence, there is an ever-growing
community of computer programmers who find immense joy in composing live music with
only the keyboard as an input device (Collins et al. 2003). Similarly, there is evidence of strong
support for long-standing staples of interaction with computer music, the knob and slider.
Though each of these may seem to offer quite a limiting level of control, Hunt et al. (2002)
found that by mapping a single control to more than one musical parameter (i.e. no longer a
one-to-one mapping), highly musical results can be obtained. Other research exists which
demonstrates that the physical affordances of knobs and sliders can foster creativity through
constrained movement (Gelineck and Serafin 2009).
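Hunt et al.'s one-to-many mapping idea can be illustrated with a minimal sketch, in which a single normalised knob value drives three synthesis parameters along different curves. The parameter names and curves are assumptions for illustration and are not taken from the original study.

```python
# Minimal sketch of a one-to-many mapping (after Hunt et al. 2002): a single
# normalised control value (0.0-1.0) drives several synthesis parameters at
# once, each along a different curve. Parameter names and curves are invented.

def map_control(knob: float) -> dict:
    """Map one knob position to three coupled synthesis parameters."""
    knob = max(0.0, min(1.0, knob))  # clamp to the normalised range
    return {
        "cutoff_hz": 200.0 + 19800.0 * knob ** 2,  # squared curve: slow start, fast finish
        "resonance": 0.1 + 0.7 * knob,             # linear rise
        "amplitude": 1.0 - 0.5 * knob,             # inverse: brighter also gets quieter
    }
```

Because the three parameters move together under one gesture, small physical movements produce rich, correlated timbral change, which is the musical benefit Hunt et al. report over one-to-one mappings.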
Undoubtedly, TUIs do not offer the most convenient solution for music
performance, nor one that suits all performers. However, by focusing
on a design that exploits their advantages, TUIs can be of particular benefit to the performer
who prioritises bodily liberation over convenience.
3.4. Music TUIs - Technological Considerations
This research probes the question of how to facilitate more tangible interactions with Ableton
through technologically driven exploration; it is therefore important to review the state of the
art of technologies that have conventionally been applied to TUIs. Technological
approaches of varying complexity can serve as a guide for design, depending on whether or
not the TUI requires highly expressive control.
The earliest known conceptual TUI design, the Marble Answering Machine (Poynor 1995),
depicts the physical representation and manipulation of digital information very simply through
spherical marbles. Similarly, Bennett (2011) uses spherical ball bearings to represent and
manipulate step sequencer note triggers in his Beatbearing performance instrument. Though
the Beatbearing’s technical setup is elaborate, the technology applied to sense that a tangible
token (a ball bearing) has been placed upon the sequencer is quite simple. When a bearing is
placed upon a step, it connects two copper contacts and closes a switch, thus informing the
system that the chosen step is occupied. As such, this basic functionality is justified in that it
perfectly matches the TUI's intended purpose.
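The Beatbearing's switch-based sensing can be reduced to a few lines. In this sketch the hardware is stubbed out as a list of booleans; in the real instrument each value would come from a ball bearing closing a pair of copper contacts.

```python
# Simplified sketch of the Beatbearing's sensing principle: each of 16 steps
# is a switch that a ball bearing closes. The hardware is stubbed as a list of
# booleans standing in for the copper-contact reads.

STEPS = 16

def occupied_steps(switch_states):
    """Indices of steps whose switch is currently closed by a bearing."""
    return [i for i, closed in enumerate(switch_states) if closed]

def sequencer_tick(step, switch_states, trigger):
    """Advance the sequencer one step, firing the trigger if the step is occupied."""
    if switch_states[step]:
        trigger(step)
    return (step + 1) % STEPS
```

The binary nature of the sensing is visible here: a step is either occupied or empty, with no intermediate states, which is exactly what a step sequencer requires.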
Camera tracking technology has long been used in TUI designs where control of a much higher
resolution is desired. For example, its application in the TUI music sequencer ‘Scrapple’ (Levin
2006) enabled sounds to be triggered in a far less constrained fashion. The position of the objects
could be tracked anywhere along the length of the performance area, in comparison to the
Beatbearing which was physically constrained to sixteen steps. This high detail in tracking is
also ideally suited to TUIs such as ReacTable (Jorda et al. 2006), which require a more gradual
and expressive means of control.
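The contrast drawn above between the Beatbearing's sixteen fixed steps and Scrapple's continuous tracking can be expressed as two mappings of the same normalised position. The bar length and normalisation are assumptions for illustration.

```python
# Sketch contrasting discrete and continuous tangible tracking. The Beatbearing
# quantises an object's position to one of 16 steps, while camera tracking (as
# in Scrapple) can map position continuously, e.g. to a time within the bar.

def quantise_to_step(x_norm, steps=16):
    """Map a normalised position (0.0-1.0) to one of `steps` discrete steps."""
    return min(int(x_norm * steps), steps - 1)

def continuous_time(x_norm, bar_seconds=2.0):
    """Map the same position to a continuous trigger time within the bar."""
    return x_norm * bar_seconds
```

The first mapping discards everything between step boundaries; the second preserves the full resolution of the tracked position, which is what makes camera tracking suited to gradual, expressive control.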
Short-range wireless technologies have also proven useful within
TUIs. RFID technology, in particular, is generally considered to be quite robust, though it
seems largely forgotten in contemporary literature on TUIs. Unfortunately, it does possess
certain technological disadvantages, mainly arising from possible errors or a lack of detection
(Ullmer et al. 2005). Despite this, a considerable advantage of RFID for TUIs is its ability to
associate objects with computational processes simply and elegantly. Concerning their
application in music performance, this may serve well for discrete changes; however,
supplementary technologies are required to facilitate any gradual or expressive means of
control.
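One common way of mitigating the misreads noted by Ullmer et al. is to accept a tag only after several consistent consecutive reads. The sketch below is a hedged illustration of this idea; the stream of reads would come from an actual RFID reader.

```python
# Hedged sketch of one way to mitigate RFID misreads: accept a tag ID only
# after it has been read consistently `required` times in a row. Each element
# of `reads` is a tag ID, or None when no tag was detected on that poll.

def debounce_reads(reads, required=3):
    """Return the first tag ID seen `required` times consecutively, else None."""
    last, count = None, 0
    for tag in reads:
        if tag is not None and tag == last:
            count += 1
        else:
            last, count = tag, (1 if tag is not None else 0)
        if count >= required:
            return tag
    return None
```

This trades a little latency for robustness: a single spurious read cannot trigger an association, at the cost of a short delay before a genuine tag is accepted.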
3.5. Related Projects
This section will detail several notable TUI projects that were not previously mentioned in this
thesis. The ways in which these have inspired this research, either conceptually or
technologically, are also discussed.
3.5.1.1. Skål
Skål (Arnall and Martinussen 2010) is a media player designed to play back video sequences
based on their association with specific real-world objects. It uses RFID technology to sense
physical objects to which RFID tags have been affixed. The project proposes that a link can be
made between physical objects and digital media based on meaningful associations. For
example, a figurine of a cartoon character will trigger that particular cartoon to begin playing
once it is placed into the Skål (bowl) reader.
In an age where digital content has become somewhat worthless and throwaway, Skål elegantly
demonstrates that digital media can be given meaning once more through physicality and
tangibility. Though seemingly quite technologically and theoretically basic, this project was of
immense inspiration to much of the technological exploration outlined further on in this thesis.
3.4: Skål (Creative Applications Network 2009)
3.5.1.2. Collaborative: IDEO's C60 Music Player
Elaborating on the concept behind Skål, the design consultancy IDEO designed a dynamic
media player called C60 (Hartmann 2011) where RFID tags are used to create tangible music
playlists. Links to streamed music are stored on RFID enabled cards, and users play this music
by dropping the cards onto the surface of the music player. If multiple cards are dropped onto
the surface of the player at once, then the music contained on all of these cards is combined to
form a playlist.
This project is noteworthy as it brings physicality back to the experience of casually listening
to music. Though the underlying music is still digital in character, the system serves as a
reminder of the forgotten joy and ritual involved in the physical manipulation of tangible music
media such as records and cassette tapes. The research of this thesis considers that the C60 player
is not merely a novelty but a valuable project that contradicts the throwaway attitude to digital
music that has emerged with the introduction of new technologies such as streaming.
3.5: C60 Music Player (Designboom 2010)
3.5.1.3. Musical Trinkets
In the Musical Trinkets project (Paradiso et al. 2000), various small tangible objects are used
as control mechanisms in a musical composition. The ‘trinkets’ are an assortment of small toys
embedded with magnetically-coupled resonators which act as RF tags. A circular coil is driven
at various frequencies to create an AC magnetic field into which the tagged toys are introduced.
Each tagged object is assigned its own resonant frequency, and so by sweeping the coil
through a range of frequencies, multiple tags can be identified simultaneously. Each
object is then used to trigger different tones and harmonies within a musical soundscape
performance. This project demonstrates the capability and the power of using everyday objects
as tangible tokens within a musically performable TUI.
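The sweep-based identification described above can be sketched as peak matching: a tag present in the field appears as a response peak near its assigned frequency. The tag names, frequencies, threshold and response model are all assumptions for illustration.

```python
# Illustrative sketch of tag identification by frequency sweep, as in Musical
# Trinkets: each tag resonates at its own frequency, so tags present in the
# field appear as response peaks near their assigned frequencies. The tag
# names, frequencies and threshold below are invented for the example.

TAG_FREQS = {"star": 50_000.0, "ring": 70_000.0, "fish": 95_000.0}  # Hz

def identify_tags(sweep, threshold=0.5, width_hz=1_000.0):
    """sweep: list of (frequency_hz, response). Return names of resonating tags."""
    present = set()
    for freq, response in sweep:
        if response < threshold:
            continue  # no significant resonance at this frequency
        for name, f0 in TAG_FREQS.items():
            if abs(freq - f0) <= width_hz:
                present.add(name)
    return sorted(present)
```

Because each tag occupies its own frequency band, a single sweep can report every object in the field at once, which is what allows several trinkets to shape the soundscape simultaneously.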
3.6: Musical Trinkets (Paradiso et al. 2000)
3.5.1.4. Music Blocks
Music Blocks (Sosoka et al. 2002) is a commercially available music toy that is aimed at young
children. It consists of a base platform and a set of five colourful blocks. Within the system,
each block represents a unique melodic musical phrase. The base contains five openings, and
a musical sequence is created by placing the blocks within these openings in any desired order.
Timbral changes are also made possible by changing the orientation of the blocks. This product
highlights the appeal of the cube as a graspable manoeuvrable object, the shape and size of
which render it an engaging tangible token that encourages play and discovery.
3.7: Music Blocks (Fatbraintoys 2018)
3.6. Empirical Research
Before commencing the design phase, it was considered that empirical research into the
practices of the core stakeholders of this research - Ableton performers - would provide a useful
perspective to inform further designs. This viewpoint was probed via two informal interviews
and a brief questionnaire. The findings from both undertakings are outlined here.
3.6.1. Interviews
Before this research began, two Ableton based live music performances were attended.
These were noteworthy as they featured extreme use cases of Ableton. In both cases, the
performers physically repositioned the laptop and MIDI controller in order to alleviate what
they considered to be flaws in the control and the audience spectacle of their Ableton based
performance. This section will detail feedback from interviews with the artists that took place
subsequent to their performances. These serve to underline shortcomings in Ableton based
laptop performance from the practising musician’s perspective. These insights will have a
significant influence on future design.
3.6.1.1. Extreme Use Case #1
3.8: Pre-gig photo depicting the duo's laptop beneath the table (photo courtesy of the artists)
3.6.1.1.1. Case #1 Summary
The first performance was by a live techno duo, who use both Ableton and hardware
synthesisers in their performances. During one of their gigs, it was observed that their laptop
(which was running Ableton) was hidden under the table (3.8). The pair were subsequently
contacted for further insights as to why this was done. They responded with three primary
reasons:
Reason 1:
• “We didn't like how a laptop looks on stage - it being a tool pretty much everyone uses
and for pretty much anything you can think of. It just seems a little too every day and
miscellaneous to have a machine you use to do emails and online banking alongside a
bunch of purpose built music machines."
Reason 2:
• “To prevent the audience from thinking the music was coming from a laptop, and
therefore rid any potential suspicion that it could be just a DJ/mp3 show instead of a
'live' show.”
Reason 3:
• “To distance ourselves from a focal point which may unnecessarily occupy our
attention and distract ourselves from focusing on the gear”
3.6.1.1.2. Case #1 Analysis:
The first two responses shed light on certain issues for electronic music performance from the
audience’s point of view. The first echoes Cascone’s (2002) opinion that the laptop is too
synonymous with the mundane to be considered suitable for music performance, while the
second relates to Pedersen and Hornbæk’s (2009) study, where audiences were seen to doubt
the ‘liveness’ of laptop performance. Their final reason demonstrates that the laptop also poses
an issue for performers in the form of an unnecessary distraction.
3.6.1.2. Extreme use case #2
3.9: Interviewee performing with controller tilted towards the audience (photo courtesy of the artist)
3.6.1.2.1. Case #2 Summary
Similar to the duo mentioned above, the next artist also hid his laptop from view while
performing. However, this performance was notable more so for the manner in which he
interacted with his MIDI controller. Bizarrely, his MIDI controller was not placed upon a
tabletop, but instead on an instrument stand with its fascia tilted towards the audience. As such,
during the performance, it was observed that his gestures and stance alluded more to those of
a guitar player than a laptop musician. When this performer was asked why he performed in
such a way, he cited two main reasons:
Reason 1:
• He wanted the controls to be visible to the audience so that they would be better able
to see (a) his performance actions and (b) the spectacle of the controller’s LEDs
changing colour in accordance with his actions.
Reason 2:
• He felt that it was much more ergonomically comfortable to tilt the unit, and in this
way, it also felt more like he was performing with a real musical instrument.
3.6.1.2.2. Case #2 Analysis:
This performer’s insight underlines that the MIDI controller poses certain issues in live
performance that affect both the audience and performer. His response indicates that gestures
with tabletop MIDI controllers are predominantly hidden from the audience, and therefore may
lack visual appeal. Secondly, he also suggests that MIDI controllers fall short of the feel of real
instruments.
3.6.1.3. Overall Analysis of Interviews
From these interviews, the perspectives of the artists are summarised as follows:
Performer perspective:
• The laptop can be a distraction.
• MIDI controllers fall short of the feel of real musical instruments.
Audience perspective:
• Laptops are too synonymous with the mundane to be an engaging spectacle.
• Audiences do not trust laptop performances.
• MIDI controllers hide the performer's actions from the audience.
These observations have indicated that certain artists feel the need to augment their setup in
order to navigate perceived shortcomings in Ableton performance. However, to obtain a more
elaborate impression of user perspectives and a more balanced view, this information was
synthesised into a brief user survey that was distributed amongst a much broader demographic.
3.6.2. Ableton User Survey
A concise survey of 7 questions (outlined in Appendix 9.1.1) was distributed amongst an
international Ableton user group on Facebook. The survey related to areas that were deemed
to require further clarification, such as:
• The perceived impact of the computer screen on live performance.
• The perceived need for overtness when performing with Ableton.
• Audience interaction in Ableton performances.
The survey was conducted between June 20th and June 30th, 2018. The target group were users
who had previously performed in front of a live audience with Ableton. In total, 36 replies were
collected. The questions were given in the form of a Likert scale, where users were asked to
respond to strong statements on a seven-point scale. It was considered that this method would
give an accurate indication of user attitudes.
3.6.2.1. Survey Findings
The most relevant findings are as follows:
• Many respondents attested that the laptop does indeed distract them from the
performance.
• Many respondents believed that Ableton controllers obscure their performance actions.
The responses indicated that many users felt they had to compensate for this through
exaggeration.
• There was considerable agreement that the laptop on stage is detrimental to the energy
between the audience and performer.
3.6.2.2. Survey Limitations
This survey indicates that many users perceive certain failings when performing with Ableton.
However, it served only as a rough overall guide for design because, as it was shared to an
Ableton user forum, it is possible that the respondents were fans of the program and therefore
less willing to criticise it.
3.7. Background Conclusion
The background research presented in this chapter has highlighted that TUIs, despite certain
limitations, are ideally suited to music performance. To address the research questions of this
thesis, this was considered from the perspective of both the performer and the audience. A
further examination of traditional control methods for computer music performance
highlighted how they constrain bodily movement and diminish spectacle. The insight
of core stakeholders was then considered in order to form a more involved overall view of these
issues.
The primary findings of this background research are now presented in the form of a table,
which clearly outlines several primary concerns while suggesting how each can be alleviated
through the application of an alternative TUI based control system. The ensuing design phase
will be tasked with addressing each of these concerns.
Issues with Traditional Control Methods → How TUIs can Address These Issues
• Laptop screen distracts performers → TUIs do not require a screen
• MIDI controllers lack the feel of real instruments → TUIs promote expressive hands-on interaction that is akin to a real instrument
• MIDI controllers require exaggerated movement to be perceived by an audience → TUIs permit more expressive, visible movement
• Laptops are not an engaging spectacle to an audience → TUIs with an intriguing interface consisting of curious objects can be more engaging to an audience
• Laptop & MIDI controller performances obscure performers' actions → A TUI system can make performers' actions more overt
• Audiences distrust laptop performances → A TUI system can eliminate the laptop from the setup entirely
Table 1: Primary findings of background research
4. CHAPTER 4: METHODOLOGY
To recap, the research questions of this thesis are outlined as follows:
• Can a TUI for Ableton Live, with a control system consisting of graspable, moveable
physical objects promote a more expressive means of control when compared to
traditional MIDI devices?
• By making a performer's actions overt through the manipulation of physical objects,
can a TUI for Ableton Live communicate a more open and legible performance to an
audience than those that are confined to buttons, knobs and sliders?
This research approaches these questions in two ways. Firstly, a review of relevant literature
and empirical data provided a broad contextual understanding of how traditional interaction
with DAWs inhibits bodily movement and of how TUIs can alleviate this. Secondly,
a series of design studies were conducted which followed a 'sketching in hardware' approach.
These prioritised building and testing over thinking and meeting and formed the basis for a
brute force attack on the possibilities for tangible interaction with Ableton. This chapter will
now detail the chosen methods of data collection for this research and justify their reliability
and validity.
4.1. Sketching in Hardware
Much of the research of this thesis was conducted via a 'sketching in hardware' (Holmquist
2006) approach: a technique similar to sketching on paper, but one that instead uses tangible
electronics to explore design ideas in an open and explorative fashion.
By following this approach, basic working prototypes were continually developed in order to
gather feedback and move the design process forward as quickly as possible. This was
facilitated by physical computing platforms tailored towards rapid prototyping such as Arduino
which, as Bauer (2010) states, make it extremely quick to sketch in hardware and build basic
functional prototypes.
Even though many of these prototypes were designed rapidly, building them was still a
labour-intensive endeavour. Therefore, given the time constraints, a strict user-centred design approach to data
collection which involved heavy user input from the outset did not seem appropriate. It was
felt that regular workshops and focus groups would slow down the creative process. Therefore
a bias toward action (Bias Toward Action n.d.) approach was preferred, whereby the creation
of tangible ideas was given precedence over meticulous user research.
4.2. Approach
This research is intended to benefit other Ableton users, and as such, user input was required
throughout; particularly in order to test the aforementioned explorative prototypes. As
O’Modhrain (2011) outlines, it is difficult to evaluate the design of digital musical instruments
(DMIs) due to the various stakeholders involved (e.g. audience, performer, designer). Several
studies (Stowell et al. 2009; Gelineck and Serafin 2009) have highlighted that quantitative
usability measures are ill-suited to the evaluation of DMIs for music performance, arguing that
the subtle and nuanced nature of performance cannot be condensed into overly simplistic data.
As Stowell (2009) outlines:
Musical interactions have creative and affective aspects, which means they cannot be described
as tasks for which, e.g. completion rates can reliably be measured. (ibid, p.1).
Therefore it was considered that a qualitative approach to data collection, where open questions
are subsequently analysed through discourse analysis, as proposed by Stowell (ibid), would
encourage more insightful feedback and provoke more innovative ideas.
4.3. Data Collection Methods
During this study, several different data collection methods were employed. These ranged in
complexity from highly informal to highly focused, depending on the level of scrutiny required
at the time. Similarly, candidates from different backgrounds were chosen at different stages
depending on data requirements. At certain times, a mix of candidates from both musical and
non-musical backgrounds were chosen so as to provoke more open design ideas, while in
subsequent tests where technological or live performance aspects were to be scrutinised, only
experienced Ableton users were recruited.
The separate empirical approaches that this study followed are outlined in the tables below:
Research Phase

Strategy: Online Survey
Approach: Quantitative
Aim: To understand if the laptop poses an issue for Ableton performers.
Sample: 36 Ableton users who have performed live in the past.
Question Types: Likert scale questions.
Analysis Method: Descriptive analysis of bar chart.

Strategy: Semi-Structured Interviews
Approach: Qualitative
Aim: To understand performance issues with MIDI controllers & why performers tend to hide their laptop.
Sample: 2 artists who are Ableton performers (extreme use cases).
Question Types: Open questions.
Analysis Method: Content analysis.

Table 2: Data collection strategies for research phase
Sketching in Hardware Phase

Strategy: Online Crowdsourcing (via video posts on social media)
Approach: Qualitative
Aim: Quick feedback on design concepts from a broad demographic.
Sample: Shared amongst Ableton user group & friends who are Ableton users.
Question Types: Open & non-specific questions (e.g. "Do you like this design?").
Analysis Method: Basic gauging of enthusiasm via comments.

Strategy: Informal Demonstrations
Approach: Qualitative
Aim: Rapid feedback on design concepts via prototype demos.
Sample: 10+ Ableton users, friends & colleagues.
Question Types: Open questions.
Analysis Method: Basic gauging of levels of enthusiasm.

Strategy: Group Testing - Radical Collaboration
Approach: Qualitative
Aim: To obtain diverse insights from users of varying backgrounds.
Sample: 4 candidates (mixed Ableton and non-Ableton users).
Question Types: Open questions.
Analysis Method: Discourse analysis.

Strategy: Video Recording & Interview
Approach: Qualitative
Aim: To observe user performance actions. To obtain feedback on how to make the system musically performable.
Sample: 2 Ableton performers.
Question Types: Open questions.
Analysis Method: Reviewing footage and documenting user interactions. Discourse analysis.

Table 3: Data collection strategies for sketching in hardware phase
4.3.1. Detailing Chosen Data Collection Methods
4.3.1.1. Online Survey
Reason for Method Choice
A survey is "a comprehensive system for collecting information to describe, compare or explain
knowledge, attitudes and behaviour." (Kitchenham and Pfleeger 2002, p.16). In order to
compare the attitudes and behaviour of a large number of Ableton users towards the laptop and
MIDI controller in live performance, a concise survey of 7 questions was distributed amongst
an international Ableton user group on Facebook.
How Sample was Chosen
The only criterion for participation was that respondents had performed live with Ableton in
the past. However, recipients were asked to state their age, preferred music genre and overall
experience with Ableton in order to determine whether any of these had a bearing on the retrieved data.
Questions Posed
The questionnaire consisted of Likert scale themed questions, where respondents were asked
to indicate their agreement with a particular statement using a seven-point scale, which ranged
from "strongly agree" to "strongly disagree". The survey covered areas that needed more clarity
such as:
• Preferred means of control over Ableton Live.
• The perceived impact of the computer screen in a live performance.
• The perceived need for overtness of performer actions.
• Audience interaction.
Validity & Reliability
A potential pitfall of online surveys is a low response rate (Evans and Mathur 2005). This issue
was tackled by first making the survey extremely concise and secondly, by introducing it in
such a way that would be appealing or intriguing to respondents (see Facebook posts in
Appendix A: Survey). It has been noted that with surveys, the obtained results may be lacking
in sufficient depth or details on the topic at hand (Kelley et al. 2003), however, in the case of
this research, this was considered to be heavily offset by a high volume of responses from an
extremely wide demographic.
An observed limitation to this survey is that, as questions were highly critical of Ableton Live,
it is possible that this criticism may have led to biased responses. As the survey was shared
within an Ableton live user group, it is likely that respondents were fans of the program, and
therefore they may have been less likely to criticise the platform.
4.3.1.2. Telephone Interviews
Reason for Method Choice
Two brief telephone interviews were conducted in order to query Ableton performers about
their technical performance setup informally.
How Sample was Chosen
Before this research began, two exceptional live music performances involving Ableton
were attended where either the laptop or MIDI controller was being used in an unorthodox fashion.
It was subsequently decided to contact the musicians involved in order to obtain deeper
insights.
Questions Posed
Questions were asked in an open fashion, focusing on the role of the laptop and MIDI controller
in live performance. The elaborate workarounds that both sets of interviewees had devised
made it clear that they saw limitations with laptop/MIDI controller based performance. The
questions asked were an attempt to probe this further.
Validity & Reliability
Though the absence of visual cues might result in the loss of some contextual data (Novick
2008), a telephone-based approach was chosen to make respondents feel relaxed enough to
disclose sensitive information more openly.
4.3.1.3. Crowdsourcing
Reason for Method Choice
Crowdsourcing is a web-based method of evaluation that potentially provides access to the
opinions of millions of individuals.
How Sample was Chosen
In the middle stages of design, several videos of prototypes were shared online via Facebook
status updates or to Ableton specific Facebook user groups.
Questions Posed
These posts contained a short description and provocative questions which were intended to
encourage commentary. Several examples of these posts are given in Appendix A: Survey Posts
on Social Media.
Validity & Reliability
Though it was found that responses lacked depth (e.g. respondents giving “likes” without any
further commentary), the method succeeded in opening up the design process to criticism from
a broad user range at an early stage. As Bennett (2011) has observed, online crowdsourcing in
this manner serves as “a way to elicit more feedback from potential users than would be
possible by any other means.” (Bennett 2011, p.108)
4.3.1.4. Informal Demonstrations
Reason for Method Choice
In much the same way as crowdsourcing, informal demonstrations of basic prototypes served
as a means to obtain rapid feedback on design concepts.
How Sample was Chosen
These were conducted primarily amongst a circle of friends and acquaintances who are avid
Ableton users.
Questions Posed
Tests typically consisted of a quick demonstration and an invitation to interact followed by a
debriefing.
Validity & Reliability
Though these tests were highly informal and unstructured, they were generally found to be
useful in quickly gauging the first impressions of the core stakeholders: Ableton users.
4.3.1.5. Group Evaluation
Reason for Method Choice
Group evaluation following a 'thinking aloud' (Van den Haak et al. 2004; Als et al. 2005)
approach was intended to encourage inter participant conversations, which would possibly
uncover more profound insights.
How Sample was Chosen
Users of mixed musical backgrounds were chosen as it was believed that, if the system were
usable by the uninitiated, then it would most likely also be usable by an expert.
Questions Posed
At this stage, prototypes were purely conceptual, and therefore the ways in which they might be
used were still open to interpretation. For this reason, Sengers and Gaver's (2006) method of
leaving their intended use open to multiple interpretations was applied:
No single one of these perspectives may necessarily be “correct”; instead, all may be useful in
highlighting aspects of how systems will be understood, be used, and find roles in individual’s
and community’s lives. (ibid, p.3)
Accordingly, at the outset, users were given minimal guidelines on how to use the system, and
their subsequent interactions were observed.
Validity & Reliability
It was found that treating the user as the expert in this fashion was effective in learning
unexpected new ways in which the prototype could be interacted with.
4.3.1.6. Expert Reviews
Reason for Method Choice
Video recording serves to document data that can be missed by interviewing alone (Mackay
2002). As this later phase of design was focused on how to apply the system in a performance
context, tests were moved to a more controlled environment where video recording could
document user interactions that were more nuanced.
How Sample was Chosen
Expert commentators were deemed more suitable as test candidates. Therefore, two users who
had extensive knowledge of live performance and Ableton were chosen.
Questions Posed
An interview was conducted with each candidate in talk aloud fashion, both during and after
testing the system. Questions posed related to the usability of the system from a music
performance perspective. Users were asked to be critical in recommending changes that would
enhance the system’s musical capabilities.
Validity & Reliability
Talk aloud is widely used in HCI, however, in music performance, it can be disruptive to the
flow of interaction (Stowell et al. 2009). Therefore every effort was made to talk only between
interactions and not while the user was interacting, so as not to disturb the test too much. It is
understood that during such observational procedures there can be slight inaccuracies in data.
Due to the Hawthorne Effect, users may change their behaviour if they know they are being
monitored (Interaction Design Foundation 2018). However, the test was conducted in an
informal and non-procedural fashion, where users were asked to comment on issues as they
arose. It is believed that this relaxed atmosphere encouraged users to behave more naturally,
therefore providing more trustworthy data.
4.4. Data Analysis
4.4.1. Discourse Analysis
As previously outlined, much of the empirical data in this study was collected through
qualitative methods such as conversations and interviews. The majority of user feedback was
transcribed from clear and comprehensible responses to questions, which were scrutinised
relatively easily and incorporated into future design decisions. However, as these tests were
predominantly informal, much of the language flowed naturally and in a conversational style.
Accordingly, slang was often used, or sentences were left unfinished, meaning that some data
was lost. In these cases, discourse analysis provided the deeper level of scrutiny that was
required to probe for adequate insight. As Stowell has outlined, the role of discourse analysis
in evaluating DMIs is essential and can be summarised as follows:
When evaluating musical interfaces, discourse analysis can be applied as a structured approach
to uncovering connections and hidden meanings in interviews, while maintaining the integrity
of the original text. (Stowell et al. 2008).
4.4.2. Video Analysis
Studies such as those by Hornecker et al. (2008) and Xambo et al. (2013) have demonstrated
that video recordings can be an excellent means of evaluating TUIs. Xambo (ibid) cites Jordan’s
(1996) observation that video can also reveal hidden information in user actions that their
words belie:
One situation for which video provides optimal data is when we are interested in what ‘really'
happened rather than in particular accounts of what happened, including people's recollections
and opinions. (ibid, p.35).
As this thesis concerns a music performance system that involves subtle gestures and an
unorthodox means of control, video recording provided a way of capturing insights that would
be missed in the analysis of discourse alone.
4.5. Reliability, Validity, Generalizability and Limitations
As this section has underlined, data was often gathered through methods that can be regarded
as somewhat unstructured and unorthodox (e.g. posts to social media). However, in the interest
of keeping the creative process in motion through a ‘bias toward action’ approach, feedback
was obtained as often as possible and by whatever means possible. The research was
undertaken with the viewpoint that any feedback, however informal, was of some value to the
design process. As the nature of exploratory prototyping is to continually produce new and
sometimes unpredictable designs, such an agile and varied approach to data collection
proved to be well equipped to tackle the ever-changing design directions that transpired
throughout this thesis.
5. CHAPTER 5: EXPLORATORY DESIGN
5.1: Phase 1 of Design
5.1. Introduction
This chapter serves to document phase 1 of the design process which is primarily concerned
with the generation of design ideas. As a 'sketching in hardware' approach is adhered to, these
ideas take both conceptual and physical form. The chapter documents how an initial brute force
attack on Ableton explores where opportunities for tangible interaction lie beyond traditional
control mechanisms. This phase culminates in the assertion that NFC/RFID technology is a
solid basis for further design, leading to a second prototyping phase that is concerned with
augmenting and refining this technology to suit a live music performance setup (outlined in
CHAPTER 6: ITERATIVE DESIGN).
5.2. Ideation
5.2.1. Criteria to Guide Design
During this ideation phase, in order to better guide design, it was first necessary to lay down
several core design requirements for the envisaged TUI. These requirements were distilled
from references to relevant literature, relevant projects and empirical research. They are
outlined as follows:
1. Design should focus on a system that is performable as an instrument - one that is both
engaging for the performer to use and visually engaging to the audience.
2. Design should be centred on interactions that involve physically moving graspable
objects around the performance space. There should be minimal need for buttons, knobs
or sliders.
3. Design should focus on eliminating the need to reference a computer screen. Users
should be able to gauge the state of the system immediately by merely visually referring
to the tangible objects before them.
5.2.2. Brainstorming & Bodystorming
The next step was to brainstorm possibilities for tangible interaction through active engagement
with Ableton itself. In order to provoke ideas more broadly, all aspects of interaction were
considered, including non-performance based administrative tasks. In several sessions of active
use with traditional control mechanisms (i.e. a mouse, a keyboard and a MIDI controller; see 5.2),
careful attention was paid to specific tasks in the act of ‘doing'. A first-hand critique of these
input mechanisms provided a better understanding of which aspects were open to being
augmented via a TUI.
5.2: Brainstorming setup: Using traditional controls as a basis for ideation
5.3: Brainstorming documentation
Bodystorming was advantageous in attempting to understand the kinds of objects that might be
suitable for tangible interaction with Ableton. This technique entailed playful experimentation
with shapes of different textures and sizes.
5.4: Bodystorming with various graspable shapes
5.2.3. Mind Mapping
Ideas from the brainstorming phase were grouped more logically into a mind map. Grouping
related functions into coherent categories increased the overall understanding of where
different forms of tangible interaction could be implemented to control different tasks.
5.5: Mindmap
5.2.4. Sketching
Sketching played a significant part in design ideation. Written information documented through
brainstorming and mind mapping was subsequently given visual representation as sketches.
Firstly, these communicated clearer design thinking to the author, and secondly they articulated
design ideas more clearly to others in such a way as to provoke more accurate feedback. Early
sketches were imaginative and open in order to encourage more thought-provoking discussion.
As ideas were validated and refined, sketches accordingly became much more detailed.
5.2.5. Storyboarding
Storyboards helped to conceptualise various use cases in a system based on tangible
connections to digital music. These were more focused on the actual user experience rather
than artefact design or technology. They served primarily to visually communicate concepts of
the proposed system to test candidates.
The below storyboard (5.6) explores how to create meaningful associations between objects
and sounds. The persona of a performing field recordist was chosen as an example of a typical
user who might potentially make use of this system. Here the field recordist digitally records a
seaside soundscape that will later be played in a live Ableton performance. He then takes a
memento as a reminder of the environment in which that audio was recorded. It was envisaged
that this object would then serve as the tangible trigger for the sounding of the recorded ocean
soundscape within his performance.
5.6: Storyboard depicting the potential attachment of sounds to tangible objects
The below storyboard then depicts a crowd’s interpretation of that same performance, in
comparison to a traditional laptop-based performance. The scene on the right depicts a unique
scenario where a table is laid out with a visible array of curious objects - a scenario that might
inspire much more inquisitiveness and engagement from the audience. On the other hand, the
performer on the left is heavily focused on the computer, which is depicted as a barrier between
him and his audience.
5.7: Storyboard depicting two separate performance scenarios: Closed laptop vs the open plan TUI
5.3. Parallel Prototyping Stage
5.3.1. Introduction
This section outlines the first phase of design, which is primarily concerned with the building
of rapid, thought-provoking prototypes that are intended to explore the design space
imaginatively and encourage discussion. The section documents the progression in design
thinking, beginning with a more technologically focused remediation of existing designs as the
means of ideation, before moving to more task-specific design.
As already outlined, phase 1 of prototyping was conducted through a 'sketching in hardware'
approach where design ideas were rapidly explored. An iterative prototyping approach at this
stage would limit design ideas by focusing on a single solution (Nielsen 2011). Therefore, a
parallel prototyping approach was followed where several alternative design ideas were tackled
concurrently. Only once several evaluations had established a promising route forward did an
iterative phase of prototyping begin, focused narrowly on exploring and improving a single
solution (5.8).
5.8: Parallel and iterative phases
5.3.2. Parallel Prototype #1: Exploring Tech Focused Design
Even though several promising design concepts had been laid out in the ideation phase, it
seemed that the generation of more imaginative design possibilities was inhibited due to a lack
of understanding of enabling technologies. For this reason, broadly probing technologies used
in past successful TUIs was considered beneficial. A crude working prototype was constructed,
and it was through playful experimentation with the technology itself that further design
concepts were explored.
5.3.2.1. Motivation
The Reactable system is inspirational to the research of this thesis for several reasons. It
speaks directly to the research questions of this thesis, as it facilitates richly embodied and
expressive musical performance for the user while opening up the performance to the audience,
in that the performer's actions are overtly visible. Furthermore, the system is open source and
‘hackable’. For these reasons, its technological framework was deemed to be
promising ground for exploration.
5.3.2.2. Description
An Instructables article (Convivial Studio 2017) revealed how to hack an iPad8 touchscreen,
enabling it to host a crude version of the Reactable system. Example code was provided
as well as instructions on how to construct homemade tangibles that could be recognised by
the touchscreen surface. The touchscreen establishes the XY coordinate and rotation of each
tangible via three strategically placed conductive touch points on each. These points form an
isosceles triangle, allowing each tangible to be recognised independently through its unique
apex angle. The code was written using Openframeworks9 (C++) and subsequently transferred
to iPad as an app.
8 https://www.apple.com/ie/ipad/
9 https://openframeworks.cc/
5.9: App is coded on Macbook and transferred over WIFI to be hosted on iPad
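To make the apex-angle recognition scheme concrete, the following is a minimal C++ sketch of how a tangible might be identified from its three touch points. This is an illustrative reconstruction under the description above, not the Instructables code itself, and the function and type names are hypothetical:

```cpp
#include <array>
#include <cmath>

struct Point { double x, y; };

static double dist(const Point& a, const Point& b) {
    return std::hypot(a.x - b.x, a.y - b.y);
}

// The three touch points form an isosceles triangle. The apex is the
// vertex whose two adjacent sides are closest in length; its angle is
// what uniquely identifies each tangible.
double apexAngleDegrees(const std::array<Point, 3>& p) {
    int apex = 0;
    double bestDiff = 1e9;
    for (int i = 0; i < 3; ++i) {
        double s1 = dist(p[i], p[(i + 1) % 3]);
        double s2 = dist(p[i], p[(i + 2) % 3]);
        double diff = std::fabs(s1 - s2);
        if (diff < bestDiff) { bestDiff = diff; apex = i; }
    }
    double leg1 = dist(p[apex], p[(apex + 1) % 3]);
    double leg2 = dist(p[apex], p[(apex + 2) % 3]);
    double base = dist(p[(apex + 1) % 3], p[(apex + 2) % 3]);
    // Law of cosines: base^2 = leg1^2 + leg2^2 - 2*leg1*leg2*cos(apexAngle)
    double cosA = (leg1 * leg1 + leg2 * leg2 - base * base) / (2.0 * leg1 * leg2);
    return std::acos(cosA) * 180.0 / std::acos(-1.0);
}
```

The tangible's rotation would then follow from the on-screen orientation of the apex vertex relative to the triangle's base.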
5.3.2.2.1. Constructing the Tangibles
Any graspable object could serve as a tangible and a pdf printout was supplied to guide touch
point placement on each. The touch points were constructed from three identically sized foam
circles. Trails of conductive ink were made between each touch point and the top side of the
tangible. The ink ensured contact was made between the fingers and touch points underneath.
Without this capacitive connection, the system would not function.
5.10: The printout that was supplied as a guide for touch point placement
As the iPad screen is quite spatially restrictive, small-scale tangibles were first constructed.
However, their compactness meant that the apex angles were not accurately recognised and
their flimsiness made them difficult to grasp and manoeuvre.
5.11: Phases of construction of smaller cardboard tangibles
Larger tangibles in the form of bound stacks of poker chips proved to be much easier to
manipulate. The added weight also meant that they made better contact with the screen.
5.12: Larger tangibles were more graspable and stable
5.3.2.2.2. Proof of Concept: Max MSP Communication
By augmenting the C++ code, the XY coordinates and rotational angle values of a tangible were
transmitted to Max MSP using the Open Sound Control10 (OSC) protocol. As a proof of concept, a
step sequencer that allowed two tangibles to act autonomously when used together was
implemented in Max MSP. The Openframeworks app was reprogrammed to divide the iPad screen into
four sections, each representing one step in a four-step sequence. When a tangible is placed
upon a square on the iPad screen, it has autonomous control over that step. For visual
feedback, as each step is triggered, its corresponding square illuminates. The rotation angle of
tangible number one was given control over the pitch of whatever step it was placed upon,
while tangible two was given control of filter cutoff. (A video of the sequencer in action can be
found here: https://vimeo.com/263769511).
10 http://opensoundcontrol.org/introduction-osc
5.13: Image of the app interacting with Max (hosted on the computer). The blue tangible controls the
pitch of the note occurring on each step. Turning the tangible clockwise increases the pitch.
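The screen division and rotation mapping described above can be sketched as two small pure functions. This is an illustrative reconstruction rather than the project's actual code; the row-major step ordering and the rotation-to-pitch range are assumptions:

```cpp
#include <cmath>

// Map a tangible's XY position to one of the four sequencer steps.
// The screen is treated as a 2x2 grid with steps numbered 0-3 in
// row-major order (an assumption; the app's exact ordering may differ).
int stepForPosition(double x, double y, double screenW, double screenH) {
    int col = (x < screenW / 2.0) ? 0 : 1;
    int row = (y < screenH / 2.0) ? 0 : 1;
    return row * 2 + col;
}

// Map the tangible's rotation onto one octave of MIDI pitches starting
// at middle C (MIDI note 60); the specific pitch range is illustrative.
int pitchForRotation(double degrees) {
    double norm = std::fmod(degrees, 360.0) / 360.0;
    if (norm < 0.0) norm += 1.0;
    return 60 + static_cast<int>(norm * 12.0);
}
```

In the actual prototype these values would be computed in the Openframeworks app and sent to Max MSP over OSC each time a tangible moved.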
5.3.2.2.3. Potential use with Ableton Live (Via Max MSP)
While the Max MSP sequencer was proof that OSC could be effectively transmitted, attention
now turned towards how it might be implemented with Ableton. This issue was first explored
through sketching. The below sketch (5.14) depicts how tangibles could be implemented as
meaningful representations of virtual effect units with Ableton. In this system, the tangibles
serve to both represent and control presets – in this case, two reverberation presets. The can of
beans on the left represents the “My Singing Can” preset (circled red), while the house shaped
tangible represents the “My Cathedral” preset (circled blue). The dotted line in the sketch
depicts how the iPad screen could be divided to create a boundary within which each tangible
has autonomous control over its associated preset.
5.14: Sketch depicting how two tangibles can control two separate Ableton effects
5.3.2.2.4. Technical Implementation of Sketch
The sketched concept was realised by placing fiducial markers on the bottom of two tangibles
- a tin can and a Lego representation of a Cathedral (5.15). In this fashion, the physical
movement of the tangibles could be mapped to plugin parameters within Ableton over OSC.
5.15: Tin can and Lego cathedral tangibles (with fiducial markers attached) on iPad screen
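For illustration, the following sketch shows how an OSC message carrying a tangible's position might be assembled by hand, following the OSC 1.0 encoding rules (null-padded strings, a type-tag string, big-endian arguments). The address pattern is hypothetical, and in practice an openFrameworks addon such as ofxOsc would build these packets:

```cpp
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// Append a string null-terminated and padded to a multiple of four
// bytes, as the OSC 1.0 specification requires.
static void appendPadded(std::vector<uint8_t>& buf, const std::string& s) {
    for (char c : s) buf.push_back(static_cast<uint8_t>(c));
    buf.push_back(0);
    while (buf.size() % 4 != 0) buf.push_back(0);
}

// Append a 32-bit float in big-endian byte order.
static void appendFloat(std::vector<uint8_t>& buf, float f) {
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof bits);
    buf.push_back(bits >> 24);
    buf.push_back((bits >> 16) & 0xff);
    buf.push_back((bits >> 8) & 0xff);
    buf.push_back(bits & 0xff);
}

// Build a complete OSC message carrying a tangible's x/y position,
// e.g. to the hypothetical address "/tangible/1/xy".
std::vector<uint8_t> encodeXY(const std::string& address, float x, float y) {
    std::vector<uint8_t> buf;
    appendPadded(buf, address);
    appendPadded(buf, ",ff");  // type tag: two float arguments
    appendFloat(buf, x);
    appendFloat(buf, y);
    return buf;
}
```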
A Max patch was then designed to act as an intermediary for OSC messages between the iPad
app and Ableton. The rotational angle and XY position of both tangibles were now
independently assignable to any MIDI parameter. These were mapped to two reverb presets as
depicted in 5.14.
5.16: Control flow from iPad to Max MSP (left) and Max MSP patch (right)
5.3.2.3. Evaluation
The system was demonstrated in two presentations. These presentations also demonstrated the
research behind it, ensuring that feedback related not just to the novelty of interaction, but to
the overall concept. After this, several friends and colleagues from varying musical
backgrounds were invited to interact with the prototype in an informal setting. While all users
appreciated the novel aspect of the overall concept, most of the feedback was highly critical,
particularly on technological aspects. Some of the user comments are as follows:
• There was not enough room to move the bulky tangibles around the small iPad screen.
• The capacitive connection continually broke (it was observed that users would either
forget to touch the conductive strip of paint or place too much pressure on one side of
the tangible, causing a connection point to become detached from the screen).
• Movement was jerky and sometimes unresponsive, so there was a constant need to refer
to Ableton’s GUI to see if mapped parameters were visually changing.
• Users were nervous of scratching the screen with the tangibles.
5.3.2.4. Conclusion
Aside from its technical faults, there were several key reasons not to pursue this line of inquiry
any further. While this prototype facilitated tangible interaction to a degree, the remote control
aspect meant that, in reality, the system was merely one small screen with no GUI being used
as the control device for information located on the GUI of another larger screen. It seemed
that Reactable was comparatively much more effective as a TUI because, if a tangible is moved,
animations move in correspondence, so there is much more of a direct coupling between the
tangibles and the underlying digital information.
The difficulty and complexity of pursuing this endeavour in return for limited results prompted
a shift in focus. It was decided to instead look towards more basic technologies to solve tasks
in a case by case fashion rather than searching for an overlying umbrella technology on which
to base the entire system.
5.3.3. Playful Task Focused Design
After more extensive technologies had been explored without producing significant
results, it was thought that a shift in focus towards smaller tasks might yield fresh design
inspiration. Therefore, the focus switched to designing for Ableton at a micro level based on
requirements outlined in the brainstorming phase. This phase was more task focused, with the
technology employed being a secondary consideration. The technology was kept deliberately
simple so as to facilitate rapid prototyping and feedback.
5.3.3.1. Parallel Prototype #2: Bringing Tempo Under Tangible Control
5.3.3.1.1. Motivation
At the beginning of this task focused design phase, there was a deliberate attempt to produce
designs that were both thought-provoking and playful. For this reason, Ableton's tempo was
chosen as an interesting starting point. In general, the concept of time is a highly abstract one,
so it was thought that bringing an inherently intangible aspect such as tempo under tangible
control would provoke discussion into other imaginative design ideas.
5.3.3.1.2. Description
The prototype was initially conceived through a series of ‘brute force attack’ sketches that
broadly tackled the subject of tangible control for Ableton. It was considered that a physical
wind-up metronome could take the place of Ableton’s built-in digital metronome.
5.17: Brainstorming sketch depicting tangible control over Ableton (the metronome is circled)
A magnet was attached to the swinging pendulum of a physical metronome, and a hall sensor
to its body. In this fashion, the swing rate of the pendulum could be measured by the frequency
at which the magnet passed the hall sensor. The hall sensor, through its Arduino Uno
connection, sends a value of ‘1’ each time the magnet passes and a ‘0’ otherwise. The control
was mapped to Ableton Live’s ‘tap’ metronome function via a crude Max MSP patch that
triggered the ‘M’ keyboard key (via aka.keyboard11), which was, in turn, mapped to trigger
Ableton’s metronome12. In this fashion, the tempo within Ableton was set in accordance with
the frequency of pendulum swings.
11 http://www.iamas.ac.jp/~aka/max/#aka_keyboard
12 This workaround was required as the Arduino Uno in use at the time was not capable of sending MIDI
messages.
5.18: Metronome: Hall sensor attached to the body and magnet attached to the pendulum
5.3.3.1.2.1. Evaluation
Volunteers were invited to interact and offer feedback. However, it was found that feedback
was minimal. Those familiar with design noted that the technology was neatly applied to
complete the task at hand creatively. Others could not understand why it had been designed in
the first place as, in light of the overall research question, this did not seem usable in
performance. What is more, it was loud and annoying13 and so it actually masked the music
emanating from Ableton itself.
5.3.3.1.2.2. Conclusion
Due to apparent confusion about its role, it became clear that this design could not be
adequately gauged via user feedback. After all, it was not designed with practical application
in mind, but merely as a first explorative foray into creatively solving a particular task relating
to tangible interaction with Ableton. It proved valuable, if for no other reason than that, in the
process of building it, several workarounds to trigger MIDI events within Ableton were
discovered that benefitted later designs.
13 The sound-producing element is key to this metronome’s functionality and cannot be removed
5.3.3.2. Parallel Prototype #3: Sample Selection by Weight
5.3.3.2.1. Motivation
As with the hacked metronome, this design served as an opportunity to accomplish a small
scale task using crude sensor technology. In this case, the selection of samples was the task at
hand. The design solution was achieved in reaction to the conceptual metaphors and physical-
to-abstract mapping outlined in the research of both Macaranas et al. (2012) and Hurtienne et
al. (2009), while the physical set up was directly inspired by Carvey et al. (2006).
A physical characteristic that is key to human perception of tangibility is weight, and weight
can also be used as a rudimentary control for computer-mediated tasks. The below sketch
envisages that a variety of weights can be placed upon a weighing scale, and the resulting
values can then be used as the control system to swap kick drum samples in and out of
Ableton’s Sampler instrument. The heavier the weight placed upon the scale, the
"weightier" the character of the resultant chosen sample. Like many of this thesis' proposed
controllers, this plays heavily on the interplay of metaphor between physical objects and
musical phenomena. It was considered that this system might form a feasible theatrical
component within live performance.
5.19: Sketch depicting use of weights to control Ableton
5.3.3.2.2. Description
With the low fidelity sketch (5.19) as a reference, a proof of concept of the system was built
using a load cell14, an HX71115 amplifier and an Arduino Uno (5.20).
5.20: Load cell, amplifier and Arduino
First, the load cell required calibration using a known weight. This calibration was done using
a 500ml bottle of water as an approximation of 500g (5.21).
5.21: 500ml bottle of water used to calibrate the load cell
14 https://learn.sparkfun.com/tutorials/getting-started-with-load-cells
15 https://learn.sparkfun.com/tutorials/load-cell-amplifier-hx711-breakout-hookup-guide
With the load cell now able to approximate weight, the next step was to choose two weights
and map these so that they triggered two classic kick drum samples within Ableton. For this,
two rolls of tape, measured at 100g and 500g, were used (5.22).
5.22: Two tape rolls used as weights
Knowing that 500g was the maximum value required for this test, the range 0-500 was scaled
within the Arduino code to the range 0-127: the MIDI continuous controller (cc) value scale,
which was much more usable within Ableton. This scaled output was mapped to
the sample selection parameter of Ableton’s Sampler instrument, where the kick samples as
mentioned above were preloaded. By adjusting the sample selection range within Ableton, the
100g weight was set up to trigger the "lighter" ‘tr707' kick sample and the 500g weight to
trigger the "heavier" ‘tr909' kick sample. To ensure that no sound would trigger when no weight
was applied, a gap in the range from 0-20 was left (5.23), so a sample would only trigger if the
applied weight went above 20 MIDI cc values.
5.23: Setting separate selection ranges for both kicks. The range from 0-20 (circled) is where no sample
would be triggered when there is no weight applied to the load cell
A MIDI clip with a looping 4/4 beat was played, which triggered the sampler. It was observed
that both weights triggered their associated kick samples when the correct weight was applied.
However, the load cell took some time to reach a steady state when changing weights, meaning
samples changed only after a noticeable delay. (A video of the setup can be found here:
https://vimeo.com/285528787)
5.24: 100g weight triggering the lighter kick drum and 500g triggering the heavier one
5.3.3.2.3. Evaluation
To facilitate rapid feedback, the prototype was demonstrated informally to three individuals
(5.25), two of whom had substantial experience with Ableton. They were asked to comment
on the author's intended use (sample selection) and to offer their personal interpretations of
alternative scenarios in which the system could be applied musically. All agreed that it was
novel to use graspable physical objects to change music parameters, but for the most part, users
seemed to have the impression that this system was a little too gimmicky. However, one user
(an Ableton user) saw value in the system in a performance context, both from a theatrical
point of view and because it could be used to control the intensity of effects such as
distortion, which have an apparent metaphorical connection to weight - from gentle to heavy.
5.25: User evaluates Ableton weight controller
5.3.3.2.4. Conclusion
Practically speaking, this prototype served as a basic proof of concept for the use of tangible
objects to change musical parameters in real time within Ableton. However, the evaluation did
not suggest any unique purpose for the system beyond the one proposed. It was considered that
perhaps the over-reliance on weight was a limiting factor, in that object choice was based on an
overly finite set of criteria. This prototype directed research towards exploring whether a
system that facilitates broader and more meaningful associations to objects might lend itself
better to TUI design for Ableton.
5.3.3.3. Parallel Prototype #4: Tangible Ableton Presets
5.3.3.3.1. Motivation
Where weight was perceived to be somewhat limiting in attaching meaning to objects,
contactless technologies such as Radio-frequency identification (RFID) and Near-field
communication (NFC) allow for much more specificity. These provide a straightforward
means of identifying objects based on short-range wireless technologies. A simple tag can be
affixed to an object, and its unique identifier code (UID) identified as it passes close to a reader.
This code can then be stored and assigned to perform specific computational tasks each time it
is read and re-read. The investigation into this area was primarily influenced by Skal (Arnall
and Martinussen 2010), a media player that plays media clips based on their association with
real-world objects.
5.26: Sketch exploring preset selection based on touch and size
5.3.3.3.2. Description
Sketching helped to conceptualise how objects might be associated with music parameters, not
just by obvious meaningful connections, but by other aspects such as texture and size. In the
left of the above sketch (5.26), a series of objects with contrasting textures are explored as a
means of control over e.g. presets for Ableton's distortion effect. The smooth object would
trigger a gentler distortion setting, while the rough or spiky ones would trigger much harsher
settings. In this way, a kind of synaesthetic connection might be made between the
feel of an object and the character of a sound.
One shortcoming of RFID/NFC technology with respect to music is that it involves simple
on/off controls, with no capacity for gradual change. A workaround for this is explored above
right (5.26), which proposes that the user choose from different sized objects of the same
category to trigger different magnitudes of a single effect. In this instance, an ever more
expansive reverberation preset is chosen based on the size of the object, from a cramped
bathroom to a vast hall.
5.27: PN532 NFC module
A PN53216 NFC module (5.27) was connected to an Arduino Uno, which was in turn connected
to a laptop via serial USB. There is no MIDI control over the preset swap function within
Ableton; therefore, once an NFC tag was read by the reader, a Max MSP patch was employed
to trigger specific keyboard commands (via the aka.keyboard library for Max MSP). A program
called Keyboard Maestro17, preconfigured to open specific presets within Ableton, then
detected these keyboard commands (5.28).
5.28: Depiction of information flow in overly convoluted setup
Two types of inbuilt Ableton reverb presets were associated with real-world objects based on
meaningful association: the ‘singing can' preset with a physical tin can and the ‘cathedral’
preset with a Lego representation of a church. An NFC tag affixed to the bottom of each meant
that when either object was placed on the reader, its corresponding preset was triggered (5.30).
(A video of the prototype can be found here: https://vimeo.com/285525081).
16 https://cdn-shop.adafruit.com/datasheets/pn532ds.pdf
17 https://www.keyboardmaestro.com/main/
5.29: Tangible objects are stored symbolically on a shelf to demonstrate a real-world alternative to digitally
storing presets in folders.
5.30: Placing a can atop the PN532 switches to the ‘singing can’ Ableton reverb preset
5.3.3.3.3. Evaluation
Two volunteers were invited to interact with the prototype briefly. Both had prior experience
with Ableton Live. Again, they were asked to comment on the current means of control and to
recommend alternative uses. It was immediately clear that this prototype left a much stronger
impression than anything else had done previously, as feedback was extremely positive.
From the outset, users noted that in comparison to the iPad setup, for example (which both had
already used), the functionality was much simpler to understand. Another positive, they said,
was an instant and apparent correlation between their actions and musical changes. The only
criticism was that the interaction was overly simplistic and inexpressive (relating to the
aforementioned on/off functionality). A supplementary control of some sort was broadly
recommended as a means of attaining the desired expressive element.
5.3.3.3.4. Conclusion
The functionality of this prototype was quite simplistic, yet it was observed to have had the most
profound impact to date. The concept that a simple sticker could be used to transform virtually
any object into a bespoke music control device seemed to resonate well with those who offered
feedback. Though this particular prototype applied the system in a highly convoluted fashion,
it served well as proof of concept that Ableton parameters could be changed via RF signal with
minimal delay. This immediacy seemed particularly promising for musical performance,
where minimising latency between action and effect is vital (Jack et al. 2016). Therefore, it was
decided that short-range wireless technologies would be fundamental to design from this point
forth.
5.3.3.4. Conclusion of Task-Specific Design Phase
The move towards task-specific exploratory prototypes proved much better suited to
tackling the problems at hand. When compared to expending effort on a complex system that
ultimately failed, there were clear advantages in leaning towards more basic technology that
was both rapidly built and rapidly evaluated. Furthermore, each of these smaller scale
prototypes required an entirely different approach to ‘hacking’ Ableton’s MIDI system, and
therefore these inspired more open ideas on how the program could be hacked overall.
5.4. Exploratory Design: Analysis & Conclusion
Each of the technological explorations outlined in this chapter is presented below (5.31) in the
form of an annotated portfolio (Gaver and Bowers 2012) as a means of ‘reflection on action’
(Schön 1984). These annotations analytically and critically convey the most prominent
characteristics of each prototype that were learned through the physical act of making. The
relevance of each annotation to each prototype is dependent on proximity, and several
prototypes may share a single annotation.
This depiction serves to communicate the individual significance of each prototype to the
overall design process and demonstrates that, while all but one of these technologies were
subsequently abandoned, none are regarded as failures. Instead, each has brought new learning
that has significantly contributed to this research.
5.31: Annotated Portfolio of exploratory prototyping phase
6. CHAPTER 6: ITERATIVE DESIGN
6.1. Introduction
The previous chapter concluded with the establishment of short-range wireless technology as a
viable means of control, as it permits virtually any object to be used as a tangible control
mechanism. With a fundamental technological groundwork established, this chapter focuses on
probing this technology further in order to establish how it might be modified to suit music
performance specific tasks within Ableton. These tasks were set out in accordance with three core
activities that are typically undertaken during a live performance with Ableton:
1. Changing sounds on the fly.
2. Changing clips on the fly.
3. Live mixing.
6.1: Mind map with three key Ableton live performance tasks circled
Whereas the previous chapter was concerned with the realisation of several radically different
designs, this phase focuses on prototyping that is more iterative in nature.
For this and subsequent prototypes, the PN532 NFC reader was replaced with the RC522 RFID
reader18 (6.2), owing to the latter's lower cost and more straightforward functionality.
Furthermore, the Arduino Uno was also replaced with the Arduino Pro Micro19 for two reasons:
(i) its smaller profile would be easier to install within an enclosure, and (ii) its microchip is
equipped with MIDI capabilities, eliminating the need for intermediary programming in Max
MSP.
6.2: Arduino Pro Micro and RC522 reader with assorted RFID tags (fobs and cards)
6.3: Depiction of how MIDI transmitted over serial negates the need for Max MSP
6.2. Task-Focused Design
This section details the designing of two modular TUI prototypes; one that is employed to change
sounds on the fly and the other clips. The section also details informal user evaluation that was
conducted on both prototypes which followed initial feedback via crowdsourcing.
18 https://playground.arduino.cc/Learning/MFRC522
19 https://www.sparkfun.com/products/12640
6.2.1. Iterative Prototype #1: Tangible Clip Selection
6.2.1.1. Motivation
The next step was to explore whether radio frequency technology could be used effectively to
trigger Ableton's clips. This task was an abstract one that proved much more difficult to
associate with objects meaningfully; therefore, the design space was explored in a highly playful
fashion. Due to their association with a sense of play, a pair of dice seemed quite fitting for this
particular task. What is more, dice are also symbols of chance and risk, and thus they may lend themselves
well to improvisation and spontaneity; two critical characteristics in the perception of liveness in
any music performance (Cooke 2011).
6.2.1.2. Description
6.2.1.2.1. Concept
Sketching was carried out while bodystorming with real playing dice (6.4). Quite quickly it was
understood that the numbers of the dice could dictate the duration of clips; for example, if side
three of a die were selected, then a clip would loop over after a duration of three 16th note steps.
If two dice were used to control the clip length of two separate tracks in this fashion, musically
interesting polyrhythms could be brought about.
6.4: Playing with dice to get a feel for their capabilities (left) and Sketching ideas (right)
6.2.1.2.2. Technical Implementation
A pair of sizeable graspable foam dice were used for this experiment, with three RFID tags affixed
to each one (6.5). The Arduino was programmed so that the RC522 would read these tags and
send unique MIDI values to Ableton depending on which tag was read. The code was uploaded
to two separate Arduinos so that both dice could work in unison (6.7).
6.5: RFID tags attached to individual sides of the dice
Within Ableton, each die was designated a separate track to control. Each track was preloaded
with a different percussive instrument sample. Three clips were placed on each track (6.6), the
length of the clips consisting of either three, four or five sixteenth note steps20.
Each die would trigger one of three clips on its corresponding track, depending on which side
faced upwards when the die was placed upon the reader.
6.6: Each die would trigger one of three clips on its assigned track, depending on which side was being read
Clips were triggered by the correct corresponding face of a die using Ableton's MIDI mapping21
function. Once ‘MIDI map mode' was enabled and the desired clip selected, all that was necessary
20 These numbers were sufficient to create unique rhythms between the two dice.
21 https://help.ableton.com/hc/en-us/articles/360000038859-Creating-MIDI-Mappings
to link both was to place the corresponding face of the die on the reader. This process was repeated
for all three tagged sides of each die.
6.7: Two RFID readers were required, one to read each die. Each reader and die pairing was given a separate
track to act upon within Ableton.
6.2.1.3. Tentative Evaluation
This prototype was a breakthrough, as it was the first that was in any way musically performable.
After personal experimentation with the system, it was found to be quite musically enjoyable to
use. In order to receive rapid feedback from a third party, a 1-minute video was filmed of the
author performing with the dice, which was then shared to an Irish Ableton focused user group on
Facebook (6.8). There it received extremely positive feedback via comments and direct messages,
so it was therefore deemed suitable for further user evaluation. (A video of the prototype can be
found here: https://vimeo.com/285523995).
6.8: Facebook video post #1 with some example feedback via comments
6.2.2. Iterative Prototype #2: Tangible Sound Selection
6.2.2.1. Introduction
The next task was to explore how sounds could be changed on the fly within Ableton via tangible
control. Sketching served as an ideal means of exploring how objects could be meaningfully
associated with digital samples. In the below scenario (6.9) the very obvious ‘moo’ sound
associated with a cow and ‘vroom’ sound associated with an automobile are given as a simplistic
example. These sounds are triggered when the object (with a pre-tagged RFID sticker attached) is
passed close to the RFID reader.
6.9: Sketch that explores assigning samples to associated objects
6.2.2.2. Description
6.2.2.2.1. Associating Sounds with Objects
A variety of household receptacles (a cup, a glass and a tin) were chosen as tangible sound objects.
Using a portable recorder, the sound of each receptacle being struck with a spoon was captured
(6.10).
6.10: Recording the striking of each object with a spoon using a portable recorder
To ensure no delay would occur when triggering these samples live, each sound file
was edited so that playback began right at the point of striking. The three edited samples were
then imported into Ableton’s sampler, where each was assigned a unique MIDI range that allowed
it to be individually triggered from an external MIDI device. In the Arduino code, three RFID
cards22 were each assigned to trigger a unique MIDI cc number (6.11) corresponding to each of
the three sample trigger ranges in Ableton.
6.11: Arduino code showing unique MIDI cc values being transmitted for each of the three RFID cards
These cc values were sent using the otherwise unused modulation wheel controller (MIDI cc 1),
which was then mapped to control sample selection in the sampler (6.12).
6.12: Mod wheel mapped to sample selection (left) and MIDI trigger range for each sample set (right)
Each RFID card was then affixed to the bottom of its associated object (6.13) so that when placed
on the reader it would trigger the corresponding recording of that object being struck.
22 RFID cards were used as RFID stickers had not been acquired at the time
6.13: Attaching RFID cards to base of objects (in the absence of stickers)
6.14: Swapping the objects in and out over the sensor triggers the different sounds
6.2.2.3. Tentative Evaluation
In isolation, the sample changing system was observed to be slightly uninspiring, so it was
combined with the previous dice clip changing system to make it more musically
performable. A brief percussive rhythmic performance was filmed, and the concept was again
shared on social media in order to obtain quick feedback (6.15). Once again the reaction was
extremely positive, so this was again taken as a sign that this prototype also warranted more
stringent user evaluation. (A video of the prototype can be found here:
https://vimeo.com/285523330).
6.15: Facebook video post #2 with feedback comments
6.2.3. User Evaluation of Polyrhythmic Dice and Tangible Sound Selection
Prototypes
6.2.3.1. Method and Directive
The polyrhythmic dice and sound selection prototypes were evaluated together in a single group
evaluation, during which users were asked to think aloud and discuss their impressions with one
another while interacting. To discover if these prototypes were broadly understandable to a variety
of users, candidates were chosen from a mixed musical background, only one of whom had any
considerable experience with Ableton.
As these prototypes had been developed quickly and had thus far only been evaluated by way of
Facebook posts, possible ways in which they could be used were not yet defined. For this reason,
Sengers and Gaver’s (2006) method of leaving the mode of use up to the interpretation of the user
was adhered to with the aim of provoking unexpected insights. Accordingly, users were
deliberately given minimal direction. They were told the function of each component within the
system, but they were not told how to perform with those components. All of this served to observe
what users would naturally do without being prompted, and thus to best judge the most natural
method of performing with the system.
6.16: Sample triggering objects (cup, glass & tin) circled red and polyrhythmic dice circled blue
6.17: User A interacting with the tangible sample selection controller
6.18: Users B, C and D interact as a group with the system paying little attention to laptop screen
6.2.3.2. Required Data
The evaluation was set out with the purpose of gathering the following information:
• Whether the prototypes were engaging to use.
• Whether they were visually engaging from the perspective of the casual observer.
• For those that had used MIDI controllers, whether these prototypes facilitated a more
expressive means of control.
• Suggestions for alternative uses.
To gather this information, feedback was obtained via three methods:
• Insights from inter-user discussions.
• Observation.
• Post-evaluation questioning.
6.2.3.3. Findings
Both systems were unanimously agreed to be engaging and fun, regardless of musical experience.
Those that had used MIDI controllers in the past noted that the physical movement of objects
encouraged more playfulness and exploration than traditional controls. Candidates generally
agreed that both systems created an intriguing spectacle to behold from an audience perspective.
The main benefits of this evaluation came in the form of several valid design recommendations.
These are outlined as follows:
• The more musical of the group noted that the prototypes, while engaging, lacked any
facility for musical expression.
• Users desired more variety in what the dice could do, such as stopping clips as opposed to
merely changing between them.
6.2.3.3.1. Personal Observations
Though these prototypes were not introduced as collaborative systems, users seemed to take turns
performing two at a time rather than one by one. Of particular interest, however, is that, though
Ableton was running on a visible screen beside the prototypes, users did not pay any attention to
the screen at all. On subsequent questioning, users stated that they were not interested in the screen
because the objects and interactions were so engaging. These insights were considered to be highly
promising in light of the research questions of this thesis.
6.3. Conclusion of Task Focused Design
In this phase, RFID technology was explored and expanded into two working modular prototypes.
Through user testing, both were deemed to hold considerable potential for tangible interaction.
However, tests also made it clear that both prototypes were much too unrefined to be truly
musically performable, an issue that the next phase of design was tasked with remedying.
6.4. Performance Focused Design
6.4.1. Introduction
Feedback from the previously mentioned group evaluation provided several strong
recommendations for a redesign. This current phase of design incorporates this feedback to realise
a system that is more musically performable. As such, it explores design from both a technical
and artistic standpoint, and therefore input from more experienced Ableton musicians was
required during subsequent evaluation.
This phase explores two ways in which a TUI can be used to perform with Ableton. These are
outlined as follows:
#1. A ‘live set’ approach
• The ‘live set’ is the typical approach to live performance with Ableton, where sounds,
patterns and audio levels are adjusted on the fly. This prototype serves to explore an all-in-one
TUI performance system, complete with a degree of expressive control.
#2. A DJ approach
• This secondary and somewhat more unusual design approaches Ableton as a DJ tool,
where tangible objects are used to trigger songs. This prototype primarily serves to explore
ways in which meaning can be attached to tangible objects.
6.4.2. Performance Prototype #1: Tangible Live Set
6.4.2.1. Motivation
This system expands upon the evaluation of the previous phase, where users noted that they would
like more overall control. Therefore, a facility for live mixing was added, which is a crucial
element of live electronic music performance. The overall goal of this prototype was to build a
dynamic system that brought many of the typical tasks in Ableton live performance under tangible
control, namely:
• Live mixing.
• Soloing and muting of individual tracks.
• The ability to change sounds on each track.
• Live triggering and changing of clips.
6.4.2.2. Description
6.4.2.2.1. Tangible Mixing
A facility for mixing the track levels was added in response to users calling for more expressive
control over the sound. An HC-SR04 distance sensor23 was added to the circuitry of each RFID
reader (6.19).
6.19: Circuit with HC-SR04 distance sensor now added
6.20: Calibrating the distance sensor using a ruler
These would measure the distance between the reader and a nearby solid object, and this
value would then be scaled and used as a MIDI cc input within Ableton. In this way, the proximity
of the reader to an object dictates how loud the volume of its corresponding track will be (6.21).
In true TUI fashion, this would allow the objects themselves to dictate the volume at which their
corresponding track plays24.
23 https://components101.com/ultrasonic-sensor-working-pinout-datasheet
24 However, the object and the reader must both be moved in unison for this to occur.
6.21: Distance dictates sound volume of each Ableton track
6.4.2.2.2. Multi-Sample Player Per Track
For this task, RFID tags were affixed to miniature graspable versions of musical instruments; a
kick drum, snare drum, shaker and a glockenspiel (6.22).
6.22: Tangible toy instruments with affixed RFID tags are used to trigger sounds
Three RFID readers were each given control over the sounds playing on three separate Ableton
tracks. The instrument that would play on each of the three tracks was dictated by whatever real-
world instrument was placed onto its corresponding reader (6.23). Any of the three readers could
read any tagged instrument, so each track could also play the sound of any of the four instruments.
6.23: Each reader given control over a track within Ableton.
6.4.2.2.3. Variation in Dice Control
In response to user feedback from the previous phase, the dice were assigned extra
functionality: not only could they change clips, but they could also solo or mute their
corresponding track (6.24).
6.24: Dice given extra functionality: solo ("S") and mute ("M")
Each of the three readers was coded so that it could recognise tags on both the dice and the toy
instruments (6.25). In this way, both the sounds and the patterns were changeable using only a
single reader per track, rendering the system more suited to testing in a performance-focused
scenario.
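One way such a shared reader can be implemented is as a simple lookup from tag UID to action, so that a single read handles dice faces and instrument tags alike. The UIDs, action names and note numbers below are hypothetical; they illustrate the dispatch pattern, not the actual tag data used in the prototype.

```cpp
#include <cstdint>
#include <map>
#include <string>

enum class Action { ChangeClip, Solo, Mute, ChangeSound, Unknown };

struct TagEntry {
    Action action;
    uint8_t value;  // clip index, or MIDI note selecting a sound
};

// Hypothetical tag table: three dice faces plus two toy instruments.
const std::map<std::string, TagEntry>& tagTable() {
    static const std::map<std::string, TagEntry> t = {
        {"04A1B2", {Action::ChangeClip,  1}},   // die face "1"
        {"04A1B3", {Action::Solo,        0}},   // die face "S"
        {"04A1B4", {Action::Mute,        0}},   // die face "M"
        {"09FF01", {Action::ChangeSound, 48}},  // kick drum tangible
        {"09FF02", {Action::ChangeSound, 49}},  // snare tangible
    };
    return t;
}

// Every UID a reader sees passes through one dispatch point.
TagEntry lookup(const std::string& uid) {
    auto it = tagTable().find(uid);
    return it != tagTable().end() ? it->second
                                  : TagEntry{Action::Unknown, 0};
}
```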
6.25: Each reader can now read both the tangible instruments and the dice
6.4.3. Performance Prototype #2: Tangible DJing
6.4.3.1. Motivation
This prototype was devised to provoke discussion on how to associate objects with sounds. A
system of ‘tangible DJing’, which assigned songs to personal objects on the basis of personal
connections, was intended to gather feedback on how others might associate meaning with digital
media.
6.4.3.2. Description
This prototype made use of a single RFID reader. In the same fashion that the dice changed clips,
personalised objects were instead used to change between whole songs. Ten or so graspable objects
were chosen to trigger these songs based on their association with those songs, e.g. a band or a song
name (6.26).
6.26 Tangible objects with RFID stickers attached used to trigger songs
Some associations were obvious; for example, a playing card would trigger ‘The Ace of Spades’
(Motörhead 1980), while others were made deliberately obscure, e.g. a driving glove would trigger
‘Night Call’ (Kavinsky 2010); an association that would only be obvious to those who had seen
the movie ‘Drive’ (Winding Refn 2011) in which the track features.
In order to explore how different variations of the same object could be used theatrically in
performance, three Russian matryoshka dolls were used to trigger three versions of ‘Babooshka’
(Bush 1980) at different pitches; the smaller the doll, the higher the pitch at which the song
played (6.27).
6.27: Matryoshka dolls (left). Tinfoil prevents the inner dolls’ tags from being read (right)
6.4.3.3. Tentative Evaluation
In a subsequent video posted to Facebook (6.28) in pursuit of quick feedback, this element was
observed to be the most entertaining. The medium and small dolls were hidden within the
larger one until the end of the video, at which point they were introduced one by one to comical
effect, triggering ever higher pitches of the song. (A video of the performance can be viewed
at the following link: https://vimeo.com/285521580)
6.28: Facebook video post #3
6.4.4. Evaluation of Performance Focused Prototypes
6.4.4.1. Candidates
As this phase was focused on how to apply the system in performance, expert commentators were
deemed to be suitable test candidates. Therefore, two users who had extensive knowledge of
Ableton and who had performed using the platform in the past were chosen. These tests were also
video recorded (with both ethical approval and the users’ consent) as a means of documenting
musical interactions for subsequent scrutiny.
6.4.4.2. Technical Setup
6.29: Technical setup for user evaluations
The technical setup consisted of the following:
• A laptop running Ableton with three tracks, each containing several clips.
• 2x RFID readers with distance sensors for live mixing.
• 1x RFID reader without a distance sensor (the same as the above but without the facility to mix).
• 3x tagged dice for changing (i) clips in the live set and (ii) polyrhythms in the polyrhythmic
dice scenario.
• A set of tangibles to change sounds in the live set component.
• A set of tangibles to change songs in the djing component.
• A sketchbook displaying sketches that clarified specific concepts to the user.
• A camera and audio recorder to document the sessions.
6.4.4.3. Directive
Three elements were outlined for scrutiny from the outset, each of which was tested in turn:
• Live set:
⁃ This performance scenario was the central area of focus, with questions relating to
recommendations for more suitable control and better association of sounds with
objects. Within this setup the dice served to change between musical clips on each
track.
• Polyrhythmic dice:
⁃ Candidates were asked to perform with a revised and improved version of the
polyrhythmic dice prototype from evaluation one.
• Tangible djing:
⁃ This prototype was used to provoke opinions on alternative ways of triggering
sounds with objects.
Again, users were encouraged to ‘think aloud’ as they proceeded to interact with the system.
However, every effort was made to ensure talking was done only between musical interactions,
so as not to disturb the ‘flow’ of interaction. Following the interactive component, users were
debriefed on their overall impressions of the various prototypes.
6.30: Screen captures of User #1 (left) and User #2 (right) performing
6.31: User #1 performing alongside a laptop running Ableton
6.4.4.4. Discussion of Results
6.4.4.4.1. Tangible Aspects: Embodiment
User 1 saw the embodied mode of interaction as extremely refreshing, saying that he was used
to continual mouse clicking during his performances. However, both users found the minimal
setup to be performatively restrictive, particularly with one RFID reader per track being used to
read both the dice and the instruments. User 2, in particular, wanted more tracks and sensors
spread right across the table, as he said this would further encourage a performer to explore the
performance space.
6.4.4.4.2. Expressiveness and Responsiveness
Both users found the RFID and distance sensors to be pleasingly responsive. The level of
expressive control afforded by the distance sensor was a particular delight. Regarding a visual
connection to sound, User 2 said that the performer's movements of objects correlated exceptionally
well with the resultant changes to the sound.
6.4.4.4.3. Changing Samples with Objects
User 2 considered the connection of toy instruments to sounds too obvious to be enjoyable. He
suggested that abstract shapes would be more visually intriguing to an audience, and would add
to the performer's enjoyment by introducing a degree of uncertainty and risk. User 1 was not as
critical, but did comment that the instruments seemed a little "silly", though in a playful way
that might suit the performance of certain musical genres more than others.
6.4.4.4.4. Dice
Both users appreciated the fact that the dice controlled soloing and muting; however, they
wondered why the tangible instrument objects themselves did not control this, i.e. if an object
were present at a reader, its sound would play; if not, no sound would be heard. Aside from this,
the dice were commended for elegantly facilitating six different functions using only a single
object.
User 2 found particular interest in the polyrhythmic application of the dice, opining that this
performance technique appealed more to the musically adept. Comparing their use for creating
polyrhythms with changing clips in the live set, he said that there is "less but it feels like
more". He considered the role of the dice in changing polyrhythms to be more understandable
to him than the way in which they were applied in the live set to change clips.
6.4.4.4.5. Reliance on the Screen
Both users highlighted that the system thoroughly reduced the need to reference the screen. User
1 saw this as particularly advantageous, as he said that he makes a concerted effort to avoid screen
gazing during his Ableton live performances.
6.4.4.4.6. Technical Shortcomings
Several technical shortcomings were observed that inhibited performance in one way or another:
• Both users considered the cables to be a hindrance to movement and suggested opting for
a wireless approach.
• User 1 saw the friction involved in sliding the RFID readers as a physical limitation on free
and expressive movement, and instead suggested building an enclosure with wheels.
• As a further level of control, user 2 recommended incorporating a gyroscope into the
reader circuit, so that rotational angle could also be factored in.
6.4.4.4.7. User Impressions of the Overall Concept
User 1 stated that “the inclusion of objects encouraged childlike playfulness, while the technology
behind it appeals to the adult.” Both users considered the system fun and playful through its
involvement of physically moving objects around in order to affect the music. User 2 opined that
this could be made into a marketable product if more variety and control were provided. Both users
said that, although the technology was crude, it was presented well.
6.4.5. Designing the Final System
6.4.5.1. Review of Feedback from User Testing
Feedback from the expert users formed the basis for the design of a much more refined final
prototype. The level of success in implementing this feedback is distilled in the table below
(Table 4):
• Minimal setup was too restrictive — Addressed? YES: Separate RFID readers were provided
for the instruments and the dice.
• More tracks desired — Addressed? YES: One more track was added (mounting costs prevented
any further additions).
• Tracks should mute when no objects are present — Addressed? YES: Arduino code was adjusted
to send a special MIDI message when no tag is present at a reader.
• Friction inhibits movement of readers — Addressed? YES: Smooth felt pads were affixed to
the bottom of the enclosures to allow smooth sliding.
• Enable wireless operation — Addressed? NO: Implemented via Bluetooth but shelved due to bugs.
• Implement gyroscope for more control — Addressed? NO: A faulty gyro unit meant that there
was no time to implement it.
• Swap obvious sound-triggering tangibles (toy instruments) for abstract ones — Addressed?
YES: At D.A.W.N. the DJ version was also demonstrated, so that a variety of sound objects would
be made available to users and their opinions of each method could be evaluated.
Table 4: Feedback from user testing considered
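The “mute when no object is present” adjustment noted in Table 4 can be sketched as follows. This is a hypothetical reconstruction: the reserved message value and the tag UIDs are assumptions, and the actual code runs inside the Arduino read loop rather than as a standalone function.

```cpp
#include <cstdint>
#include <optional>
#include <string>

// Reserved note meaning "no object on this reader: mute the track".
// The actual special MIDI message used in the project is an assumption here.
constexpr uint8_t NO_TAG_NOTE = 0;

// Decide which note to send for the current (possibly absent) tag.
uint8_t message_for(const std::optional<std::string>& uid) {
    if (!uid) return NO_TAG_NOTE;       // nothing on the reader
    if (*uid == "09FF01") return 48;    // kick tangible (illustrative UID)
    if (*uid == "09FF02") return 49;    // snare tangible
    return NO_TAG_NOTE;                 // unrecognised tag: stay muted
}
```

Ableton would then be configured so that the reserved message mutes the corresponding track, while any recognised note unmutes it and selects the matching sound.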
The primary design decision at this point was to add several more readers in order to separate the
tasks of changing sounds and changing clips more elegantly. Four readers were assigned to control
sounds and four to control clips25.
Only the readers that changed sounds would be required to move independently of one another,
so each of these was to be housed within a separate enclosure. The four dice readers, on the other
hand, were not involved in mixing and so could remain stationary (6.32).
6.32: Top down depiction of stationary dice readers and mobile instrument readers
6.4.5.2. Exploring a Wireless Solution
As outlined in the previous evaluation, the wires restricted the movement of the mobile readers.
In order to address this, wireless alternatives for the four mobile enclosures were explored. The
first was via Bluetooth, using the Adafruit Bluefruit Feather LE26 microcontroller (6.33). These
were deemed ideal as they have the same microchip as the Arduino Pro Micro, and therefore also
have the potential to send MIDI to Ableton without requiring an intermediary application. Two
Pro Micros were swapped for Bluefruit Feather LEs and tested. These worked well; however, it
was observed that the Bluetooth connection occasionally dropped when both units were in
operation at once. Without sufficient time to debug, this solution was deemed too unstable to
implement in the final prototype.
25 Four was deemed a sufficient number of tracks for musical results within Ableton without incurring too much
financial expense.
26 https://learn.adafruit.com/adafruit-feather-32u4-bluefruit-le/overview
6.33: Adafruit Bluefruit Feather LE microcontroller
Another attempt at a wireless setup was made using the Adafruit Feather Huzzah
ESP826627 WIFI microcontroller (6.34). However, in this case, the technology failed due to the
microcontroller's incompatibility with the library in use for the ping sensor. The ESP8266 has a
different timing chip from that found in the ATmega32U4-based Pro Micros, meaning that in order
to read the pings a delay needed to be introduced into the code. This workaround was not deemed
acceptable, as such a delay would also affect the responsiveness of the RC522 sensor, which was
handled within the same code.
6.34: Adafruit ESP8266 microcontroller powered independent of USB via external battery pack
27 https://www.adafruit.com/product/2821
6.4.5.3. Design using 3D Modelling
To accommodate tangibles of varying sizes, the enclosures would need to provide ample surface
area. The correct dimensions were explored initially using temporary enclosures constructed
from Tupperware (6.35). Experimentation with tangibles of different sizes indicated that a
surface area of roughly 120mm x 120mm would be ideal.
6.35: Using Tupperware to test ideal enclosure dimensions
Freehand sketches then explored the possible configuration of the system (6.36).
6.36: Sketches of system configuration
Sketches were then converted into more detailed 3D models (6.37).
6.37: 3D model of system
As depicted in 6.38, dice reader #1 would change the pattern of instrument reader #1, dice reader
#2 would change the pattern of instrument reader #2, and so forth.
6.38: Performer's view of the 3D model
6.4.5.4. Fabrication Of Final System
6.4.5.4.1. Constructing Mobile Enclosures
The enclosures were cut from 4mm thick MDF material (6.39). A height of 40mm was deemed
sufficient to ensure that the units were stable while providing adequate space for interior
electronics.
6.39: Enclosures were laser cut from MDF
6.4.5.4.2. Constructing Stationary Dice Enclosure
Continual tag reading errors resulted when more than one RFID reader was used with a single
Arduino. Therefore, even though all readers were housed within a single unit, four separate
Arduinos were still required (6.40).
6.40: Constructing the dice reader enclosure
6.4.5.4.3. Aesthetic Considerations
Since its launch, Ableton's overall aesthetic has maintained a clean, minimalistic look that features
liberal use of the colour grey. This theme is echoed in both its GUI and its promotional material
(6.41). As this research concerns the design of a control system that is intended exclusively for
use with Ableton, considerable effort was made to ensure that the design of the final prototype
was faithful to this aesthetic.
6.41: Ableton GUI and packaging with grey aesthetic (lelong.com n.d.)
Ableton’s logo is similarly minimalistic and clean. It was used as a reference when designing
GraspAbleton’s logo. As seen in 6.42, the hand icon and the font borrow heavily from it.
6.42: Ableton logo (left) (Sonarplusd.com n.d.) and GraspAbleton logo (right)
The surface of each reader was then embellished with this hand icon (6.43). This was not just for
aesthetic or branding purposes, but also to provide a signifier that would guide the user in placing
a tagged object on the correct spot.
6.43: Stencil logo design was scrapped in favour of a much cleaner adhesive label.
6.44: Audience perspective of final system (minus tangibles)
6.4.5.5. Finalised System for Display at D.A.W.N. Exhibition
The final system was subjected to further usability scrutiny when displayed at the D.A.W.N.
2018 exhibition. As the system would now be subjected to prolonged use, every effort was made
to first ensure that it was made robust enough to cope with sustained abuse. Fixtures that would
hold the sensors more securely in place were 3D printed (6.45) and all electronic components
were soldered permanently (6.46).
6.45: 3D printed fixtures
6.46: Fully wired and soldered system
The system was set up to function within a 2-foot x 2-foot footprint; an area ergonomically suited
to comfortably manoeuvring the objects without strain. Longer and more flexible USB cables
were used, facilitating more liberal user movement in the absence of a wireless setup.
6.47: Final system (complete with tangibles) as displayed during D.A.W.N.
In previous versions, users were restricted to atonal percussive instruments and patterns;
however, for D.A.W.N. one track was instead designated to play melodic patterns with a choice
of three synthesizer sounds. This addition allowed more musically rich compositions to be
performed. Coloured dice were also used in place of numbered dice (as recommended by several
users in previous evaluations). Each coloured side would trigger a corresponding preprogrammed
pattern of a matching colour within Ableton (6.48).
6.48: Coloured dice (left) corresponding to coloured Ableton clips (right)
Users were given a choice to perform either a ‘live set’ (described in section 6.4.2) with tangible
instruments or a DJ set with the DJing tangibles (described in section 6.4.3).
6.4.5.5.1. Final Evaluation
Previously, the evaluation of prototypes had primarily been carried out from the perspective of the
performer. The D.A.W.N. exhibition served as an opportunity to probe the effectiveness of the
design from a spectator’s point of view, as well as from that of the performer. GraspAbleton was
set up in a soundproof room with ample floor space to accommodate an audience of 20 or so
people. Exhibition attendees were invited either to perform with the system or to observe others
doing so, after which they were asked to discuss their overall impressions. Questioning was kept
highly informal so as to encourage open and thought-provoking responses. The following
section outlines the findings from these questions and observations, paying particular attention
to aspects that may need to be redesigned in future.
6.4.5.5.1.1. Audience Spectacle Related Findings
All respondents noted that they found the experience of watching others perform with
GraspAbleton to be both intriguing and visually stimulating. Respondents also deemed the
performer's actions to correlate well with the ensuing musical events. This satisfied one of the
key design goals: making the performance more legible than it would be in a traditional
laptop-based Ableton performance.
Though audience feedback was overwhelmingly positive, several shortcomings also came to light
during D.A.W.N. that require further examination. One user pointed out that, because the system
was so visually engaging for the performer as well as the audience, the performer may become
overly immersed in the array of colourful objects and therefore less concerned with audience
interaction. On reflection, this may indeed be true for the first-time user; however, it is possible
that once users become familiar with the look and feel of the system they may be able to engage
more directly with the audience (much as a competent guitarist no longer needs to visually
reference the guitar strings when performing).
Another issue regarding visibility came to light while the author himself was attempting to
video record a performance from the rear of a group of 15 spectators. It was noted that, from this
position, the table top was heavily obscured from the view of everyone except those towards the
very front (6.49). This made it clear that the performance may be visible to a much smaller
audience than anticipated.
6.49: Even with relatively small audiences, visibility of system is affected for those at rear
6.4.5.5.1.2. Performance Related Findings
Subjecting the system to a broad demographic of users at D.A.W.N. also revealed new and
previously undiscovered performance techniques. For example, when performing with the DJ
system, one user began trying to emulate how a hip-hop DJ would scratch a record on a turntable
(6.50). She did this by rapidly sliding a tangible back and forth over the sensor. This technique
re-triggered the beginning of the chosen song each time it passed over the sensor, producing an
unforeseen stuttering effect that was musically pleasing.
Another unforeseen performance element was observed through users breaking the rules,
deliberately or otherwise. Despite knowing that certain tracks were intended to play only
percussive instruments and others only melodic ones, several users chose to place instruments on
incompatible tracks. Though the system was not programmed to function in such a way, the
resultant musical sequences often turned out to be quite interesting (for example, a kick drum
playing a melodic arpeggiated sequence). This indicates that the system may also be well suited
to improvisation and discovery, not just pre-prepared sequences and sounds.
Despite previous user recommendations, the coloured dice were not appreciated or understood by
all users, with some suggesting that numbers would be more appropriate for those not familiar
with Ableton.
6.50: One user discovers a new DJing technique
6.4.5.5.1.3. Target Demographic Reconsidered
At D.A.W.N. it was found that GraspAbleton was particularly appealing to children. It was noted
that, while very young users did not always understand exactly what they were doing, they seemed
to derive great joy from interacting with the toy-like objects and colourful dice. Several adult users
also noted that their own young children would enjoy using it. This certainly warrants further
exploration, as adults, not children, were the initial target user group.
7. CHAPTER 7: CONCLUSION & FUTURE WORK
7.1. Discussion
This concluding chapter discusses the overall findings of this research by referring to the
theories outlined in chapter three. The potential extension of this present work is then discussed
and finally, the overall implications of this research are outlined in the context of MIDI
controller design for DAWs.
7.1.1. Outline of Project
The primary research question at the outset of this thesis was whether an alternative performance
control system for DAWs could be more bodily engaging than traditional controllers. A
secondary question considered whether this proposed system could also benefit the audience
spectacle, whereby the performer's actions and the interface itself could be made more visible.
Another concern was eliminating the need to refer to the laptop screen during a performance.
Based on a review of relevant literature, TUIs were shown to offer considerable potential as a
platform on which to base these explorations. However, research also demonstrated a lack of
knowledge pertaining to their use with DAWs, prompting an unorthodox and imaginative design
methodology that was conducted in two main stages:
1. A technology-focused brute force attack on the issue, grounded in empirical and
theoretical research.
2. A focused iterative design phase whereby the most promising concept from the attack
mentioned above was further explored, evaluated and refined.
7.1.2. Discussion of Findings
During this research, qualitative data from user evaluations was gathered in a continuous process
utilising a variety of methods, which provided a robust means of analysing all fidelities of
design. This data indicates, firstly, that the final artefact ‘GraspAbleton’ satisfies the primary
concern of this thesis, in that it involves a high degree of bodily engagement, and therefore
confounds the paradigm of micro-movement based controllers outlined by scholars such as
Djajadiningrat et al. (2007). Seasoned Ableton users found the physical act of moving objects
around to be not just a novelty, but a refreshing alternative to standard means of input. The data
also indicates that this movement could benefit audience enjoyment of a performance, as users
viewing from an audience perspective gave favourable feedback that the system was an engaging
spectacle to behold.
Observations and user accounts also attest that the system profoundly reduced the need to
reference a computer screen, which could facilitate banishing the laptop from Ableton
performances entirely. This may be advantageous in tackling issues such as performer distraction
(White 2013) and audience ill will towards laptop performance (Cascone 2014; Pedersen,
Hornbaek 2007). The laptop-less setup also potentially enhances the audience spectacle, as the
performer is visibly more engaged with the performance than with a screen. Lastly, the array
of curious objects that form the control mechanism for the system was deemed to lend a further
layer of intrigue to the spectacle from the perspective of an onlooker.
7.2. Contribution
This thesis has contributed to research on the potential application of TUIs in music
performance in a number of ways. Most importantly, it has demonstrated that there is
considerable value in marrying TUIs with DAWs. MIDI controllers remain stuck in a paradigm
of sound engineering-based controls, which are highly unsuited to music performance (Butler
2014). This thesis has demonstrated that TUIs can go some way towards transcending these
restrictions by merging the encouragement of bodily movement with an invitation to freely
explore the performance space. This alternative approach to control over DAWs serves as fertile
ground for other designers to begin reevaluating deficiencies in how we interact with these
systems.
Secondly, the design methodology of this research has shown that there are many advantages
in adhering to a 'sketching in hardware' approach to TUI design. While much of this exploration
was guided by literature and by reference to notable TUIs of the past, this thesis has also
demonstrated that much value was derived from attacking various possibilities through
technological exploration. Though several of the resultant prototypes were of little use in a
music performance context (e.g. 5.3.3.1, 5.3.3.2), each foray,
no matter how obscure, unlocked new knowledge that in some way benefitted subsequent
designs.
Admittedly, the research from the audience perspective needs further attention; however, the
research conducted so far indicates that the alternative spectacle that a TUI-based controller
brings to Ableton performance has considerable potential. It lifts the veil on computer music
performance: the performer is no longer seemingly conducting secretive manoeuvres in the
darkness but is instead sharing the performance overtly with the audience. I believe that this
can go some way towards redressing the noted lack of trust and reputability in computer-mediated
music performances.
7.3. Future Work
This section discusses certain variations and extensions of the core research topic that might
bear fruit for alternative applications but which, for one reason or another, extend beyond the
scope of this paper.
7.3.1. The Attachment of Meaning
Chapter 3 (3.3.1.3) demonstrated that tangible tokens in a TUI system could be constructed
from virtually any physical object that one desires. While this thesis has explored that concept
using RFID tags to construct tokens from everyday objects, it has only scratched the surface of
the potential possibilities. GraspAbleton currently employs highly obvious toy instruments as
the primary means of switching between sounds. However, as outlined in the evaluation stage of
chapter 6 (6.4.4.4), one user opined that this obvious system of association did not appeal to
him, and he instead suggested that abstract shapes would be more engaging to both the
performer and the audience. Unfortunately, efforts since then were focused on finalising the
build of GraspAbleton, so there was no time to explore this line of inquiry further.
The tangible DJing prototype in chapter 6 (6.4.3) was intended to provoke insightful discussion
on the matter. However, even though it was demonstrated both on social media and to test
candidates, it did not lead to a conclusive outcome. D.A.W.N. was utilised as a platform to
assess impressions of meaningful associations from a wider variety of perspectives, but
unfortunately it did not provide enough conclusive data.
7.3.2. Group Collaboration
As outlined in chapter 6 (6.2.3), it was observed during testing that several users began
interacting with the prototypes as a group without being prompted to do so. This also occurred
several times during the D.A.W.N. exhibition (7.1) indicating that the system may have
considerable potential for a collaborative group music performance. Studies by Hornecker
(2002) and Xambo et al. (2013) have found that TUIs make user movements understandable
and allow for face to face communication, thus making them highly suitable for group
activities. TUIs for music performance can “invite musical collaboration, as their various
components readily tempt multiple participants to pick them up and join in, perhaps a
harbinger of the jam session of the future” (Paradiso and O’Modhrain 2003). Unfortunately,
this area extends beyond the scope of the current research, though the aforementioned
observations mean that it certainly warrants further probing.
7.1: Some of the unprompted collaborations that took place during D.A.W.N.
8. BIBLIOGRAPHY
Ableton (2012). ‘Ableton Artist Quotes’, Ableton.com, available:
https://www.ableton.com/en/pages/artists/artist_quotes/ [Accessed 11 July 2018].
Ableton (2018) Ableton Push, available: https://www.ableton.com/en/push/ [accessed 15 Aug
2018].
Ableton (2018) Ableton Push [image], available: https://www.ableton.com/en/push/
Alonso, M.B., Keyson, D.V. (2005) ‘MusicCube : Making Digital Music Tangible’,
Interfaces, 1176–1179, available:
http://portal.acm.org/citation.cfm?doid=1056808.1056870.
Arnall, T., Martinussen, E.S. (2010) ‘Depth of Field Discursive design research through
film’, FORMakademisk, 3(1), 100–122, doi: https://doi.org/10.7577/formakademisk.189
Als, B.S., Jensen, J., and Skov, M. (2005) ‘Comparison of think-aloud and constructive
interaction in usability testing with children’, Proceedings of the 2005 Conference on
Interaction Design and Children (IDC '05), 9-16, available:
doi: 10.1145/1109540.1109542
Bauer, T. (2010) ‘Prototyping in physical computing - Sketching in Hardware’, Prototyping -
An overview of current trends, developments, and research in Prototyping, 96-102,
available:
https://pdfs.semanticscholar.org/f0c7/f64482c6381817bc8510556d54df08340f09.pdf
[Accessed 27 Jul. 2018].
Bennett, P. D. (2010) The Representation and Control of Time in Tangible User Interfaces -
Designing Musical Instruments for the Manipulation of Temporal Media, unpublished
thesis (Ph.D.), Queen's University Belfast.
Berndt, A., Waloschek, S., Hadjakos, A. (2016) ‘Hand Gestures in Music Production’,
Proceedings of the International Computer Music Conference (ICMC), 449–454.
Bias Toward Action (n.d.) d-school Stanford, available: https://dschool-
old.stanford.edu/groups/k12/wiki/548fb/Bias_Toward_Action.html [accessed 29 Jul
2018].
Bush, K. (1980) Babooshka [LP], Netherlands: EMI.
Butler, M.J. (2014) Playing with Something That Runs: Technology, Improvisation, and
Composition in DJ and Laptop Performance, Oxford: Oxford University Press.
Carvey, A., Gouldstone, J., Vedurumudi, P., Whiton, A., and Ishii, H. (2006) ‘Rubber shark
as user interface’, CHI’06 extended abstracts on Human factors in computing systems,
634–639.
Cascone, K. (2002) ‘Laptop Music – counterfeiting aura in the age of infinite reproduction’,
Parachute Contemporary Art, Issue 107, 52–59.
Collins, N., McLean, A., Rohrhuber, J., Ward, A. (2003) ‘Live coding in laptop
performance’, Organised Sound 8, 321–330, available:
doi:10.1017/S135577180300030X
Convivial Studio (2017) Object Interaction With Touchscreens, Instructables.com, available:
https://www.instructables.com/id/Object-Interaction-With-Touchscreens/ [Accessed 1
Apr. 2018].
Cooke, G. (2011) ‘Liveness and the machine: improvisation in live audio-visual
performance’, Screen Sound, 2, 9–26.
Creative Applications Network (2009) Skål [image], available:
http://www.creativeapplications.net/objects/skal-objects/
Dalton, N., MacKay, G., Holland, S. (2012) ‘Kolab’, Proceedings of the Designing
Interactive Systems Conference on - DIS ’12, (May 2014), 21, available:
http://dl.acm.org/citation.cfm?doid=2317956.2317960.
Davidson, J.W. (2011) ‘Movement and collaboration in musical performance’, in Oxford
Handbook of Music Psychology, Oxford: Oxford University Press.
Designboom (2010) IDEO c60 Music Platform [image], available:
https://www.designboom.com/technology/ideo-c60-music-platform/
Djajadiningrat, T., Matthews, B. and Stienstra, M. (2007) ‘Easy doesn’t do it: Skill and
expression in tangible aesthetics’, Personal and Ubiquitous Computing, 11(8), 657–676,
available: doi: 10.1007/s00779-006-0137-9.
Evans, J.R., Mathur, A. (2005) ‘The value of online surveys’, Internet Research, 15(2), 195–
219.
Fatbraintoys (2018) Music Blocks [image], available:
https://www.fatbraintoys.com/toy_companies/small_world_toys/neurosmith_music_blo
cks.cfm
Fiebrink, R., Wang, G., Cook, P. (2007) ‘Don’t forget the laptop: using native input
capabilities for expressive musical control’, Conference on New Interfaces for Musical
Expression (NIME-07), 4, available: http://dl.acm.org/citation.cfm?id=1279771.
Fitzmaurice, G.W., Ishii, H., Buxton, W. a. S. (1995) ‘Bricks: laying the foundations for
graspable user interfaces’, SIGCHI Conference on Human Factors in Computing
Systems, 442–449, available: doi:10.1145/223904.223964
Gaver, B. and Bowers, J. (2012) ‘Annotated portfolios’, Interactions, 19(4), 40–49,
available: doi:10.1145/2212877.2212889.
Gelineck, S., Serafin, S. (2009) ‘A Quantitative Evaluation of the Differences between Knobs
and Sliders’, Proceedings of the International Conference on New Interfaces for Musical
Expression, 13–18, available:
http://www.nime.org/proceedings/2009/nime2009_013.pdf.
Hartmann, B. (2011) ‘C60 – Evolution of an Idea’, IDEO Labs, available:
https://labs.ideo.com/2011/01/14/c60-evolution-of-an-idea/ [accessed 19 Jun 2018].
Hayes, L. (2012) ‘Performing Articulation and Expression through a Haptic Interface’,
Proceedings of the International Computer Music Conference, 400–403.
Holmquist, L.E., Redström, J., Ljungstrand, P. (1999) ‘Token-Based Access to Digital
Information’, HUC 1999, 234–245, doi:10.1007/3-540-48157-5_22
Holmquist, L.E. (2006) ‘Sketching in hardware’, Interactions, 13(1), 47–60, available:
doi:10.1145/1109069.1109101.
Hornecker, E. (2002) ‘Understanding the benefits of graspable interfaces for cooperative
use’, Proc. of 5th International Conference on the Design of Cooperative Systems, 71–
87
Hornecker, E. (2008) ‘“I don’t understand it either, but it is cool” - Visitor interactions with a
multi-touch table in a museum’, 2008 IEEE International Workshop on Horizontal
Interactive Human Computer System, TABLETOP 2008, 113–120.
Hunt, A., Wanderley, M.M. and Paradis, M. (2002) ‘The Importance of Parameter Mapping
in Electronic Instrument Design’, Proceedings of the 2002 Conference on New
Interfaces for Musical Expression, 149–154, available:
http://www.nime.org/2004/NIME02/hunt.pdf.
Hurtienne, J., Stößel, C., Weber, K. (2009) ‘Sad is heavy and happy is light: population
stereotypes of tangible object attributes’, Tangible and embedded interaction, available:
http://portal.acm.org/citation.cfm?id=1517664.1517686.
Interaction Design Foundation (2018) How to Conduct User Observations, available:
https://www.interaction-design.org/literature/article/how-to-conduct-user-observations
[accessed 29 Jul 2018].
Ishii, H., Mazalek, A., Lee, J. (2001) ‘Bottles as a minimal interface to access digital
information’, CHI ’01 extended abstracts on Human factors in computing systems, 187–
188.
Ishii, H. and Ullmer, B. (1997) ‘Tangible bits: towards seamless interfaces between people,
bits, and atoms’, Proceedings of the 8th international conference on Intelligent user
interfaces, March, 3–3, available: http://doi.acm.org/10.1145/604045.604048.
Jack, R.H., Stockman, T., McPherson, A. (2016) ‘Effect of latency on performer interaction
and subjective quality assessment of a digital musical instrument’, Proceedings of the
Audio Mostly 2016 on - AM ’16, (October 2017), 116–123, available:
http://dl.acm.org/citation.cfm?doid=2986416.2986428.
Jordà, S., Kaltenbrunner, M., Geiger, G., Alonso, M. (2006) ‘The reacTable’, ACM
SIGGRAPH 2006 Sketches on - SIGGRAPH ’06, 91, available:
doi:10.1145/1179849.1179963
Jordà, S. (2008) ‘On stage: the reacTable and other musical tangibles go real’, International
Journal of Arts and Technology (IJART), 1(3/4), 268–287.
Jordan, B. (1996) ‘Ethnographic workplace studies and CSCW’, Human Factors in
Information Technology, 12, 17–42.
Kavinsky (2010) Nightcall [CD], France: Record Makers.
Kelley, K., Clark, B., Brown, V., Sitzia, J. (2003) ‘Good practice in the conduct and reporting
of survey research’, International Journal for Quality in Health Care, 15(3), 261–266.
Kitchenham, B.A., Pfleeger, S.L. (2002) ‘Principles of survey research’, ACM SIGSOFT
Software Engineering Notes, 27(2), 20, available:
http://portal.acm.org/citation.cfm?doid=511152.511155.
Le Goc, M., Kim, L.H., Parsaei, A., Fekete, J.-D., Dragicevic, P., Follmer, S. (2016)
‘Zooids’, Proceedings of the 29th Annual Symposium on User Interface Software and
Technology - UIST ’16.
Lelong.com (n.d.) Ableton Live [image], available: https://www.lelong.com.my/ableton-live-
9-suite-upgrade-live-intro-musicbliss-F592753-2007-01-Sale-I.htm [accessed 7 Aug
2018].
Levin, G. (2006) ‘The Table is The Score: An Augmented-Reality Interface for Real-Time,
Tangible, Spectrographic Performance’, Proceedings of the International Computer
Music Conference (ICMC 2006), 151–154.
Macaranas, A., Antle, A.N., Riecke, B.E. (2012) ‘Bridging the Gap: Attribute and Spatial
Metaphors for Tangible Interface Design’, Proceedings of the Sixth International
Conference on Tangible, Embedded and Embodied Interaction - TEI ’12, 1(212), 161–
168.
Mackay, W.E. (2002) Using video to support interaction design, INRIA and ACM/SIGCHI,
available: https://www.lri.fr/~mackay/VideoForDesign/print/print.pdf
Marshall, P., Price, S., Rogers, Y. (2003) ‘Conceptualising tangibles to support learning’,
IDC ’03: Proceedings of the 2003 conference on Interaction design and children, 101–
109, available: doi:10.1145/953536.953551
Marshall, P., Rogers, Y. and Hornecker, E. (2007) ‘Are tangible interfaces really any better
than other kinds of interfaces?’, CHI’07 workshop on Tangible User Interfaces in
Context & Theory, 28
Michaels, S. (2013) ‘Disclosure admit miming at Wembley – but say they didn't want to’,
The Guardian, 11 Jun, available:
https://www.theguardian.com/music/2013/jun/11/disclosure-admit-miming-wembley-
summertime-ball [accessed 14 Jul 2018].
Mok, K. (2016) ‘Sound Geeks Are Resurrecting the Analog Synthesizer’, The New Stack,
available: https://thenewstack.io/sound-geeks-resurrecting-analog-synthesizer/
[accessed 11 Jul 2018].
Motörhead (1980) Ace of Spades [LP], UK: Bronze.
Mugellini, E., Rubegni, E., Gerardi, S., Khaled, O.A. (2007) ‘Using personal objects as
tangible interfaces for memory recollection and sharing’, Proceedings of the 1st
international conference on Tangible and embedded interaction - TEI ’07, (January),
231, available: http://portal.acm.org/citation.cfm?doid=1226969.1227016.
Newton-Dunn, H., Nakano, H. and Gibson, J. (2003) ‘Block Jam: A Tangible Interface for
Interactive Music’, Proceedings of the International Conference on New Interfaces for
Musical Expression, 170–177, available: doi: 10.1076/jnmr.32.4.383.18852.
Nielsen, J. (2011) Parallel & Iterative Design + Competitive Testing = High Usability,
Nielsen Norman Group, available: https://www.nngroup.com/articles/parallel-and-
iterative-design/ [accessed 26 Jul 2018].
Norman, D. (1988) The Design of Everyday Things, New York: Basic Books.
Novation (2018) Novation Launchpad [image], available:
https://global.novationmusic.com/launch/launchpad# [accessed 20 Jun 2018].
Novick, G. (2008) ‘Is there a bias against telephone interviews in qualitative research?’,
Research in Nursing and Health, 31(4), 391-398, doi: 10.1002/nur.20259
O’Modhrain, S. (2011) ‘A Framework for the Evaluation of Digital Musical Instruments’,
Computer Music Journal, 35, 28–42, available: doi:10.1162/COMJ_a_00038
Ostertag, B. (2002) ‘Human bodies, computer music’, Leonardo Music Journal 12, 11–14,
available: doi:10.1162/096112102762295070
Paradiso, J. (1999) ‘The Brain Opera Technology: New instruments and gestural sensors for
musical interaction and performance’, Journal of New Music Research, 28(2), 130–149.
Paradiso, J., Hsiao, K., Benbasat, A. (2000) ‘Musical Trinkets: New Pieces to Play’,
SIGGRAPH 2000 Conference Abstracts and Applications, July 2000.
Paradiso, J.A., O’Modhrain, S. (2003) ‘Current Trends in Electronic Music Interfaces’,
Journal of New Music Research, 32(4), 345–349, available:
http://www.tandfonline.com/doi/abs/10.1076/jnmr.32.4.345.18855.
Patten, J., Ishii, H. (2007) ‘Mechanical constraints as computational constraints in tabletop
tangible interfaces’, Proceedings of the SIGCHI conference on Human factors in
computing systems - CHI ’07, (May 2014), 809, available:
http://portal.acm.org/citation.cfm?doid=1240624.1240746.
Pedersen, E.W., Hornbæk, K. (2009) ‘mixiTUI’, Proceedings of the 3rd International
Conference on Tangible and Embedded Interaction - TEI ’09, (April), 223, available:
http://dl.acm.org/citation.cfm?id=1517664.1517713.
Poynor, R. (1995) ‘The Hand That Rocks the Cradle’, I.D., May/June 1995, 60–65.
Reactable (2018) Reactable [image], available: http://reactable.com/ [accessed 8 Jul 2016].
Schön, D.A. (1984) The Reflective Practitioner, New York: Basic Books.
Sengers, P. and Gaver, B. (2006) ‘Staying open to interpretation: engaging multiple meanings
in design and evaluation’, Proceedings of the 6th Conference on Designing Interactive
Systems, 99–108, available: http://dl.acm.org/citation.cfm?id=1142422.
Shaer, O. (2009) ‘Tangible User Interfaces: Past, Present, and Future Directions’,
Foundations and Trends in Human–Computer Interaction, 3(1–2), 1–137, available:
http://www.nowpublishers.com/article/Details/HCI-026.
Slater, M.R. (2016) ‘The Untold Story of Ableton Live—the Program That Transformed
Electronic Music Performance Forever’, Thump, available:
https://thump.vice.com/en_us/article/78je3z/ableton-live-history-interview-founders-
berhard-behles-robert-henke [accessed 12 Jul 2018].
Sonarplusd.com (n.d.) Ableton Logo [image], available:
https://sonarplusd.com/en/programs/barcelona-2018/organizations/ableton [accessed 7
Aug 2018].
Sosoka, J., Abercrombie, B., Emerson, B. and Gerstein, A. (2002) ‘Educational Music
Instrument for Children’, U.S. Patent 6,353,168, 5 Mar 2002.
Stowell, D., Plumbley, M.D. and Bryan-Kinns, N. (2008) ‘Discourse analysis evaluation
method for expressive musical interfaces’, Proceedings of the International Conference
on New Interfaces for Musical Expression (NIME 2008), 81–86.
Stowell, D., Robertson, A., Bryan-Kinns, N., and Plumbley, M. D. (2009) ‘Evaluation of live
human-computer music-making: quantitative and qualitative approaches’, International
Journal of Human-Computer Studies, 67(11), 960–975.
Tregoning, J. (2012) ‘Fake DJs: A Brief History’, Inthemix, available:
http://inthemix.junkee.com/fake-djs-a-brief-history/16543 [accessed 14 Jul 2018].
Tsay, C.-J. (2013) ‘Sight over sound in the judgment of music performance’, Proceedings of
the National Academy of Sciences, 110(36), 14580–14585, available:
http://www.pnas.org/lookup/doi/10.1073/pnas.1221454110.
Ullmer, B., Ishii, H. (2000) ‘Emerging frameworks for tangible user interfaces’, IBM
Systems Journal, 39(3–4), 915–931, available:
http://ieeexplore.ieee.org/document/5387042/.
Ullmer, B., Ishii, H., Jacob, R.J.K. (2005) ‘Token+constraint systems for tangible interaction
with digital information’, ACM Transactions on Computer-Human Interaction, 12(1),
81–118, available: http://portal.acm.org/citation.cfm?doid=1057237.1057242.
Underkoffler, J., Ishii, H. (1999) ‘Urp: A luminous-tangible workbench for urban planning
and design’, Proceedings of the SIGCHI Conference on Human Factors in Computing
Systems: The CHI Is the Limit, 386–393, available:
http://doi.acm.org/10.1145/302979.303114.
van den Haak, M.J., de Jong, M.D.T. and Schellens, P.J. (2004) ‘Employing think-aloud
protocols and constructive interaction to test the usability of online library catalogues: a
methodological comparison’, Interacting with Computers, 16(6), 1–18, available:
doi:10.1016/j.intcom.2004.07.007.
Villierezz, P. (n.d.) ‘Panther Panther! Live - Research & development process of a live setup’,
Panther Panther Audio-Visual Artist, available:
https://www.pantherpantherav.com/process [Accessed 10 July 2018].
Whitcomb, M. and Daly, B. (n.d.) ‘The DAW and the End of Time’, Harmonic Distortion,
available: https://dnamusiclabs.com/harmonic-distortion/daw-and-end-time [accessed
19 Jul 2018].
White, D. (2013) ‘How To Avoid Serato Face: Solving DJ Screen-Gazing’, DJ TechTools,
available: https://djtechtools.com/2013/02/04/how-to-avoid-serato-face-solving-dj-
screen-gazing/ [accessed 14 Jul 2018].
Winding Refn, N. (2011) Drive [DVD], USA: FilmDistrict.
Xambó, A., Hornecker, E., Marshall, P., Jordà, S., Dobbyn, C., Laney, R. (2013) ‘Let’s jam
the Reactable: Peer learning during musical improvisation with a tabletop tangible
interface’, ACM Transactions on Computer-Human Interaction, 20(6), 34.
Yamaha (2018) ‘The Merits of Digital Sound’, Yamaha Pro Audio, available:
http://www.yamahaproaudio.com/global/en/training_support/better_sound/part1/06/
[accessed 12 Jul 2018].
9. APPENDICES
9.1. Appendix A: Survey
9.1.1. Questions and Responses Outlined
Question 1:
I much prefer using a mouse and keyboard than a MIDI controller when performing with Live.
Response 1:
Question 2:
My set is very unplanned and I am constantly improvising during a live performance with Ableton
Live.
Response 2:
Question 3:
It is essential that I constantly make eye contact with my audience when performing with Ableton
Live.
Response 3:
Question 4:
A laptop on stage ruins the connection between the audience and I during a performance.
Response 4:
Question 5:
I find I'm not as focused on listening to the music when I am looking at the computer screen while
performing with Live.
Response 5:
Question 6:
When performing with Live, I must always exaggerate movements so as to make it obvious that I
am controlling the music.
Response 6:
Question 7:
I want the audience to see my hands working the controls rather than for them to remain hidden.
Response 7:
9.1.2. Survey Posts on Social Media
9.2. Appendix B: Relevant Code
9.2.1. Code for Early Prototypes
9.2.1.1. iPad Tangibles
9.2.1.1.1. iPad Tangibles Max Patch
9.1: Max MSP patch receiving OSC messages from the iPad and converting them to MIDI for Ableton.
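To illustrate the transformation this patch performs (this is a hedged sketch, not the author's patch), the conversion from an incoming OSC message to a MIDI Control Change can be expressed in a few lines of Python. The OSC address names, the address-to-controller mapping, and the assumption that values arrive normalised in the 0.0–1.0 range are all hypothetical.

```python
# Illustrative sketch only: maps an OSC-style (address, value) pair to the
# three bytes of a MIDI Control Change message, the kind of translation the
# Max patch performs before messages reach Ableton Live.
# The address table and normalised value range are assumptions.

OSC_TO_CC = {
    "/tangible/1/x": 20,  # hypothetical controller numbers
    "/tangible/1/y": 21,
}

def osc_to_midi_cc(address: str, value: float, channel: int = 0) -> bytes:
    """Convert a normalised OSC value (0.0-1.0) into a MIDI CC message."""
    cc = OSC_TO_CC[address]
    data = max(0, min(127, round(value * 127)))   # clamp to the 7-bit MIDI range
    status = 0xB0 | (channel & 0x0F)              # 0xB0 = Control Change, channel 1-16
    return bytes([status, cc, data])
```

For example, `osc_to_midi_cc("/tangible/1/x", 0.5)` produces the bytes `0xB0 0x14 0x40`: a Control Change on controller 20 with value 64, sent on MIDI channel 1.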
9.2.1.1.2. iPad Tangibles Openframeworks Code
This code is available on request, as it is too large to print here.
9.2.1.3. Sample Selection by Weight
9.2.1.3.1. Sample Selection by Weight Arduino Code