Motion Origami

Daniel Bartoš
FAMU, Prague, Czech Republic
[email protected]

Abstract. The Motion Origami project explores live performance strategies focused on gesture-based control of sound. The sound processing involves granular sound synthesis in a Max/MSP patch. Motion Origami is designed for a live performance scenario as an interactive music instrument, and it makes use of live audio input or pre-recorded sound material. This live interface prototype explores gesture-based music composition and performance techniques. The sound transformations are driven by hand gestures, while the use of a motion tracking device lets the user build up specific experience and virtuosity.

Keywords: Leap Motion sensor, gesture recognition, motion tracking, music expression, granular synthesis, Max/MSP

Introduction
The name of the Motion Origami project [1] is inspired by the Japanese art tradition of origami folding. The actual process of paper folding is reflected in a specific compositional strategy that uses captured hand gestures. In other words, the performing artist is able to 'fold' sounds into new sound objects with his or her own hand gestures. Simple 'folds' therefore result in complex soundscapes during the performance.

[1] Motion Origami; the project presentation can be accessed online at http://www.danielbartos.com/motion-origami

Figure 1. Motion Origami – Max/MSP patch with the Leap Motion

This paper describes an original Max/MSP patch which uses the Smooth Overlapping Granular Synthesis object sogs~ [2] as its main sound-transforming element. The novelty of this approach lies not in the production of new code or a particular design, but in making creative use of existing, advanced audio processing tools in sound and music performance. The project shows how a live interface prototype can be turned into an original compositional tool. As such, the Motion Origami project represents a recipe for approaching the design of an experimental music instrument, and it shows an approach similar to rapid prototyping applied to the domain of advanced Max/MSP audio processing.

Leap Motion
The Leap Motion [3] sensor was designed for touch-less tracking of hand gestures and movements. In conjunction with the Leap Motion SDK, the sensor becomes a sophisticated controller and delivers complex information monitoring hand movements in real time. Hand gestures are captured with high accuracy, along with individual finger positions, rotations and fingertip accelerations.
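The Max/MSP externals discussed below deliver this information as streams of parameter values. Purely as an illustration of the kind of per-frame data involved, the following Python sketch models a hypothetical frame structure; the field names are chosen for readability and are not taken from the Leap Motion SDK or from the Motion Origami patch.

```python
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]  # x, y, z in millimetres above the sensor

@dataclass
class Finger:
    tip_position: Vec3   # fingertip position in the interaction volume
    tip_velocity: Vec3   # per-frame velocity, usable to estimate acceleration

@dataclass
class Hand:
    palm_position: Vec3  # centre of the palm
    yaw: float           # orientation angles in radians
    pitch: float
    roll: float
    grab_strength: float # 0.0 = open hand, 1.0 = fist (closed hand)
    fingers: List[Finger]

@dataclass
class Frame:
    timestamp: float     # seconds; frames arrive continuously at sensor rate
    hands: List[Hand]
```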

The Leap Motion sensor was introduced to the market in mid-2013 and swiftly found its place among other devices designed for body movement and gesture tracking, such as the Kinect [4] and the Wii controller [5]. Such devices usually work with general body tracking and can be repurposed; the Leap Motion, by contrast, is designed to capture hand gestures and movements only. In fact, the Leap Motion sensor can be thought of as a special interactive space [6] of a predefined volume of air. Common music applications of the Leap Motion sensor are primarily based on imitations of the physical interfaces of existing musical instruments. This is true especially for projects such as Air-Keyes, Crystal Piano and AirHarp (Han 2014). The main reason is that a virtual keyboard represents an ideal setup for testing system latency (Silva 2013), as a low-latency response is one of the most important requirements for any real-time music performance application. The latency of the Leap Motion, as advertised by the manufacturer, is anywhere from 5 ms to 20 ms, but this figure obviously depends on the whole system configuration and the components used. Another category of existing Leap Motion applications is represented by various hybrid instruments; a specific selection of such projects is described in the paper Lessons Learned in Exploring Leap Motion™ (Han, Gold 2014).

The Motion Origami Body & Sound Interaction
The theme of physical interaction and sound processing is thoroughly investigated in the paper Sound Design as Human Matter Interaction (Wei 2013), where the most important keyword is the term material computation [7]. In an extended sense, we can think of the Leap Motion sensor as constituting an interactive space of its own, in which the calculations and the interaction take place. Gesture recognition in conjunction with music performance is also being explored by IRCAM's ISMM research team [8], while the concept of physical body-and-sound interaction is present, for example, in the projects of Marco Donnarumma, who uses a set of repurposed bio-sensors [9].

[2] Max/MSP object sogs~ (Smooth Overlap Granular Synthesis) by Norbert Schnell, IRCAM – Centre Pompidou; more information can be accessed online at http://forumnet.ircam.fr/product/max-sound-box
[3] Leap Motion; the complete sensor and SDK specification can be accessed online at http://leapmotion.com
[4] Microsoft Kinect is a sensor using a depth map to create a 3D representation of space, developed by Microsoft for the Microsoft© Xbox 360 game console. The Kinect specification can be accessed online at www.microsoft.com/en-us/kinectforwindows
[5] The Wii Remote is part of the Nintendo© Wii console developed by Nintendo© Company, Ltd. The Wii specification can be accessed online at www.nintendo.com/wii
[6] An interactive space of 8 cubic feet (equivalent to 0.22 m³), as stated by the manufacturer; more information can be accessed online at http://leapmotion.com
[7] "Realtime, continuously responsive environments for the design of technical systems for human-computer interaction design…". Ibid., p. 2010.

In the Motion Origami Max/MSP patch, the performer's physical gesture interaction is the primary source of the resulting sound transformations. The performer creates new sound objects with captured hand gestures. The hand gestures are recognised in the interactive space above the sensor and turned into a control mechanism coded in the Max/MSP patch. A single recognised hand gesture initialises the audio buffer with the incoming audio; the buffer then acts as a starting point for building the granular synthesis soundscape. The hand gestures control the parameters of the granular synthesis engine, along with the timing of the buffer initialisation with new audio material during the live performance. The hand gestures also control the wet/dry ratio of the audio signal input as well as the multichannel audio distribution via the Ambisonics engine [10].
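The gesture recognition itself is coded in the Max/MSP patch; purely as a language-neutral sketch of the control logic, the following Python fragment shows one way the one-shot buffer initialisation could be gated by a closed-hand gesture, reusing the hypothetical Frame and Hand structures from the sketch above (the 0.9 threshold is an assumed value, not one taken from the patch).

```python
GRAB_THRESHOLD = 0.9  # assumed: grab_strength above this counts as a closed hand

class BufferTrigger:
    """Fires the buffer-record action once per closed-hand gesture."""

    def __init__(self, record_callback):
        self.record_callback = record_callback  # e.g. starts recording live input into the buffer
        self.was_grabbing = False               # remembers the previous frame's state

    def update(self, frame):
        if not frame.hands:
            self.was_grabbing = False
            return
        grabbing = frame.hands[0].grab_strength >= GRAB_THRESHOLD
        # Rising edge: the hand has just closed -> initialise the buffer once.
        if grabbing and not self.was_grabbing:
            self.record_callback()
        self.was_grabbing = grabbing
```

Tracking the previous frame's state makes the action fire only on the moment of closing, so holding the fist does not keep re-initialising the buffer.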

Motion Origami Patch Implementation
The Motion Origami patch is programmed in Max/MSP [11]. Data from the Leap Motion sensor were originally captured by the Swirly Leap object [12]; the updated version of the patch uses the current and well-documented IRCAM leapmotion object [13].

[8] IRCAM – ISMM team {SOUND MUSIC MOVEMENT} INTERACTION; more information can be accessed online at http://ismm.ircam.fr
[9] Marco Donnarumma; project presentations can be accessed online at www.marcodonnarumma.com
[10] Ambisonics Tools from the Institute for Computer Music and Sound Technology (ICST), Zurich University of the Arts, can be accessed online at https://www.zhdk.ch/index.php?id=icst_ambisonicsexternals
[11] Max/MSP visual programming environment by Cycling '74; more information can be accessed online at http://www.cycling74.com

Figure 2. Motion Origami – the individual patch sections explained

The Smooth Overlapping Granular Synthesis object sogs~ [14] was chosen because it offers simple and creative control over the audio captured in the audio buffer and can be used for targeted navigation and exploration of that buffer. The sogs~ object also mimics the paper-folding technique in the sense that the original paper surface is substituted with a 2D plane made of two individual parameters: the grain position and the grain size. The data from the Leap Motion sensor are mapped to drive the sogs~ object through these two parameters: the performer then navigates the space of the audio buffer defined by the grain size and the grain position respectively.
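As a minimal sketch of this two-parameter mapping, the palm position can be normalised and scaled before being handed to the granular engine; the interaction-volume bounds and parameter ranges below are illustrative assumptions rather than values from the Motion Origami patch.

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

def grain_parameters(palm_position, buffer_ms=2000.0):
    """Map palm x/y (mm above the sensor) to the sogs~ grain plane."""
    x, y, _ = palm_position
    # Horizontal swipe selects where in the captured buffer the grains are read.
    grain_position_ms = scale(x, -200.0, 200.0, 0.0, buffer_ms)
    # Vertical movement sets how long each grain lasts.
    grain_size_ms = scale(y, 100.0, 500.0, 10.0, 500.0)
    return grain_position_ms, grain_size_ms
```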

The wet/dry mix ratio, which is also mapped to hand gestures, offers detailed control over the sound merging and accentuates the actual 'sound folds'. These 'sound folds' can build up a certain level of complexity thanks to the fact that the live audio source is coupled with the audio material stored in the buffer. Although the audio is modified in the granulation process, it shares the same spectral and tonal characteristics as the original sound source. This in turn creates elaborate sound transformations, which can be thought of as the introduced 'sound folding' process.

The patch recognises a specific gesture which is required to start the initialisation of the audio buffer; in this way the buffer is filled with new audio material. The buffer initialisation starts with a closed-hand gesture, which can be paraphrased as a 'grab the sound' gesture. At this very moment, the buffer is filled with the live sound input and becomes available to the sogs~ object. Subsequent hand gestures control various aspects of the granular synthesis engine: a horizontal hand swipe controls the grain position selection, while vertical hand movement controls the time length of a grain. Moreover, the overall palm position above the sensor in the x-y plane defines the sound source position in the multichannel Ambisonics space and adds a multichannel spatialisation layer to the performance. The other variables recognised by the Leap Motion, such as yaw, pitch and roll, are alternatively mapped to extended FX processing (reverb, distortion, etc.), depending on the performance scenario.
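Taken together, one control cycle could be sketched as follows. This fragment reuses the hypothetical helpers introduced above; the parameter names, the Ambisonics azimuth formula and the FX mappings are illustrative assumptions, since the actual routing lives inside the Max/MSP patch.

```python
import math

def process_frame(frame, trigger, send):
    """One control cycle: detect the grab gesture, then map hand pose to parameters.

    `send(name, value)` stands in for whatever transport feeds the patch
    (e.g. OSC messages); the parameter names are hypothetical.
    """
    trigger.update(frame)                      # 'grab the sound' buffer initialisation
    if not frame.hands:
        return
    hand = frame.hands[0]
    x, _, z = hand.palm_position

    position_ms, size_ms = grain_parameters(hand.palm_position)
    send("sogs/grain_position", position_ms)   # horizontal swipe
    send("sogs/grain_size", size_ms)           # vertical movement

    # Horizontal-plane palm position -> source azimuth in the Ambisonics
    # space (assumed mapping).
    send("ambi/azimuth_deg", math.degrees(math.atan2(x, -z)))

    # Orientation -> extended FX sends, depending on the performance scenario.
    send("fx/reverb_mix", scale(hand.roll, -math.pi, math.pi, 0.0, 1.0))
    send("fx/distortion_drive", scale(hand.pitch, -1.5, 1.5, 0.0, 1.0))
```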

[12] Swirly Leap gives Max/MSP access to the Leap Motion API; written by Tom Ritchford, New York. The project can be accessed online at https://github.com/rec/swirly-leap
[13] The well-documented Max/MSP object leapmotion by IRCAM; more information can be accessed online at http://ismm.ircam.fr/leapmotion/
[14] Max/MSP object sogs~ (Smooth Overlap Granular Synthesis) by Norbert Schnell, IRCAM – Centre Pompidou; more information can be accessed at http://forumnet.ircam.fr/product/max-sound-box

Figure 3. Motion Origami – the recognised gestures illustrated

Conclusions & Performance
The most inviting application of the Motion Origami patch [15] is a vocal performance, or simply a music instrument which leaves enough space for interaction with the sensor itself. The beauty of this live performance approach lies in the fact that the performer can interact with his or her captured music material and add multiple layers of expression solely by using hand gestures. A new layer of improvisation can be introduced as new themes and phrases emerge; the performer then interacts with new music material based on the sound qualities of the original music instrument or vocal performance. The performer can control the following parameters in the Motion Origami live interface: the time-based selection of a phrase sampled into the buffer; the grain size and its position in the buffer; the wet/dry mix ratio; and the Ambisonics sound source position (if applicable).

Using gestures in music composition and performance proves to be very intuitive. The sensor itself has to be 'learned' before it can be operated properly, and this fact delivers a specific virtuosity over time. The Leap Motion sensor with the Motion Origami patch opens up an exciting new field of music composition and sound processing coupled with immediate gestural interaction.

The biggest challenge in gesture-based performance is the recognition of quantized gestures (Potter 2013). While the parametric control of the various patch elements does not present any technical problem tracking-wise, the recognition of quantized, unique gestures proved difficult throughout the development phase of the patch. For example, while playing on a virtual keyboard, one can limit the keystrokes to a specific scale and thus limit mis-triggered notes. But when it comes to invoking a specific functionality (sampling initialisation, sound bank toggle, etc.), the gestures have to be recognised with exceptionally high precision, as those decisions form an integral part of the performance itself. This aspect of live performance imposes specific constraints that have to be considered in live performance scenarios when using the Leap Motion sensor. The quality of the tracking also depends on the present light conditions and the overall sensor setup; for example, direct light reaching the sensor's surface can introduce inconsistency in the tracking.
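A common way to make such one-shot gestures more robust is to require them to persist for several consecutive frames before they fire. The sketch below illustrates this under the same hypothetical structures as the earlier fragments; the frame count is an arbitrary illustrative choice, not a value from the patch.

```python
class ConfirmedGesture:
    """Fires only after a predicate holds for `confirm_frames` consecutive frames."""

    def __init__(self, predicate, action, confirm_frames=10):
        self.predicate = predicate          # e.g. lambda hand: hand.grab_strength >= 0.9
        self.action = action                # the one-shot function to invoke
        self.confirm_frames = confirm_frames
        self.count = 0
        self.fired = False

    def update(self, frame):
        if frame.hands and self.predicate(frame.hands[0]):
            self.count += 1
            if self.count >= self.confirm_frames and not self.fired:
                self.action()               # fire once per sustained gesture
                self.fired = True
        else:
            self.count = 0
            self.fired = False              # re-arm once the gesture is released
```

The cost of such a confirmation window is a small added latency, which trades directly against the mis-trigger rate discussed above.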

[15] Motion Origami; the project presentation can be accessed online at http://www.danielbartos.com/motion-origami

Figure 4. Motion Origami – a detail of the subpatch controlling the audio buffer initialisation

Overall, the Leap Motion is very suitable for various intuitive music composition and performance scenarios. Occasional errors in tracking can be overcome with good patch design and a restricted use of quantized gestures, leaving the quantized gestures to be serviced by traditional hardware controllers. Having said that, the Leap Motion sensor excels in intuitive gesture interaction performance and gesture-based music composition strategies.

Acknowledgements. The Motion Origami project was funded by the research grant scheme "Sensors as music instruments" at HAMU (Music Academy in Prague), run by Michal Rataj. Special thanks go to Michal Rataj [16] and Matthew Ostrowski [17] for their support, their help with the patch design and their ideas about performance and music composition.

References
Han, Jihyun; Gold, Nicolas. 2014. "Lessons Learned in Exploring the Leap Motion™ Sensor for Gesture-based Instrument Design," in Caramiaux, B.; Tahiroğlu, K.; Fiebrink, R.; Tanaka, A. (eds.), Proceedings of the International Conference on New Interfaces for Musical Expression (NIME'14), London 2014, Goldsmiths University of London: 371-374.

Potter, Leigh Ellen; Araullo, Jake; Carter, Lewis. 2013. "The Leap Motion controller: A view on sign language," Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration, Adelaide, ACM, New York: 175-178.

Silva, Eduardo S.; de Abreu, Jader Anderson O.; de Almeida, Janiel Henrique P.; Teichrieb, Veronica; Ramalho, Geber L. 2013. "A Preliminary Evaluation of the Leap Motion Sensor as Controller of New Digital Musical Instruments," Proceedings of the Brazilian Symposium on Computer Music, Brasília, SBCM: 59-70.

Schwarz, Diemo. 2006. "Concatenative sound synthesis: The early years," Journal of New Music Research 35(1): 3-22 (March).

Wei, Sha Xin; Freed, Adrian; Navab, Navid. 2013. "Sound Design As Human Matter Interaction," Proceedings of CHI'13 Extended Abstracts on Human Factors in Computing Systems, Paris, ACM, New York: 2009-2018.

[16] Michal Rataj – music composer and researcher at HAMU (Music Academy in Prague). More information can be accessed online at http://michalrataj.com
[17] Matthew Ostrowski – expert on sensors and multimedia programming based at Harvestworks NYC, who offered valuable insights into Leap Motion programming. More information can be accessed online at http://www.ostrowski.info and http://www.harvestworks.org