Page 1

University of California, Santa Barbara

An Integrated System of 3D Motion Tracker and Spatialized Sound Synthesizer

John Thompson (Music)

Mary Li (ECE)

Michael Quinn (ECE)

Page 2

Goal

• Develop an interactive music synthesis system while exploring tracking and surveillance technologies, spatial music composition strategies, and sound synthesis techniques

Page 3

Hardware & Software

• Unibrain Fire-I Cameras

• PC Running Windows XP

• Apple Powerbook and G5

• Intel’s OpenCV Libraries

• Max/MSP/Jitter

• SuperCollider

Page 4

Project Summary

• 2D Tracking
• Camera Calibration
• 3D Position Calculation
• Composition and Sound Synthesis

Page 5

2D Tracking – Temporal Difference

• Temporal Difference

– Subtract the previous frame from the current frame to see what has changed (a minimal sketch follows).
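The slides do not include the implementation (the project's tracker ran as an OpenCV-based DirectShow filter); purely as a hedged illustration, a minimal frame-differencing loop in Python with the modern OpenCV bindings might look like this. The camera index and threshold value are assumptions.

```python
import cv2

cap = cv2.VideoCapture(0)                      # assumed camera index
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Temporal difference: pixels that changed since the previous frame
    diff = cv2.absdiff(gray, prev_gray)
    _, motion = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)   # assumed threshold
    cv2.imshow("temporal difference", motion)
    prev_gray = gray
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```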

Page 6

2D Tracking – Background Subtraction

• Develop a background model

• Subtract background from current frame

• Objects not in the model show up in the difference (see the sketch below)
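The slides name the approach but not the model itself; as an assumed sketch, a per-pixel running-average background with an update rate alpha (a placeholder value) could look like:

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                                  # assumed camera index
ok, frame = cap.read()
bg_mean = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
alpha = 0.01                                               # assumed learning rate

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    # Subtract the background model; objects not in the model
    # appear as large absolute differences.
    diff = cv2.absdiff(gray, bg_mean)
    # Slowly fold the current frame into the model so gradual
    # lighting changes are absorbed into the background.
    bg_mean = (1.0 - alpha) * bg_mean + alpha * gray
    cv2.imshow("background difference", diff.astype(np.uint8))
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
```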

Page 7

2D Tracking – Thresholding

• Threshold values are chosen based on the variance of the background model (one possible rule is sketched below)
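The slide does not give the exact rule; one common variance-based choice, shown here only as an assumed example, flags a pixel as foreground when its difference from the background mean exceeds k standard deviations:

```python
import numpy as np

def foreground_mask(frame, bg_mean, bg_var, k=2.5):
    """Per-pixel threshold derived from the background model's variance.

    frame, bg_mean and bg_var are float arrays of the same shape;
    k (2.5 here) is an assumed sensitivity constant.
    """
    diff = np.abs(frame - bg_mean)
    return diff > k * np.sqrt(bg_var)
```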

Page 8

2D Tracking – Center of Mass

• For now, we assume that only one object is being tracked. Thus, the image center of mass approximates the object center of mass.

• The center of mass is then sent to the 3D section (see the sketch below).
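As a hedged sketch (the original code is not shown on the slides), the centroid of the thresholded foreground mask can be computed from image moments:

```python
import cv2

def center_of_mass(mask):
    """Return the (x, y) centroid of a binary foreground mask, or None if it is empty."""
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```

With a single tracked object, this image centroid stands in for the object's center of mass and is the 2D measurement passed on to the 3D stage.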

Page 9

Camera Calibration

• Purpose
– A preparation for 3D estimation from 2D images

• Methods (an example sketch follows)
– Matlab camera calibration toolbox
– Intel OpenCV calibration functions
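The slides name the tools rather than the code. As an assumed illustration of checkerboard calibration with the current OpenCV Python bindings (the project itself used Intel's C OpenCV routines and the Matlab toolbox), a sketch might be the following; the pattern size and image folder are placeholders:

```python
import glob
import cv2
import numpy as np

pattern = (7, 6)                                   # assumed inner-corner grid of the checkerboard
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob("calib/*.png"):              # assumed folder of calibration images
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Returns the intrinsic matrix (fx, fy, px, py), the distortion coefficients,
# and per-view extrinsics (rotation and translation vectors).
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
```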

Page 10

Camera Calibration -- intrinsic parameters

• Focal lengths: fx, fy

• Principal points: px, py

• Distortions: radial and tangential distortion coefficients

DirectShow Filter runs under MS Windows

Page 11

Camera Calibration -- intrinsic parameters

• Defines pixel coordinate points with respect to the camera coordinate system

X_image = M_intr · X_camera

Matlab Camera Calibration Toolbox
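As a hedged numerical illustration of X_image = M_intr · X_camera (lens distortion ignored; the parameter values are made up, not the project's calibration results):

```python
import numpy as np

fx, fy = 640.0, 640.0                    # assumed focal lengths in pixels
px, py = 320.0, 240.0                    # assumed principal point

M_intr = np.array([[fx, 0.0, px],
                   [0.0, fy, py],
                   [0.0, 0.0, 1.0]])

X_camera = np.array([0.1, -0.05, 2.0])   # a point in camera coordinates
x = M_intr @ X_camera
u, v = x[0] / x[2], x[1] / x[2]          # pixel coordinates after the perspective divide
print(u, v)
```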

Page 12

Camera Calibration -- extrinsic parameters

• Defines camera coordinate points with respect to the world coordinate system

X_camera = M_extr · X_world

OpenCV calibration routine (based on intrinsic parameters)

[Figure: left, center, and right camera views]
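Continuing the sketch above, the extrinsic matrix chains with the intrinsics to map world coordinates to pixels; the rotation R and translation t below are placeholder values, not measured extrinsics:

```python
import numpy as np

R = np.eye(3)                                  # assumed world-to-camera rotation
t = np.array([0.0, 0.0, 2.5])                  # assumed translation
M_extr = np.hstack([R, t.reshape(3, 1)])       # 3x4 extrinsic matrix

X_world = np.array([0.5, 0.2, 0.0, 1.0])       # homogeneous world point
X_camera = M_extr @ X_world                    # X_camera = M_extr · X_world

fx, fy, px, py = 640.0, 640.0, 320.0, 240.0    # same assumed intrinsics as above
M_intr = np.array([[fx, 0.0, px], [0.0, fy, py], [0.0, 0.0, 1.0]])
u, v, w = M_intr @ X_camera                    # X_image = M_intr · X_camera
print(u / w, v / w)                            # pixel coordinates
```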

Page 13

3D Tracking -- methods

• Obtain 2D motion centroid information

• Epipolar Geometry

• Least Squares (triangulation sketched below)
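The slides only name the methods; as an assumed sketch of the least-squares step, the 2D centroids from the calibrated cameras can be triangulated with the standard direct linear transform, where each camera i contributes its 3x4 projection matrix P_i = M_intr,i · M_extr,i:

```python
import numpy as np

def triangulate(projections, pixels):
    """Least-squares 3D point from two or more calibrated views.

    projections: list of 3x4 camera projection matrices P_i
    pixels:      list of (u, v) motion centroids, one per camera
    """
    rows = []
    for P, (u, v) in zip(projections, pixels):
        # Each view adds two linear constraints on the homogeneous point X.
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    # The solution is the right singular vector for the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]                  # back to inhomogeneous world coordinates
```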

Page 14

3D Tracking -- results

[Figures: floor plan of the visible space and an 18-point tracking example, plotted as X_world, Y_world, Z_world]

Page 15

Tracking System Performance

• Real-time average: 2.09 frames per second

• System performance can be improved by

1. Distributed computing: one PC for each camera

2. More cameras

3. Improved background segmentation

Page 16

Trans-media Systems

Trans-media systems exist as independent engines behind artistic manifestations in diverse media.

Input: In the case of our Motion Tracking System project, the implementation of motion tracking algorithms that track objects within a sensor space serves as the principal component powering the trans-media system.

Transformation: In the middle layer, the data from the motion tracking system is interpreted and labeled. This data is then used to determine the activity and state of the sensor space.

Output: In the final stage of the trans-media system, specific media, such as sound, use the middle-layer data to inform their processes. The sound is projected into the sensor space. Interactivity is enhanced when the participants in the sensor space become aware of their relationship with the system.

Graphic Notations and Trans-media Systems: John Cage “Fontana Mix”

Page 17

Spatial Composition Strategies -- Sonic Nodes

• A system of nodes is laid out in the virtual space. The system comprises Generative Nodes and Transformative Nodes.

• Each node has an activation space surrounding it. Tracked objects activate nodes at various levels depending on their measured distance from the node's center (see the sketch after this list).

• The nodes are in flux and adjust their positions over time to reflect the history of the space
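The activation rule is not specified on the slides; purely as a hypothetical illustration, a node whose activation level falls off linearly with distance inside its activation space could be modeled like this (the linear falloff and the radius field are assumptions):

```python
from dataclasses import dataclass
import math

@dataclass
class Node:
    x: float
    y: float
    z: float
    radius: float                # assumed extent of the activation space

    def activation(self, px, py, pz):
        """Activation level in [0, 1]: 1 at the node's center, falling
        linearly to 0 at the edge of the activation space."""
        d = math.dist((self.x, self.y, self.z), (px, py, pz))
        return max(0.0, 1.0 - d / self.radius)
```

A Generative Node might trigger its material when this level crosses an assumed threshold, while a Transformative Node could scale an effect parameter by it.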

Page 18

When a tracked object moves within the activation space of a particular node, the node executes its action.

Figure 1: Nodes with various musical functions are represented by colored circles. Different paths create unique realizations of phrase-level material in the mobile form.

Chord Nodes: 34
Impulse Nodes: 16
Sample Playback Nodes: 64
Convolution Nodes: 360
Pitch-Time Shift Nodes: 50

Page 19

Pitch sets in the chord nodes

[0 1 3 4] [0 1 5 7] [0 2 3 7] [0 2 4 6] [0 2 4 7] [0 2 4 8] [0 2 5 6] [0 2 5 7] [0 2 5 8] [0 2 6 7] [0 2 6 8] [0 3 5 6] [0 3 5 7] [0 4 5 8]

Thirty-four chordNodes are scattered in the virtual space. Seventeen of the chordNodes contain a unique four-note pitch set.

Fourteen of the seventeen sets are unique in their normal order. Although the pitch sets are diverse, they are closely knit in their makeup. This lends a unified quality to the pitched verticalities of the sonic space.

As multiple users move throughout the space, the sonic material subtly shifts, melding the space into a cohesive flow.

Tracked objects leave histories of their paths in the space, and the pitchSets transform in response. The space adapts its pitched content to the actions of its users.
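Normal order here means the most compact rotation of a pitch-class set; as a hedged sketch of one common formulation (not necessarily the exact convention used in the piece):

```python
def normal_order(pcs):
    """One common notion of normal order: the rotation of the sorted
    pitch-class set with the smallest outer interval, ties broken by
    packing the remaining intervals toward the bottom."""
    pcs = sorted(set(p % 12 for p in pcs))
    n = len(pcs)
    rotations = [pcs[i:] + pcs[:i] for i in range(n)]

    def spans(rot):
        # Intervals from the first pitch class, compared outside-in.
        return [(rot[j] - rot[0]) % 12 for j in range(n - 1, 0, -1)]

    return min(rotations, key=spans)

# Two of the sets from the slide:
print(normal_order([0, 1, 3, 4]))    # [0, 1, 3, 4]
print(normal_order([0, 2, 5, 7]))    # [0, 2, 5, 7]
```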

Page 20

Sound Spatialization

The system outputs quadraphonic audio distributed to speakers surrounding the sensor space. The position of the sounds within the sensor space is determined by the position of the tracked object (Figure 1).

Distance is simulated through the mixture of direct and reverberant sound. The ratio is dictated by the following formulas:

Direct Sound Amplitude = 1 / zPosition.abs
Reverb Sound Amplitude = 1 / zPosition.abs.sqrt
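The two expressions are written in SuperCollider style (.abs and .sqrt are method calls on the z position). As an assumed plain-Python restatement of the same distance cue, together with a placeholder equal-power spread to the four speakers (the actual panning law is not given on the slide):

```python
import math

def distance_mix(z_position):
    """Direct and reverberant amplitudes from the tracked object's z position,
    following the slide's formulas: direct = 1/|z|, reverb = 1/sqrt(|z|)."""
    z = max(abs(z_position), 1e-6)           # guard against division by zero
    return 1.0 / z, 1.0 / math.sqrt(z)

def quad_gains(x, y):
    """Placeholder equal-power pan of a normalized (x, y) position in [-1, 1]
    onto [front-left, front-right, rear-left, rear-right] speakers."""
    lx, rx = (1.0 - x) / 2.0, (1.0 + x) / 2.0
    fy, ry = (1.0 + y) / 2.0, (1.0 - y) / 2.0
    return [math.sqrt(lx * fy), math.sqrt(rx * fy),
            math.sqrt(lx * ry), math.sqrt(rx * ry)]
```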

Figure 1: The sensor space, surrounded by the four speakers, with the tracked object's position inside it.

Page 21

Future Work

• Improve the system by enabling the tracking of multiple objects as well as incorporating features such as shape, size, and color.

• Improve integration with the musical synthesis system.

Page 22

Special Thanks

• Professor B.S. Manjunath

• Professor G. Legrady

• Professor J. Kuchera-Morin

• NSF IGERT Program

• Fellow IGERTers


Page 23

Questions?