Transcript
Page 1

U.S. Army Research, Development and Engineering Command

Braxton B. Boren, Mark Ericson
Nov. 1, 2011

Motion Simulation in the Environment for Auditory Research

Page 2

Introduction

• ARL’s Environment for Auditory Research (EAR) contains state-of-the-art facilities for auditory simulations

• Realistic auditory environments should contain both static and moving sources

• Moving sources are much more difficult to simulate in a 57-channel audio system

– Multichannel audio editors

– Max/MSP

– Matlab

• Using streaming audio buffers, the EAR’s Sphere Room has been equipped to simulate moving sources by automatically generating source paths and processing each source’s motion in real time.

Page 3

ENVIRONMENT FOR AUDITORY RESEARCH

Sphere Room

The Sphere Room is a 140 m³ (5.3 m × 5.4 m × 4.9 m) auditory virtual reality space designed to facilitate investigations of:

- Integrity of auditory virtual spaces
- Realism of complex auditory simulations
- Effects of changes in Head-Related Transfer Functions on auditory perception
- Effect of helmets and other headgear on spatial orientation

The room contains 57 loudspeakers separated vertically by 30°, constituting a sphere surrounding the listener. This configuration of loudspeakers enables virtual sound source movement and sound projection in an almost 360° sphere. Unlimited stationary or moving sound sources may be presented to any combination of the 57 loudspeakers, permitting generation of realistic and dynamically changing acoustic environments.

Page 4

Streaming Audio in Matlab

• PortAudio API: allows low-level control of multichannel audio devices through the Matlab programming environment

- Low system latency

- High audio fidelity

• Streaming Audio: short buffers are refilled with new audio data every 11.5 milliseconds

- Loudspeaker gains are pre-calculated

- Signal processing can be enacted in real time

• Static sources can be placed in background at specific positions

• Additional moving sources can be added in a single virtual environment
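A minimal sketch of this buffer-refill loop in Matlab is shown below. It assumes the Psychtoolbox PsychPortAudio driver (Kleiner et al., 2007) as the PortAudio interface; nextSourceFrame and panGains are hypothetical helpers standing in for the source signal and the pre-computed loudspeaker gains, and the sample rate, duration, and buffer length are illustrative.

% Sketch of the streaming playback loop (assumes Psychtoolbox's PsychPortAudio
% driver; nextSourceFrame() and panGains() are hypothetical helpers).
fs     = 44100;                          % sample rate (Hz)
nChan  = 57;                             % loudspeakers in the Sphere Room
bufLen = round(0.0115 * fs);             % ~11.5 ms buffer

InitializePsychSound(1);                 % request low-latency mode
pa = PsychPortAudio('Open', [], 1, 2, fs, nChan);

PsychPortAudio('FillBuffer', pa, zeros(nChan, bufLen));  % prime with silence
PsychPortAudio('Start', pa, 0, 0, 1);                    % loop until stopped

t = 0;
while t < 10                             % stream 10 s of one moving source
    mono  = nextSourceFrame(bufLen);     % hypothetical: next bufLen mono samples
    gains = panGains(t);                 % hypothetical: 57x1 pre-computed gains
    frame = gains * mono(:)';            % 57 x bufLen multichannel buffer
    PsychPortAudio('FillBuffer', pa, frame, 1);  % streaming refill (append)
    t = t + bufLen / fs;                 % pacing/underflow handling omitted
end
PsychPortAudio('Stop', pa);
PsychPortAudio('Close', pa);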

Page 5

Source Motion Paths

• Virtual source motion paths are defined parametrically as functions of time

- Circular

- Elliptical

- ‘Dogbone’
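As an illustration, an elliptical path can be sampled once per buffer update directly from its parametric form; the semi-axes, source height, and period below are arbitrary example values rather than EAR settings.

% Sketch: sample an elliptical source path once per buffer update.
fs     = 44100;
bufLen = round(0.0115 * fs);
period = 8;                              % seconds per lap (illustrative)
t      = 0 : bufLen/fs : 30;             % one position per buffer update
a = 4; b = 2; z0 = 1.5;                  % semi-axes (m) and source height (m)
x = a * cos(2*pi*t/period);
y = b * sin(2*pi*t/period);
z = z0 * ones(size(t));
path = [x; y; z];                        % 3 x N list of source positions over time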

Page 6

Panning Algorithms

• Distance-Based Amplitude Panning (DBAP), Lossius et al., 2009

– loudspeaker gains are determined by each speaker’s distance from the virtual audio source

– independent of listener position

– provides smooth motion panning for virtual sources located on the loudspeaker array

– cannot simulate sources outside the array
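A minimal sketch of a DBAP-style gain computation for the 57-speaker array is given below, assuming the common 6 dB-per-doubling rolloff and a small spatial blur term to keep gains finite near a speaker; spkPos, rolloff, and rBlur are illustrative names and values, not the EAR implementation.

% Sketch: distance-based amplitude panning gains (after Lossius et al., 2009).
% spkPos: 57x3 loudspeaker coordinates (m); src: 1x3 virtual source position.
function g = dbapGains(spkPos, src, rolloff, rBlur)
    if nargin < 3, rolloff = 6;   end              % dB per doubling of distance
    if nargin < 4, rBlur   = 0.2; end              % spatial blur (m), keeps gains finite
    a = rolloff / (20 * log10(2));                 % ~1 for a 6 dB rolloff
    d = sqrt(sum(bsxfun(@minus, spkPos, src).^2, 2) + rBlur^2);
    g = 1 ./ d.^a;                                 % amplitude falls off with distance
    g = g / sqrt(sum(g.^2));                       % normalize total power to 1
end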

Page 7

Panning Algorithms

• Vector Base Amplitude Panning (VBAP), Pulkki, 1997

– Defines each loudspeaker as a position vector

– Given a set of three linearly independent speaker vectors, VBAP can simulate a source within the speaker triangle as a linear combination of the three vectors

– The coefficient of each vector is the gain of the corresponding speaker
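A minimal sketch of this per-triangle gain solve (after Pulkki, 1997) follows; L and p are assumed to hold unit vectors, and the equal-power normalization is one common choice rather than the EAR's exact scheme.

% Sketch: VBAP gains for one loudspeaker triangle (after Pulkki, 1997).
% L: 3x3 matrix whose rows are the three loudspeaker unit vectors;
% p: 1x3 unit vector pointing from the listener toward the virtual source.
function g = vbapGains(L, p)
    g = p(:)' / L;         % solve p = g(1)*L(1,:) + g(2)*L(2,:) + g(3)*L(3,:)
    if any(g < 0)
        g = [];            % a negative gain: source lies outside this triangle,
        return;            % so a different loudspeaker triple should be used
    end
    g = g / norm(g);       % equal-power normalization
end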

Page 8

Panning Algorithms

• Vector Base Amplitude Panning (VBAP), Pulkki, 1997

– VBAP is more robust than DBAP given a fixed listener position

– Allows efficient simulation of virtual sources outside the loudspeaker array

– Requires an algorithm for detecting vector/triangle intersections

Page 9

Assigning triangles to sources

1) Parametrically define the triangle’s plane: V(s, t) = V0 + s·u + t·v, where u = V1 − V0 and v = V2 − V0

2) If a given ray intersects that plane, find the parametric coordinates s and t of the intersection point

3) If s ≥ 0, t ≥ 0, and s + t ≤ 1, the intersection point lies inside the triangle

Ray-Triangle Intersection, Sunday, 2003
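A sketch of this test in Matlab, following Sunday's parametric formulation, is shown below; the five dot products noted on the next page (uu, uv, vv, wu, wv) appear explicitly, and for a static triangle set the normal n could be pre-computed.

% Sketch: does the ray orig + r*dir (r >= 0) hit the triangle (V0, V1, V2)?
% Follows the parametric plane method of Sunday (2003).
function [hit, s, t] = rayTriangle(orig, dir, V0, V1, V2)
    u = V1 - V0;  v = V2 - V0;
    n = cross(u, v);                        % plane normal (pre-computable)
    hit = false;  s = NaN;  t = NaN;
    denom = dot(n, dir);
    if abs(denom) < eps, return; end        % ray is parallel to the plane
    r = dot(n, V0 - orig) / denom;
    if r < 0, return; end                   % plane lies behind the ray origin
    w  = (orig + r*dir) - V0;               % intersection point relative to V0
    uu = dot(u,u); uv = dot(u,v); vv = dot(v,v);
    wu = dot(w,u); wv = dot(w,v);           % the 5 dot products used for s and t
    D  = uv^2 - uu*vv;
    s  = (uv*wv - vv*wu) / D;
    t  = (uv*wu - uu*wv) / D;
    hit = (s >= 0) && (t >= 0) && (s + t <= 1);
end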

Page 10

Assigning triangles to sources

Ray-Triangle Intersection,

Sunday, 2003

• Requires 5 distinct dot product operations

• Not as efficient as other algorithms for dynamic environments

• But it’s more efficient for static sets of triangles because the planes’ normal vectors can be pre-computed

Page 11

Signal Processing

• High-quality vehicular recordings are available with included x-y-z coordinates

– These already contain attenuation, air absorption, and Doppler shift

– Position data can be read and interpolated to determine pan positions

• To allow arbitrary movement of any signal, signal processing is added

– Attenuation and air absorption coefficients are pre-computed

– Signal gain and one-pole filter are updated in real time before loading the streaming audio buffer

– Doppler shift is more computationally expensive

• Matlab isn’t fast enough to do it in real time

• May later be implemented in C++

• Need better constant recordings of vehicular motion

– Current recordings of vehicles idling are unconvincing

– Less important for slower sources, whose sound changes little with velocity
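A minimal sketch of this per-buffer distance processing is given below; the 1/r gain and the distToCoef() mapping are hypothetical stand-ins for the pre-computed attenuation and air-absorption tables, not the EAR's calibrated values.

% Sketch: apply distance attenuation and a one-pole air-absorption filter to
% one streaming buffer. distToCoef() is a hypothetical lookup into a
% pre-computed absorption table; 'state' carries the filter memory between
% buffers (start with state = []) so there are no clicks at buffer boundaries.
function [y, state] = distanceProcess(x, dist, state)
    gain = 1 / max(dist, 1);                     % simple 1/r attenuation
    c    = distToCoef(dist);                     % hypothetical: 0..1, grows with distance
    b    = 1 - c;  a = [1, -c];                  % one-pole low-pass: y[n] = (1-c)x[n] + c*y[n-1]
    [y, state] = filter(b, a, gain * x, state);  % stateful, real-time safe
end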

Page 12

Discussion

• With a full signal processing load, this system can process up to four independent moving sources at once

• Pre-calculation can take longer if different sources’ velocities and path lengths yield path periods with a very large least common multiple

• Static sources can be added in specific channels to add background ambience

Page 13

Conclusions

• Real-time streaming audio allows simulated motion along any parametric path

• Two different panning algorithms have been implemented

– DBAP is simpler and better for listener-independent reproduction

– VBAP is more robust and better for a fixed listener position

• Attenuation and air absorption filtering can be applied in real time to give more realistic distance cues

• This system will be used in a series of auditory simulations and experiments ongoing at EAR

Page 14

References

• Henry, P., Amrein, B., & Ericson, M., “The Environment for Auditory Research”, Acoustics Today, 5(3), 2009.

• Kleiner, M., Brainard, D., & Pelli, D., “What's new in Psychtoolbox-3?”, Perception 36 ECVP Abstract Supplement, 2007.

• Lossius, T., Baltazar, P., & de la Hogue, T., “DBAP - Distance-Based Amplitude Panning”, Proceedings of the 2009 International Computer Music Conference, 2009.

• Pulkki, V., “Virtual Sound Source Positioning Using Vector Base Amplitude Panning”, J. Audio Eng. Soc., 45, 1997.

• Sunday, D., “Intersections of Rays, Segments, Planes and Triangles in 3D”, 2003.