This paper describes an application that provides real-time, high-accuracy room acoustics simulation. Using a multi-touch interface, the user can easily manipulate the dimensions of a virtual space while hearing the room's acoustics change in real time. Such an interface enables more fluid and intuitive control of the application, lending itself to expressive artistic gestures in activities such as sound design, performance, and education. The system relies on high-accuracy room impulse responses from CATT-Acoustic and real-time audio processing through Max/MSP, providing holistic control of a spatial environment rather than applying generic reverberation via individual acoustic parameters (e.g., early reflections, RT60). Such an interface can create a more realistic effect without compromising flexibility of use.
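The core signal-processing operation the abstract describes is convolving input audio with a precomputed room impulse response (here, one generated by CATT-Acoustic and rendered in real time via Max/MSP). As an illustration only, not the authors' Max/MSP implementation, the sketch below shows offline FFT-based convolution reverb in Python, with a synthetic exponentially decaying noise burst standing in for a measured or simulated impulse response:

```python
import numpy as np

def fft_convolve(dry, ir):
    """Convolve a dry mono signal with a room impulse response.

    Uses FFT-based (fast) convolution; the wet output has
    length len(dry) + len(ir) - 1, like full linear convolution.
    """
    n = len(dry) + len(ir) - 1
    nfft = 1 << (n - 1).bit_length()          # next power of two >= n
    wet = np.fft.irfft(np.fft.rfft(dry, nfft) *
                       np.fft.rfft(ir, nfft), nfft)
    return wet[:n]

# Toy example: white noise through a synthetic "room" IR with an
# exponential decay (a crude stand-in for a simulated response).
fs = 48000
rng = np.random.default_rng(0)
dry = rng.standard_normal(fs // 4)            # 0.25 s of noise
t = np.arange(fs // 2) / fs                   # 0.5 s IR
ir = np.exp(-t / 0.1) * rng.standard_normal(fs // 2)
wet = fft_convolve(dry, ir)
```

A real-time system such as the one described would instead use partitioned (block-wise) convolution so that the room can change while audio is streaming; the offline version above only illustrates the underlying operation.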
Authors:
Madden, Andrew; Blumental, Pia; Andreopoulou, Areti; Boren, Braxton; Hu, Shengfeng; Shi, Zhengshan; Roginska, Agnieszka
Affiliation:
New York University, New York, NY, USA
AES Convention:
131 (October 2011)
Paper Number:
8577
Publication Date:
October 19, 2011
Subject:
Posters: Spatial Audio Processing