This work presents a detailed application of an optical tracking system to control the positioning of sound sources in an object-based audio reproduction system for live sound reinforcement. The need arises from live performances with moving actors, such as operas, musicals, or spoken theater. State-of-the-art object-based audio reproduction systems make it possible to distribute virtual sound sources for improved sound localization throughout the audience area. To cope with applications of high complexity, automated auxiliary systems such as motion tracking provide valuable control data and thus enhance the usability of such systems. The presented approach focuses on the interfaces between systems and devices.
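The core idea of feeding tracking data into an object-based renderer can be illustrated with a minimal sketch. The abstract does not specify the actual interface, coordinate frames, or message format used by the authors; the names, the normalized coordinate range, and the OSC-style address pattern below are hypothetical assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    """One tracked performer: an ID plus stage coordinates in metres
    (hypothetical tracker frame: origin at stage front-left)."""
    tag_id: int
    x: float
    y: float

def to_source_position(obj: TrackedObject, stage_width: float,
                       stage_depth: float) -> tuple[float, float]:
    # Map stage coordinates to an assumed normalized renderer
    # range of [-1, 1] on each axis.
    nx = 2.0 * obj.x / stage_width - 1.0
    ny = 2.0 * obj.y / stage_depth - 1.0
    # Clamp so a source never leaves the renderer's valid range,
    # even if the tracker briefly reports a position off-stage.
    nx = max(-1.0, min(1.0, nx))
    ny = max(-1.0, min(1.0, ny))
    return nx, ny

def to_control_message(obj: TrackedObject, stage_width: float,
                       stage_depth: float) -> tuple[str, float, float]:
    # Illustrative OSC-style address pattern; the real control
    # protocol between tracker and renderer is not given here.
    nx, ny = to_source_position(obj, stage_width, stage_depth)
    return (f"/source/{obj.tag_id}/xy", nx, ny)

# Example: an actor at centre-stage width, a quarter of the way upstage,
# on a hypothetical 10 m x 8 m stage.
msg = to_control_message(TrackedObject(tag_id=1, x=5.0, y=2.0), 10.0, 8.0)
```

In this sketch the tracker update loop would simply call `to_control_message` per frame and forward the result to the renderer, decoupling the tracking device from the audio system behind a single interface.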
Authors:
Seideneck, Mario; Bergner, Jakob; Sladeczek, Christoph
Affiliation:
Fraunhofer Institute for Digital Media Technology (IDMT), Ilmenau, Germany
AES Convention:
142 (May 2017)
eBrief: 349
Publication Date:
May 11, 2017
Subject:
Spatial Audio, Listening Tests, Systems