Spatial composition is a key aspect of contemporary acousmatic and computer music. Throughout the history of spatial composition practice, many different approaches to composition and performance tools, instruments, and interfaces have emerged. Furthermore, current developments and the increasing availability of virtual/augmented reality (XR) systems extend the possibilities in terms of sound rendering engines as well as environments and tools for creation and experience. In contrast to systems that control parameters of simulated sound fields and virtual sound sources, we present an approach to XR-based, real-time, body-controlled (motion and biofeedback sensors) sound field manipulation in the spatial domain. The approach can be applied not only to simulated sound fields but also to recorded ones, and it can be reproduced with various spatial rendering procedures.
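One common form of spatial-domain sound field manipulation that applies equally to simulated and recorded material is a rotation of an Ambisonics-encoded field. The following minimal sketch (our illustration, not code from the paper; the channel convention and the sensor-supplied yaw angle are assumptions) rotates a first-order Ambisonics frame about the vertical axis, as a motion sensor might drive it in real time:

```python
import numpy as np

# Hypothetical illustration: yaw-rotating a first-order Ambisonics (FOA)
# sound field, e.g. driven by a body-worn motion sensor's yaw reading.
# Assumed channel order: ACN (W, Y, Z, X).

def rotate_foa_yaw(frame: np.ndarray, yaw_rad: float) -> np.ndarray:
    """Rotate an FOA frame (shape: 4 x samples) by yaw_rad around z.

    W (omnidirectional) and Z (vertical) are invariant under a yaw
    rotation; X and Y mix via a 2-D rotation.
    """
    w, y, z, x = frame
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    y_rot = c * y + s * x
    x_rot = c * x - s * y
    return np.stack([w, y_rot, z, x_rot])

# Usage: a plane wave from the front (azimuth 0) encodes to X = 1, Y = 0.
frame = np.array([[1.0], [0.0], [0.0], [1.0]])  # W, Y, Z, X
rotated = rotate_foa_yaw(frame, np.pi / 2)      # sensor reports 90° yaw
# The source now appears at azimuth 90 degrees: Y ~ 1, X ~ 0.
```

Because the rotation acts on the encoded field itself, the same manipulation works regardless of whether the field was synthesized or captured with a spherical microphone array, and the result can still be passed to any spatial rendering procedure.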
Authors:
Dziwis, Damian Thomas; Lübeck, Tim; Pörschmann, Christoph
Affiliation:
University of Applied Sciences Cologne
AES Convention:
148 (May 2020)
Paper Number:
10358
Publication Date:
May 28, 2020
Subject:
Rec/Pro/Edu