Our paper presents ideas raised by recent projects exploring the embellishment, augmentation, and extension of environmental cues, spatial mapping, and the immersive potential of scalable multichannel audio systems for virtual and augmented reality. Moving beyond the question of reproductive veracity posed by merely recreating the soundscape of the physical world, these works exploit characteristics of the natural world to achieve creative goals, including the development of models for interactive composition, composing with physical and abstract spatial gestures, and linking sound and image. We present a novel system that allows the user to treat the soundfield as a fundamental building block for spatial music composition and sound design.
Authors: Graham, Richard; Cluett, Seth
Affiliation: Stevens Institute of Technology, Hoboken, NJ, USA
AES Conference: 2016 AES International Conference on Audio for Virtual and Augmented Reality (September 2016)
Paper Number: 7-3
Publication Date: September 21, 2016
Subject: Music for VR/AR Projects