One of the requirements for a fully immersive gaming experience is the correct and visually coherent reproduction of the locations of sounding objects. The limitations of human spatial resolution in this regard have been known for decades and are commonly taken into account when designing audio reproduction systems. However, the mechanisms responsible for the perception of static and dynamic sound sources differ, and less attention has been devoted to the perception of dynamically relocated sound sources, which are omnipresent in video games. This paper explores the human ability to follow moving sound sources presented as First and Higher Order Ambisonic renderings over headphones, and aims to find optimal, psychoacoustically justified parameters that could significantly reduce the computational requirements of audio engines.
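To make the rendering approach concrete: the abstract refers to First Order Ambisonic rendering, in which a mono source is encoded into the four B-format channels (W, X, Y, Z) as a function of its direction. The sketch below illustrates the standard first-order encoding equations (FuMa channel ordering and weighting) applied to a source whose azimuth is interpolated per sample, a crude stand-in for a dynamically relocated source; it is an illustrative example, not the authors' actual rendering engine, and the function names are invented for this sketch.

```python
import math

def encode_fo_ambisonics(sample, azimuth, elevation):
    """Encode one mono sample into first-order B-format (FuMa W/X/Y/Z).

    Angles are in radians; azimuth 0 is straight ahead, positive to the left.
    """
    w = sample / math.sqrt(2.0)  # omnidirectional channel, FuMa -3 dB weight
    x = sample * math.cos(azimuth) * math.cos(elevation)  # front-back
    y = sample * math.sin(azimuth) * math.cos(elevation)  # left-right
    z = sample * math.sin(elevation)                      # up-down
    return (w, x, y, z)

def encode_moving_source(signal, az_start, az_end):
    """Encode a mono signal whose azimuth sweeps linearly from az_start to
    az_end over the block, approximating a dynamically relocated source.
    (Hypothetical helper for illustration only.)
    """
    n = len(signal)
    encoded = []
    for i, s in enumerate(signal):
        az = az_start + (az_end - az_start) * (i / max(n - 1, 1))
        encoded.append(encode_fo_ambisonics(s, az, 0.0))
    return encoded
```

Higher Order Ambisonics extends the same idea with additional spherical-harmonic channels, which raises spatial resolution at the cost of more channels to encode and decode per source, hence the paper's interest in how much order is perceptually necessary for moving sources.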
Authors:
Gorzel, Marcin; Kearney, Gavin; Rice, Henry; Boland, Frank
Affiliation:
Trinity College Dublin, Dublin, Ireland
AES Conference:
41st International Conference: Audio for Games (February 2011)
Paper Number:
P2-1
Publication Date:
February 2, 2011
Subject:
Audio for Games