This paper describes a method for creating multi-user shared augmented reality audio spaces. Using a system of infrared cameras and motion capture software, it is possible to provide accurate, low-latency head tracking for many users simultaneously and to stream binaural audio representing a realistic, shared virtual environment to each user. Participants can thus occupy and navigate a shared virtual aural space without head-mounted displays, using only headphones (with passive markers affixed) connected to lightweight in-ear monitor beltpacks. Potential applications include installation work, classroom use, and museum audio tours.
Author: Costagliola, Michael
Affiliation: Yale University, New Haven, CT, USA
AES Conference: 2018 AES International Conference on Audio for Virtual and Augmented Reality (August 2018)
Paper Number: P1-1
Publication Date: August 11, 2018
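The core of a head-tracked binaural pipeline like the one the abstract describes is converting each user's tracked head pose into a source direction in that user's head frame, then deriving spatial cues from it. The sketch below is illustrative only, not the paper's implementation: it assumes 2D floor coordinates and a yaw angle from the motion-capture system, and uses the classical Woodworth spherical-head model for interaural time difference; the function names and the `HEAD_RADIUS` constant are my own assumptions.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, air at ~20 C
HEAD_RADIUS = 0.0875     # m, average head radius assumed by the Woodworth model

def relative_azimuth(listener_pos, listener_yaw, source_pos):
    """Azimuth of a virtual source in the listener's head frame (radians).

    listener_pos / source_pos are (x, y) floor coordinates as a motion-capture
    system might report them; listener_yaw is head yaw in radians, with
    0 rad meaning the listener faces the +x axis.
    """
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    world_azimuth = math.atan2(dy, dx)
    # Rotate into the head frame so 0 rad is straight ahead,
    # then wrap the result back into (-pi, pi].
    az = world_azimuth - listener_yaw
    return math.atan2(math.sin(az), math.cos(az))

def woodworth_itd(azimuth):
    """Interaural time difference (seconds) for a rigid spherical head."""
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (azimuth + math.sin(azimuth))
```

In a real renderer the azimuth (and elevation, with full 6-DOF tracking) would index into an HRTF set rather than a simple ITD model, and this computation would run per user per audio block so each beltpack receives its own binaural stream.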