To evoke a place illusion, virtual reality builds upon the integration of coherent sensory information from multiple modalities. This integrative view of perception may be contradicted when the quality evaluation of virtual reality is divided into multiple uni-modal tests. We show that the type and cross-modal consistency of visual content affect overall audio quality in a six-degrees-of-freedom virtual environment with expert and naïve participants. The effect is observed both in the participants' movement patterns and in the direct quality scores they gave to three real-time binaural audio rendering technologies. Our experiments show that the visual content has a statistically significant effect on the perceived audio quality.
Authors:
Rummukainen, Olli; Wang, Jing; Li, Zhitong; Robotham, Thomas; Yan, Zhaoyu; Li, Zhuoran; Xie, Xiang; Nagel, Frederik; Habets, Emanuël A. P.
Affiliations:
International Audio Laboratories Erlangen, Erlangen, Germany; Beijing Institute of Technology, Beijing, China
AES Convention:
145 (October 2018)
Paper Number:
10128
Publication Date:
October 7, 2018
Subject:
Spatial Audio-Part 2 (Evaluation)