We present an acoustic navigation experiment in virtual reality (VR), in which participants were asked to locate and navigate towards an acoustic source within an environment of complex geometry using only acoustic cues. We implemented a procedural generator of complex scenes, capable of creating environments with arbitrary dimensions, multiple rooms, and custom frequency-dependent acoustic properties of the surface materials. To generate the audio we used a real-time dynamic sound propagation engine that produces spatialized audio with reverberation by means of bi-directional path tracing (BDPT) and is capable of modeling acoustic absorption, transmission, scattering, and diffraction. This framework enables the investigation of how various simulation properties affect the ability to navigate a virtual environment. To validate the framework we conducted a pilot experiment with 10 subjects in 30 environments and studied the influence of diffraction modeling by comparing navigation performance in conditions with and without diffraction. The results suggest that listeners can successfully navigate VR environments using only acoustic cues. In the studied cases we did not observe a significant effect of diffraction on navigation performance. A considerable number of participants reported strong motion sickness, which highlights the ongoing issues of locomotion in VR.
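To make the notion of frequency-dependent surface materials concrete, the sketch below shows how per-band energy attenuation might be accumulated along one traced acoustic path. This is an illustrative simplification, not the paper's BDPT engine: the material names, band layout, and absorption coefficients are all hypothetical, and only specular reflection loss plus spherical spreading are modeled.

```python
# Illustrative sketch only: per-band energy gain and delay for one traced
# acoustic path with frequency-dependent surface absorption. All names
# and coefficient values are hypothetical, not taken from the paper.

SPEED_OF_SOUND = 343.0  # m/s at ~20 degrees C

# Hypothetical octave-band absorption coefficients (bands: 250 Hz .. 2 kHz).
MATERIALS = {
    "concrete": [0.01, 0.01, 0.02, 0.02],
    "curtain":  [0.15, 0.35, 0.55, 0.70],
}

def path_gain_and_delay(path_length_m, surface_hits):
    """Return (per-band energy gains, propagation delay in seconds)
    for a single path of given length reflecting off the listed materials."""
    n_bands = len(next(iter(MATERIALS.values())))
    gains = [1.0] * n_bands
    for material in surface_hits:
        alpha = MATERIALS[material]
        for b in range(n_bands):
            gains[b] *= (1.0 - alpha[b])  # fraction of energy kept per bounce
    # Inverse-square spreading loss for a point source (energy domain).
    spreading = 1.0 / max(path_length_m, 1e-6) ** 2
    gains = [g * spreading for g in gains]
    delay_s = path_length_m / SPEED_OF_SOUND  # arrival time of this path
    return gains, delay_s
```

Summing many such paths (direct, reflected, diffracted) into a per-band impulse response is what lets an engine of this kind render reverberation that changes dynamically with listener position.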
Amengual Garí, Sebastià V; Calamia, Paul; Robinson, Philip
Affiliations: Reality Labs Research, Meta, Redmond, USA
AES Convention: 154 (May 2023) Paper Number: 10638
Publication Date: May 13, 2023