Head tracking combined with head movements has been shown to improve the auditory externalization of a virtual sound source and to contribute to localization performance. With certain technically constrained head-tracking algorithms, such as those found in wearable devices, artefacts can be encountered. Typical artefacts include an estimation mismatch or a tracking latency. The experiments reported in this article aim to evaluate the effect of such artefacts on the spatial perception of a non-individualized binaural synthesis algorithm. The first experiment focused on the auditory externalization of a frontal source while the listener performed a large head movement. The results showed that degraded head tracking combined with head movement yields a higher degree of externalization than head movements with no head tracking, suggesting that listeners could still take advantage of the spatial cues provided by the head movement. The second experiment consisted of a localization task in azimuth with the same simulated head-tracking artefacts. The results showed that a large latency (400 ms) did not affect the listeners' ability to locate virtual sound sources compared with reference head tracking; however, the estimation mismatch artefact reduced localization performance in azimuth.
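The two artefact types described in the abstract can be illustrated with a minimal sketch. The function names, update rate, and the way the artefacts are applied below are assumptions for illustration, not the paper's actual implementation: a tracking latency is modeled as a fixed delay on a stream of yaw-angle samples, and an estimation mismatch as a constant angular offset.

```python
# Hypothetical sketch of the two head-tracking artefacts discussed above,
# applied to a stream of yaw-angle samples. All names and parameters are
# illustrative assumptions, not the authors' implementation.

def apply_latency(yaw_samples, latency_ms, update_rate_hz):
    """Delay the tracked yaw by a fixed latency (e.g. 400 ms)."""
    delay = round(latency_ms / 1000 * update_rate_hz)  # delay in samples
    # Until a delayed sample is available, hold the first reading.
    return [yaw_samples[max(0, i - delay)] for i in range(len(yaw_samples))]

def apply_mismatch(yaw_samples, offset_deg):
    """Add a constant estimation error to the tracked yaw."""
    return [y + offset_deg for y in yaw_samples]

# Example: a listener turning from 0 to 90 degrees, tracked at 10 Hz.
yaw = [i * 10.0 for i in range(10)]      # 0, 10, ..., 90 degrees
delayed = apply_latency(yaw, 400, 10)    # 4-sample (400 ms) lag
biased = apply_mismatch(yaw, 15.0)       # constant 15-degree mismatch
```

In this sketch the delayed stream lags the true head orientation by four samples, so during a movement the rendered source direction trails the listener's rotation, while the mismatch stream stays offset from the true orientation even when the head is still.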
Authors:
Grimaldi, Vincent; Simon, Laurent S. R.; Courtois, Gilles; Lissek, Hervé
Affiliations:
LTS2 - Groupe Acoustique, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland; Sonova AG, Stäfa, Switzerland; Sonova AG, Stäfa, Switzerland; LTS2 - Groupe Acoustique, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
JAES Volume 71 Issue 10 pp. 650-663; October 2023
Publication Date:
October 10, 2023
This paper is Open Access and can be downloaded for free.