AES Engineering Briefs Forum

Quantifying Localization Potential using Interaural Transfer Function


The ability to generate appropriate auditory localization cues is an important requisite of spatial audio rendering technology that contributes to the plausibility of virtual sounds presented to a user, especially in XR applications (VR/AR/MR). Algorithmic approaches have been proposed to quantify such technologies’ ability to reproduce interaural level difference (ILD) cues through regression and statistical methods, providing a useful standardization and automation method to estimate the localization accuracy potential of a given spatial audio rendering engine. Previous approaches are extended to include interaural time difference (ITD) cues as part of the perceptual transform through the use of the interaural transfer function (ITF). The extended algorithmic approach of quantifying localization accuracy may provide an adequate substitute for critical listening studies as an evaluation method. However, this approach has not yet been validated through comparison with localization listening studies. Listening tests are reviewed in conclusion to increase confidence in the presented methods of algorithmically quantifying the localization accuracy potential of a spatial audio rendering engine.
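The cues named in the abstract can be illustrated with a minimal sketch. Assuming the ITF is defined as the ratio of the left- and right-ear transfer functions, the broadband ILD is taken from the interaural energy ratio, and the ITD from the lag of the interaural cross-correlation peak; the function names, toy impulse responses, and parameter choices below are hypothetical and not taken from the paper.

```python
import numpy as np

def itf(hrir_left, hrir_right, n_fft=512):
    # Interaural transfer function: ratio of the left- to right-ear
    # transfer functions in the frequency domain (assumed definition).
    H_l = np.fft.rfft(hrir_left, n_fft)
    H_r = np.fft.rfft(hrir_right, n_fft)
    return H_l / (H_r + 1e-12)  # small offset avoids division by zero

def ild_db(hrir_left, hrir_right):
    # Broadband ILD in dB from the left/right ear energy ratio.
    return 10.0 * np.log10(np.sum(hrir_left**2) / np.sum(hrir_right**2))

def itd_seconds(hrir_left, hrir_right, fs=48000):
    # ITD as the lag of the interaural cross-correlation maximum;
    # a negative lag means the sound reaches the left ear first.
    xcorr = np.correlate(hrir_left, hrir_right, mode="full")
    lag = np.argmax(np.abs(xcorr)) - (len(hrir_right) - 1)
    return lag / fs

# Toy impulse responses: the left ear leads by 10 samples and
# carries four times the energy (about 6 dB).
fs = 48000
left = np.zeros(64)
left[5] = 1.0
right = np.zeros(64)
right[15] = 0.5

print(round(ild_db(left, right), 2))         # ~6.02 dB
print(round(itd_seconds(left, right) * fs))  # -10 samples (left leads)
```

A rendering engine's localization potential could then, in principle, be scored by comparing ILD/ITD estimates of rendered binaural output against reference HRTF measurements across source directions.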



