AES Convention Papers Forum

Adaptive Synthesis of Immersive Audio Rendering Filters

One of the key limitations of spatial audio rendering over loudspeakers is the degradation that occurs as the listener's head moves away from the intended sweet spot. In this paper, we propose a method for designing immersive audio rendering filters by adaptive synthesis, updating the filter coefficients in real time. This approach can be combined with a head-tracking system to compensate for changes in the listener's head position. The rendering filter's weight vectors are synthesized in the frequency domain using magnitude and phase interpolation in frequency sub-bands.
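
The abstract gives no implementation details, so the following Python sketch illustrates only one plausible reading of the sub-band magnitude and phase interpolation step: two precomputed frequency-domain weight vectors (H_a, H_b) for neighboring listener positions are blended with a factor alpha supplied by a head tracker. All names, the band layout, and the interpolation scheme are illustrative assumptions, not taken from the paper.

import numpy as np

def synthesize_filter(H_a, H_b, alpha, band_edges):
    """Sub-band magnitude/phase interpolation (assumed reading of the abstract).

    H_a, H_b   -- complex half-spectrum weight vectors for two nearby,
                  precomputed listener positions (hypothetical inputs).
    alpha      -- interpolation factor in [0, 1] derived from the tracked
                  head position (0 -> position A, 1 -> position B).
    band_edges -- FFT-bin indices delimiting the frequency sub-bands.
    """
    H_out = np.zeros_like(H_a)
    for lo, hi in zip(band_edges[:-1], band_edges[1:]):
        # Interpolate magnitudes linearly within the sub-band.
        mag = (1.0 - alpha) * np.abs(H_a[lo:hi]) + alpha * np.abs(H_b[lo:hi])
        # Interpolate unwrapped phases to avoid 2*pi wrap-around artifacts.
        ph_a = np.unwrap(np.angle(H_a[lo:hi]))
        ph_b = np.unwrap(np.angle(H_b[lo:hi]))
        ph = (1.0 - alpha) * ph_a + alpha * ph_b
        H_out[lo:hi] = mag * np.exp(1j * ph)
    return H_out


# Example: the head tracker reports a position 30% of the way from A to B;
# blend two 257-bin half-spectra (512-tap filters) over four sub-bands.
rng = np.random.default_rng(0)
H_a = np.fft.rfft(rng.standard_normal(512))
H_b = np.fft.rfft(rng.standard_normal(512))
H = synthesize_filter(H_a, H_b, alpha=0.3, band_edges=[0, 32, 96, 192, 257])
h = np.fft.irfft(H, n=512)   # updated time-domain rendering filter taps

In a real-time renderer, a grid of such precomputed weight vectors could be interpolated for each loudspeaker channel whenever the tracker reports a new head position; the paper itself should be consulted for the actual filter-design and update procedure.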

Authors:
Affiliation:
AES Convention:
Paper Number:
Publication Date:
Subject:

No AES members have commented on this paper yet.
