Sound reproduction aims to accurately recreate the sound field of a previously captured audio scene in a defined, controlled receiving space. According to the current state of the art, no available system can perform this task perfectly. Unrestricted reproduction of a three-dimensional sound field would require a spatially continuous sound source; any real sound field reproduction system, consisting of discrete sound sources, therefore offers only an approximation of the original sound field. Each system can be objectively evaluated against various aspects of sound field reproduction accuracy, such as spectral and level matching, quality and distortions, timing and directivity cues, "sweet spot" size, the influence of obstacles in the receiving space (such as equipment or people), or perceptual objective metrics. The goal of this work is to compile these evaluation methods and map them to the key aspects of sound field reproduction listed above. The authors contextualize the overview from an acoustic testing perspective, where specific aspects of sound reproduction accuracy matter when evaluating different audio-related features of a device (e.g., directivity cues are critical when testing the quality of Direction of Arrival estimation, whereas spectral and level accuracy is more relevant when testing Acoustic Scene Classification), but the evaluation methods and findings can be translated to other areas of audio research.
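The e-brief itself does not include code; as an illustrative sketch of what the "level matching" aspect of reproduction accuracy can mean in practice, the following minimal (hypothetical) broadband level-error metric compares the RMS level of a reproduced signal against its reference:

```python
import math

def rms_db(signal):
    # RMS level in dB relative to full scale
    rms = math.sqrt(sum(s * s for s in signal) / len(signal))
    return 20.0 * math.log10(rms)

def level_error_db(reference, reproduced):
    # Broadband level mismatch between reference and reproduced signals;
    # 0 dB would indicate perfect level matching.
    return rms_db(reproduced) - rms_db(reference)

# Example: a reproduced copy attenuated by half should show about -6.02 dB error
ref = [math.sin(2 * math.pi * 440 * n / 8000) for n in range(8000)]
rep = [0.5 * s for s in ref]
print(round(level_error_db(ref, rep), 2))  # → -6.02
```

A full spectral-matching evaluation would compare levels per frequency band (e.g., in octave or third-octave bands) rather than broadband, but the principle is the same.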
Banas, Jan; Grzywa, Michal; Jezierski, Ryszard; Klinke, Piotr; Koszewski, Damian; Kuklinowski, Maciej; Maziewski, Przemyslaw; Pach, Pawel; Stanczak, Dominik; Trella, Pawel
Affiliation: Intel Technology Poland, Gdansk, Poland
AES Convention: 152 (May 2022) eBrief: 671
Publication Date: May 2, 2022
Subject: Spatial Audio