To achieve a perceptual fusion between the auralization of virtual audio objects and the room acoustics of a real listening room, an adequate adaptation of the virtual acoustics to the real room acoustics is necessary. The challenges are to describe the acoustics of different rooms by suitable parameters, to classify different rooms, and to evoke a similar auditory perception between acoustically similar rooms. An approach is presented to classify rooms based on measured binaural room impulse responses (BRIRs) using statistical methods and to select best-match BRIRs from the dataset to auralize audio objects in a new room. The results show that it is possible to separate rooms based on their room acoustic properties, that this separation also corresponds to a large extent to the perceptual distance between rooms, and that a selection of best-match BRIRs is possible.
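The sketch below illustrates the general idea of such a pipeline; it is not the authors' implementation. It assumes simple room acoustic features (reverberation time via Schroeder integration and direct-to-reverberant ratio), an assumed sampling rate, and k-means clustering as a stand-in for the statistical classification described in the paper.

```python
# Minimal sketch (assumptions, not the paper's method): derive simple acoustic
# features from BRIRs, cluster rooms, and pick the best-matching BRIR for a
# new target room by nearest neighbour in feature space.
import numpy as np
from sklearn.cluster import KMeans

FS = 48000  # assumed BRIR sampling rate in Hz


def schroeder_t30(ir, fs=FS):
    """Estimate reverberation time (T30) from the Schroeder decay curve."""
    energy = np.cumsum(ir[::-1] ** 2)[::-1]
    edc_db = 10 * np.log10(energy / energy.max() + 1e-12)
    t = np.arange(len(ir)) / fs
    # Linear fit between -5 dB and -35 dB, extrapolated to -60 dB.
    mask = (edc_db <= -5) & (edc_db >= -35)
    slope, _ = np.polyfit(t[mask], edc_db[mask], 1)
    return -60.0 / slope


def drr(ir, fs=FS, direct_ms=2.5):
    """Direct-to-reverberant energy ratio in dB."""
    onset = int(np.argmax(np.abs(ir)))
    split = onset + int(direct_ms * 1e-3 * fs)
    direct = np.sum(ir[onset:split] ** 2)
    reverb = np.sum(ir[split:] ** 2) + 1e-12
    return 10 * np.log10(direct / reverb)


def features(brir):
    """Feature vector for one BRIR, shape (samples, 2); left channel only."""
    left = brir[:, 0]
    return np.array([schroeder_t30(left), drr(left)])


def best_match(new_brir, dataset):
    """Index of the dataset BRIR closest to the new room in feature space."""
    f_new = features(new_brir)
    dists = [np.linalg.norm(features(b) - f_new) for b in dataset]
    return int(np.argmin(dists))


# Hypothetical usage: cluster a measured dataset into room classes, then
# select the best-match BRIR for a newly measured room.
# dataset = [np.load(p) for p in brir_paths]  # each entry: (samples, 2) array
# feats = np.vstack([features(b) for b in dataset])
# labels = KMeans(n_clusters=3, n_init="auto").fit_predict(feats)
# idx = best_match(new_room_brir, dataset)
```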
Authors:
Treybig, Lukas; Saini, Shivam; Werner, Stephan; Sloma, Ulrike; Peissig, Jürgen
Affiliations:
Technische Universität Ilmenau, Ilmenau, Germany; Huawei, European Research Center, Munich, Germany; Leibniz Universität Hannover, Hannover, Germany
AES Conference:
2022 AES International Conference on Audio for Virtual and Augmented Reality (August 2022)
Paper Number:
6
Publication Date:
August 15, 2022
Subject:
Paper