Object-based audio offers great potential for the production and delivery of adaptive, personalizable content, which can be used to improve the accessibility of complex content for listeners with hearing impairments. An adaptive object-based audio system was used to make mix changes, enabling listeners to balance narrative comprehension against immersion using a single dial. Performance was evaluated by focus groups of 12 hearing-impaired participants, who gave primarily positive feedback. An experienced sound designer also evaluated the function of the control and the process for authoring the necessary metadata, establishing that the control facilitated a clearer narrative while maintaining mix quality. In future work the algorithm, production tools, and interface will be refined based on the feedback received.
Ward, Lauren; Shirley, Ben; Francombe, Jon
Affiliations: University of Salford, Salford, UK; BBC Research and Development, Salford, UK
AES Convention: 145 (October 2018) eBrief: 478
Publication Date: October 7, 2018
Subject: Spatial Audio