Recordings of historical live music performances often exist in several versions, recorded from the mixing desk, on stage, or by audience members. These recordings highlight different aspects of the performance, but they also typically vary in recording quality, playback speed, and segmentation. We present a system that automatically aligns and clusters live music recordings based on various audio characteristics and editorial metadata. The system creates an immersive virtual space that can be imported into a multichannel web or mobile application, allowing listeners to navigate the space using interface controls or mobile device sensors. We evaluate our system with recordings of different lineages from the Live Music Archive's Grateful Dead collection.
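The abstract does not specify the alignment method. As a rough sketch of the kind of processing involved, one common approach pairs chroma features with dynamic time warping (DTW), shown here with librosa; the function name, parameters, and file paths are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch: aligning two recordings of the same performance
# (e.g., a soundboard tape and an audience tape) via chroma + DTW.
import numpy as np
import librosa

def align_recordings(path_a, path_b, sr=22050, hop_length=512):
    """Estimate a time-warping path between two recordings of the
    same performance. Returns pairs of corresponding timestamps."""
    y_a, _ = librosa.load(path_a, sr=sr)
    y_b, _ = librosa.load(path_b, sr=sr)

    # Chroma features are comparatively robust to the differences in
    # recording quality and timbre found across lineages.
    chroma_a = librosa.feature.chroma_cqt(y=y_a, sr=sr, hop_length=hop_length)
    chroma_b = librosa.feature.chroma_cqt(y=y_b, sr=sr, hop_length=hop_length)

    # DTW yields a warping path mapping frames of A onto frames of B;
    # its slope also exposes global playback-speed differences.
    cost, wp = librosa.sequence.dtw(X=chroma_a, Y=chroma_b, metric='cosine')

    # Convert the frame-level path to timestamps in seconds.
    # librosa returns the path from end to start, so reverse it.
    wp_times = librosa.frames_to_time(np.asarray(wp), sr=sr, hop_length=hop_length)
    return wp_times[::-1]

The resulting time pairs could then feed pairwise distance measures for clustering recordings and for synchronized playback when listeners move between sources in the virtual space.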
Authors: Wilmering, Thomas; Thalmann, Florian; Sandler, Mark B.
Affiliation: Queen Mary University of London, London, UK
AES Convention: 141 (September 2016)
Paper Number: 9614
Publication Date: September 20, 2016
Subject: Spatial Audio: Production