This paper introduces a workflow for recording paired motion capture and audio of musical performances using an infrared-light-based tracking system. Example production cases are discussed, ranging from short loops to full performances, along with the production challenges and compromises that set this process apart from a regular audio recording session. Motion capture is highly versatile and can be applied across disciplines including music pedagogy, distributed performance, machine learning, and the creation of creative assets. Paired audio-mocap data can be rendered into corresponding visual content for interactive virtual environments, live performances, or fixed media. The workflow is extensible and can be adapted to different purposes in musical performance, promoting further research and creative content development.
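To make concrete what "paired" audio-mocap data involves in practice, the sketch below (not taken from the paper; the sample rate, mocap frame rate, and array shapes are illustrative assumptions) aligns an audio recording with an optical motion-capture stream on a shared timeline so the two can be consumed together, for example when rendering visuals or building machine-learning datasets.

```python
# Minimal sketch: pairing an audio stream with a mocap stream on a shared timeline.
# All rates and shapes are assumptions for illustration, not values from the paper.
import numpy as np

AUDIO_SR = 48_000   # audio sample rate in Hz (assumed)
MOCAP_FPS = 120     # optical mocap frame rate in frames/s (assumed)

def mocap_frame_for_sample(sample_index: int) -> int:
    """Map an audio sample index to the nearest mocap frame index."""
    t = sample_index / AUDIO_SR          # elapsed time in seconds
    return int(round(t * MOCAP_FPS))

def pair_streams(audio: np.ndarray, mocap: np.ndarray, hop: int = 512):
    """Yield (audio_block, mocap_frame) pairs for consecutive hops.

    audio: 1-D array of samples; mocap: array of shape (frames, markers, 3).
    """
    for start in range(0, len(audio) - hop, hop):
        frame = mocap_frame_for_sample(start)
        if frame >= len(mocap):
            break
        yield audio[start:start + hop], mocap[frame]

if __name__ == "__main__":
    # Synthetic stand-ins for a 2-second take with 20 tracked markers.
    audio = np.random.randn(2 * AUDIO_SR).astype(np.float32)
    mocap = np.random.randn(2 * MOCAP_FPS, 20, 3).astype(np.float32)
    pairs = list(pair_streams(audio, mocap))
    print(f"{len(pairs)} paired audio blocks / mocap frames")
```

In a real capture session the two streams would come from separate devices, so some form of shared clock or synchronization marker (e.g. a clap or timecode) would be needed before this kind of index mapping is valid.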
Authors:
Bui, Cindy; Genovese, Andrea; Bradley, Trey; Roginska, Agnieszka
Affiliation:
New York University - Music and Audio Research Lab, NY, USA
AES Convention:
149 (October 2020)
eBrief: 635
Publication Date:
October 22, 2020
Subject:
Immersive Audio