Recent developments have made consumer devices available that are capable of recognising certain human movements and gestures. This paper is a study of novel gesture-based audio interfaces. The authors present two prototypes for interacting with audio-visual experiences. The first allows a user to ‘conduct’ a recording of an orchestral performance, controlling its tempo and dynamics. The paper describes the audio and visual capture of the orchestra and the design and construction of the audio-visual playback system. An analysis of this prototype, based on testing and feedback from a number of users, is also provided. The second prototype uses the same gesture-tracking algorithm to control a three-dimensional audio panner. This panner is tested and feedback from a number of professional audio engineers is analysed.
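The paper itself does not specify the panning law used, but the kind of gesture-to-audio mapping the abstract describes (hand position driving pan position and dynamics) can be illustrated with a minimal sketch. The function names, the equal-power stereo law, and the normalised coordinate ranges below are assumptions for illustration, not the authors' implementation; their prototype drives a three-dimensional panner, of which this two-channel case shows only the underlying principle.

```python
import math

def pan_gains(x):
    """Equal-power stereo pan (illustrative stand-in for the paper's 3-D panner).

    x is a normalised hand position in [-1, 1], where -1 is full left.
    Returns (left_gain, right_gain); the squared gains always sum to 1,
    so perceived loudness stays constant as the source moves.
    """
    theta = (x + 1.0) * math.pi / 4.0  # map [-1, 1] onto [0, pi/2]
    return math.cos(theta), math.sin(theta)

def dynamics_gain(y):
    """Map a normalised hand height y to an overall playback gain in [0, 1]."""
    return max(0.0, min(1.0, y))
```

For example, a hand at centre (`x = 0`) yields equal left and right gains, while raising or lowering the hand scales the overall level, mirroring the 'conducting' control of dynamics described above.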
Authors:
Churnside, Anthony; Pike, Chris; Leonard, Max
Affiliation:
BBC R&D, Media City, Salford, UK
AES Convention:
131 (October 2011)
Paper Number:
8496
Publication Date:
October 19, 2011
Subject:
Applications in Audio