Computer-controlled sound systems have been among the most active research topics within the AES community. We are entering an age of interoperability, in which devices and the human operator work as a cohesive team. This paper examines human-machine interface issues pertaining to computer-controlled sound systems. Traditional human interfaces are analyzed and found to be inappropriate for computer-controlled equipment. New human interface technologies are presented, including spatial position tracking, eye tracking, tactile feedback, and head-mounted displays. This paper describes how these technologies work and their applications in sound system control and music performance.
Authors:
Rosenberg, Craig; Moses, Bob
Affiliations:
University of Washington, Seattle, WA; Rane Corporation, Mukilteo, WA
AES Convention:
95 (October 1993)
Paper Number:
3735
Publication Date:
October 1, 1993
Subject:
Multimedia