The concept of augmented reality audio characterizes techniques in which a real sound environment is extended with virtual auditory environments and communications scenarios. A framework is introduced for mobile augmented reality audio (MARA), based on a specific headset configuration in which binaural microphone elements are integrated into stereo earphones. When the microphone signals are routed directly to the earphones, the user is exposed to a pseudoacoustic representation of the real environment. Virtual sound events are then mixed with the microphone signals to produce a hybrid, an augmented reality audio representation, for the user. An overview of related technology, literature, and application scenarios is provided. Listening test results with a prototype system show that the proposed system has interesting properties; for example, in some cases listeners found it very difficult to determine which sound sources in an augmented reality audio representation were real and which were virtual.
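The signal flow described in the abstract (a pseudoacoustic microphone feed combined with binaurally rendered virtual sound events) can be illustrated with a minimal sketch. The Python/NumPy code below is not taken from the paper; the function name mix_ara, the gain parameters, and the placeholder signals are illustrative assumptions for a simple per-ear mix.

import numpy as np

def mix_ara(mic_left, mic_right, virtual_left, virtual_right,
            mic_gain=1.0, virtual_gain=0.5):
    # Combine the pseudoacoustic (binaural microphone) feed with a
    # binaurally rendered virtual sound event, per ear.
    # mix_ara and the gain values are illustrative assumptions, not the
    # paper's implementation.
    n = min(len(mic_left), len(virtual_left))
    out_left = mic_gain * mic_left[:n] + virtual_gain * virtual_left[:n]
    out_right = mic_gain * mic_right[:n] + virtual_gain * virtual_right[:n]
    # Simple clipping guard; a real system would use a limiter instead.
    return np.clip(out_left, -1.0, 1.0), np.clip(out_right, -1.0, 1.0)

# Usage example: a 1 kHz virtual tone layered onto silence standing in
# for the real binaural microphone input.
fs = 44100
t = np.arange(fs) / fs
mic_l = mic_r = np.zeros_like(t)            # placeholder microphone feed
tone = 0.2 * np.sin(2 * np.pi * 1000 * t)   # placeholder virtual sound event
out_l, out_r = mix_ara(mic_l, mic_r, tone, tone)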
Authors:
Härmä, Aki; Jakka, Julia; Tikander, Miikka; Karjalainen, Matti; Lokki, Tapio; Hiipakka, Jarmo; Lorho, Gaëtan
Affiliations:
Helsinki University of Technology, Finland; Nokia Research Center, NOKIA GROUP, Finland
JAES Volume 52, Issue 6, pp. 618-639; June 2004
Publication Date:
June 15, 2004