Sonification of objects is an important research field that allows graphical forms to be presented to people who have low vision or who are blind. This paper presents preliminary results of a study of an approach to the problem of objectively recording a subjective sound image. Sonification of the same graphic objects was based on an amplitude panning method using three different sound mappings. Visualization of auditory localization strategies was carried out with the SMI EyeLink Gaze Tracking system. The evaluation showed that audio stimuli cause less irritation than cognitive control of eye behavior in the absence of a visual stimulus. Therefore, we suggest that eye tracking cannot be used directly as a criterion of sound-mapping efficiency.
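To illustrate the kind of technique the abstract refers to, the sketch below shows a generic constant-power amplitude panning law that maps a horizontal position to left/right channel gains. This is a standard textbook pan law for illustration only; the paper's actual three sound mappings are not specified here, and the function name and parameterization are assumptions.

```python
import math

def pan_gains(x):
    """Constant-power amplitude panning (illustrative, not the
    paper's exact mapping): map a horizontal position x in
    [-1.0, 1.0] (far left .. far right) to (left, right) gains
    such that left**2 + right**2 == 1 at every position."""
    theta = (x + 1.0) * math.pi / 4.0  # sweep from 0 to pi/2
    return math.cos(theta), math.sin(theta)

# A sound panned to the center gets equal gain in both channels,
# and the combined power is constant across all positions.
left, right = pan_gains(0.0)
```

Because the perceived loudness of two decorrelated channels sums roughly by power, the constant-power constraint keeps an object's apparent intensity stable as it moves across the stereo field, which is why this family of pan laws is common in spatial sonification.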
Authors:
Evreinov, Grigori Evgenievich; Raisamo, Roope Samuli
Affiliation:
TAUCHI Computer-Human Interaction Unit, Dept. of Computer and Information Sciences, University of Tampere, Finland
AES Conference:
22nd International Conference: Virtual, Synthetic, and Entertainment Audio (June 2002)
Paper Number:
000206
Publication Date:
June 1, 2002
Subject:
Virtual, Synthetic and Entertainment Audio