Visual user interfaces on smartphones and other personal media devices (PMDs) reduce situational awareness, for example in city traffic. This paper proposes that many menu-navigation functions in PMDs can be replaced by an eyes-free auditory interface combined with an input device based on acoustic recognition of tactile gestures. Using a novel experimental setup, we demonstrate that the proposed auditory interface shortens reaction times to external events compared with a visual UI. Although task completion times in menu navigation are somewhat longer with the auditory interface, subjects were able to complete the given interaction tasks correctly within a reasonable time.
Authors:
Svedström, Thomas; Härmä, Aki
Affiliations:
Aalto University, Espoo, Finland; Philips Research, Eindhoven, The Netherlands
AES Convention:
136 (April 2014)
Paper Number:
9090
Publication Date:
April 25, 2014
Subject:
Applications in Audio/Education/Forensics