Spatialised auditory and visual cues were delivered via a wearable interface – Google Glass – and a bone conduction headset to aid a search task. The aim of the study was to determine which of the cues – auditory, visual, or a combination of the two – would lead a user to a target in the shortest time with minimal demand on the user's attention. The results demonstrate that the static visual cue performed best. The static auditory cue displayed a good level of usability and intuitiveness, especially when no visual cue was provided alongside it. Our findings demonstrate that there is significant value in providing auditory or visual cues to aid a search task without inhibiting environmental awareness.
Authors:
Barde, Amit; Ward, Matt; Lindeman, Robert; Billinghurst, Mark
Affiliations:
Auckland Bioengineering Institute, University of Auckland; University of Canterbury; HIT Lab NZ; University of South Australia
AES Conference:
2020 AES International Conference on Audio for Virtual and Augmented Reality (August 2020)
Paper Number:
2-5
Publication Date:
August 13, 2020