AES Conference Papers Forum

Multimodal Exploration of Virtual Objects with a Spatialized Anchor Sound

This paper presents a multimodal interactive system for audio-haptic integration. Preliminary subjective tests were conducted with a virtual reality setup, with the goal of interpreting the underlying cognitive mechanisms and improving performance in orientation and mobility protocols for visually impaired subjects, who must develop spatial representations through their residual sensory channels. An object recognition experiment investigated the contribution of dynamic spatial audio cues when integrated with haptic feedback. The audio cues took the form of an anchor sound delivered through headphones using customized head-related transfer functions (HRTFs), and this setup was employed in the exploration of simplified virtual audio-tactile environments. Overall results on recognition time reveal a relationship between anchor position and object shape, and a qualitative analysis of the exploration paths highlights behavioral changes between unimodal and multimodal conditions.
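For readers unfamiliar with HRTF-based rendering, the sketch below illustrates the general idea behind spatializing an anchor sound over headphones: a mono signal is convolved with the left- and right-ear head-related impulse responses (HRIRs) measured for the anchor's direction. This is a minimal illustration only, not the authors' implementation; the function name `spatialize_anchor` and the placeholder signals are assumptions for the example.

```python
# Minimal sketch (not the authors' system) of HRTF-based spatialization of an
# "anchor sound" for headphone playback. Assumes a mono anchor signal and a
# pre-measured HRIR pair for one direction, all at the same sample rate.
import numpy as np
from scipy.signal import fftconvolve

def spatialize_anchor(anchor_mono: np.ndarray,
                      hrir_left: np.ndarray,
                      hrir_right: np.ndarray) -> np.ndarray:
    """Convolve a mono anchor sound with the left/right HRIRs for one
    direction and return a stereo (N, 2) signal for headphone playback."""
    left = fftconvolve(anchor_mono, hrir_left)
    right = fftconvolve(anchor_mono, hrir_right)
    out = np.stack([left, right], axis=-1)
    # Normalize to avoid clipping on playback.
    peak = np.max(np.abs(out))
    return out / peak if peak > 0 else out

# Example usage with placeholder data. A real system would select (or
# interpolate) the HRIR pair matching the anchor's azimuth and elevation
# relative to the listener, updating it dynamically as the head moves.
fs = 44100
anchor = np.sin(2 * np.pi * 440 * np.arange(fs) / fs)   # 1 s, 440 Hz tone
hrir_l = np.random.randn(256) * np.hanning(256)          # placeholder HRIRs
hrir_r = np.random.randn(256) * np.hanning(256)
stereo = spatialize_anchor(anchor, hrir_l, hrir_r)
```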


