AES Conference Papers Forum

Using texture maps to procedurally generate sound in virtual environments

Audiovisual occurrences in virtual environments are governed by data streams that are often shared but processed separately by the graphics and audio engines. In a common video game scenario, virtual physics interactions among objects in the scene project their visual effect through animated graphics rendering. Independently of this process, the same interactions trigger and control the corresponding sonic output. In the natural world, however, this group of events is a unified causal phenomenon. In an attempt to model audiovisual phenomena within virtual worlds more thoroughly, the use of texture maps for sound-effect generation is investigated. Wavetable synthesis is employed for this purpose, as it features certain characteristics that facilitate intuitive image-to-sound translation. This approach aims to take advantage of the cross-modal affordances of sonification, the realism of physically inspired sound synthesis, and the dynamism of generative audio.
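
The abstract does not spell out the mapping itself, but one common interpretation of texture-driven wavetable synthesis is to read a row of grayscale pixel intensities as a single-cycle wavetable and play it back with a phase accumulator. The sketch below illustrates that idea in Python/NumPy; the function names, the row-scanning scheme, and all parameter values are illustrative assumptions, not the paper's actual method.

    import numpy as np

    def row_to_wavetable(texture, row, table_size=2048):
        # Map pixel intensities (0..255) to the audio range [-1.0, 1.0].
        line = texture[row].astype(np.float64) / 255.0 * 2.0 - 1.0
        # Resample the pixel row to a fixed wavetable length.
        positions = np.linspace(0.0, len(line) - 1, table_size)
        return np.interp(positions, np.arange(len(line)), line)

    def render_tone(wavetable, freq=220.0, sr=44100, duration=1.0):
        # Classic wavetable playback: a phase accumulator indexes into the table.
        n = int(sr * duration)
        phase = (np.arange(n) * freq / sr) % 1.0   # normalized phase in [0, 1)
        index = phase * (len(wavetable) - 1)
        return np.interp(index, np.arange(len(wavetable)), wavetable)

    # Illustrative only: a random 64x64 array stands in for a real texture map.
    texture = (np.random.rand(64, 64) * 255).astype(np.uint8)
    signal = render_tone(row_to_wavetable(texture, row=32))

In a game-engine setting, the same texture that the renderer samples for a surface could be scanned this way when a physics interaction touches that surface, so the visual and sonic responses derive from one shared data source.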


