AES Journal Forum

Toward Emotionally-Congruent Dynamic Soundtrack Generation

Real-time control of the emotional content of sound has utility in video game soundtracking, where the player controls the narrative trajectory and the affective attributes of the sound should ideally match that trajectory. Perceived emotions can be represented in a two-dimensional space composed of valence (positivity vs. negativity, e.g. happy vs. sad or fearful) and arousal (intensity, e.g. mild vs. strong). This report is a speculative exploration of measuring and manipulating sound effects to achieve emotional congruence. An initial study suggests that timbral features can influence a listener's perceived emotional response. A panel of listeners responded to a set of stimuli that varied in timbre while pitch, loudness, and other musical and acoustic features (key, melodic contour, rhythm and meter, reverberant environment, etc.) were held constant. The long-term goal is an automated system that uses timbre morphing in real time to manipulate perceived affect during soundtrack generation.
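As a rough illustration of the two-dimensional valence-arousal representation and of driving a timbre morph from a target point in that space, the Python sketch below maps the distance between a current and a desired emotional state onto a single morph coefficient and cross-fades between two spectral envelopes. The names (EmotionTarget, morph_weight, morph_spectrum) and the simple linear cross-fade are illustrative assumptions, not the method described in the paper.

```python
from dataclasses import dataclass


@dataclass
class EmotionTarget:
    """A point in a two-dimensional valence-arousal space.

    valence: -1.0 (negative) to +1.0 (positive)
    arousal: -1.0 (mild)     to +1.0 (strong)
    """
    valence: float
    arousal: float


def morph_weight(current: EmotionTarget, goal: EmotionTarget) -> float:
    """Map the distance between the current and desired emotional state
    to a morph coefficient in [0, 1]: 0 keeps the source timbre,
    1 fully applies the target timbre preset."""
    d = ((goal.valence - current.valence) ** 2 +
         (goal.arousal - current.arousal) ** 2) ** 0.5
    max_d = 2 * (2 ** 0.5)  # diagonal of the [-1, 1] x [-1, 1] square
    return min(d / max_d, 1.0)


def morph_spectrum(source: list[float], target: list[float], w: float) -> list[float]:
    """Linear cross-fade between two spectral envelopes (e.g. per-band
    magnitudes), standing in for a real timbre-morphing algorithm."""
    return [(1.0 - w) * s + w * t for s, t in zip(source, target)]


if __name__ == "__main__":
    calm = EmotionTarget(valence=0.6, arousal=-0.4)
    tense = EmotionTarget(valence=-0.7, arousal=0.8)
    w = morph_weight(calm, tense)
    print(f"morph coefficient: {w:.2f}")
    print(morph_spectrum([1.0, 0.5, 0.2], [0.2, 0.8, 1.0], w))
```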

JAES Volume 64 Issue 9 pp. 654-663; September 2016

No AES members have commented on this paper yet.
