Research interest in modeling everyday human emotions and controlling them through typical multimedia content (i.e., audio and video data) has recently increased. In this work, an interactive methodology is introduced for detecting, controlling, and tracking emotions. Based on this methodology, an interactive audiovisual installation termed “Elevator” was realized, aiming to analyze and manipulate simple emotions of the participants (such as anger) using simplified audio signal processing techniques for emotion detection together with specifically selected, combined audio/visual content. As a result, the participants' emotions are “elevated” to pre-defined levels and mapped to visual content that corresponds to their emotional “thumbnail.”
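The paper does not specify which audio features or mapping rules the installation uses, so the following is only a minimal illustrative sketch of the general idea described in the abstract: estimating a crude arousal score from simple signal statistics, quantizing it into pre-defined "levels," and selecting matching visual content. All feature weights, level counts, and clip names below are hypothetical assumptions, not the authors' method.

```python
import numpy as np

# Hypothetical sketch only: illustrates "simplified emotion detection"
# from audio plus mapping to pre-defined levels; not the paper's pipeline.

N_LEVELS = 4          # assumed number of pre-defined emotion "levels"
SAMPLE_RATE = 44100   # assumed microphone sample rate


def arousal_score(frame: np.ndarray) -> float:
    """Crude arousal estimate for one audio frame.

    Combines RMS energy (loudness) with zero-crossing rate (a rough
    proxy for spectral brightness); both tend to rise with agitation.
    """
    rms = np.sqrt(np.mean(frame ** 2))
    zcr = np.mean(np.abs(np.diff(np.signbit(frame).astype(int))))
    # Weighted sum clipped to [0, 1]; the weights are arbitrary here.
    return float(np.clip(0.7 * rms * 10.0 + 0.3 * zcr * 5.0, 0.0, 1.0))


def emotion_level(score: float, n_levels: int = N_LEVELS) -> int:
    """Quantize a continuous arousal score into a discrete level."""
    return min(int(score * n_levels), n_levels - 1)


def select_visual(level: int) -> str:
    """Map a discrete emotion level to a (hypothetical) visual clip."""
    clips = ["calm.mp4", "neutral.mp4", "tense.mp4", "anger.mp4"]
    return clips[level]


if __name__ == "__main__":
    # Simulated one-second frame of participant audio (white noise stand-in).
    frame = np.random.uniform(-0.3, 0.3, SAMPLE_RATE)
    score = arousal_score(frame)
    level = emotion_level(score)
    print(f"arousal={score:.2f} level={level} visual={select_visual(level)}")
```

In an actual installation, such a loop would run continuously on live microphone input, with the selected level driving which audiovisual content is played back to "elevate" the participant's state.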
Authors:
Psarras, Basileios; Floros, Andreas; Strapatsakis, Marianna
Affiliation:
Dept. of Audiovisual Arts, Ionian University, Corfu, Greece
AES Convention:
126 (May 2009)
Paper Number:
7657
Publication Date:
May 1, 2009
Subject:
Audio for Games and Interactive Media