We present a platform for real-time transmission of immersive audiovisual impressions using model- and data-based audio wave-field analysis/synthesis and panoramic video capturing/projection. The audio subsystem considered in this paper is based on microphone arrays with different element counts and directivities as well as weakly directional loudspeaker arrays. We report on both linear and circular setups that feed different wave-field synthesis systems. Extending this, we present first findings for a data-based approach derived from experimental simulations. This data-based wave-field analysis/synthesis (WFAS) approach combines cylindrical-harmonic decomposition of cardioid array signals with causal plane-wave synthesis, enforced by angular windowing and a directional delay term. Specifically, our contributions include (1) a high-resolution telepresence environment that is omnidirectional in both the auditory and visual modality, and (2) a study of data-based WFAS under realistic microphone directivities as a contribution toward real-time holophonic reproduction.
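As a rough illustration of the processing chain the abstract names, the sketch below decomposes a circular cardioid-array snapshot into cylindrical harmonics (an angular DFT over the array), applies an angular window so that only a sector of loudspeakers around the target plane-wave direction contributes, and computes a direction-dependent delay that keeps the synthesis causal. This is a minimal sketch under assumed parameters (array size, radius, window shape), not the authors' implementation.

```python
import numpy as np

M = 32                                   # microphones / loudspeakers (assumed)
phi = 2 * np.pi * np.arange(M) / M       # element angles on the circular array

# synthetic cardioid-array snapshot, for illustration only
rng = np.random.default_rng(0)
p = rng.standard_normal(M)

# (1) cylindrical-harmonic decomposition: angular DFT of the array signals
c = np.fft.fft(p) / M                    # harmonic coefficients c_m

# resynthesize the pressure on the loudspeaker contour from the harmonics
p_rec = np.real(np.fft.ifft(c * M))

# (2) angular window: raised-cosine sector around the target direction,
#     zero elsewhere, so only loudspeakers facing the wave contribute
theta0 = 0.0                             # plane-wave incidence angle (assumed)
d = np.angle(np.exp(1j * (phi - theta0)))      # wrapped angular distance
half = np.pi / 2                               # sector half-width (assumed)
w = np.where(np.abs(d) <= half,
             0.5 * (1 + np.cos(np.pi * d / half)), 0.0)

# (3) directional delay term enforcing causality: all delays are >= 0,
#     with the loudspeaker nearest the incidence direction firing first
c0, R = 343.0, 1.5                       # speed of sound [m/s], radius [m] (assumed)
delay = (R / c0) * (1.0 - np.cos(phi - theta0))  # seconds

drive = w * p_rec                        # windowed driving signals
```

In a real-time system the DFT would run per frequency bin on streaming array signals and the delays would be realized as fractional-delay filters; the snapshot form above only shows the structure of the three steps.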
Authors:
Heinrich, Gregor; Jung, Christoph; Hahn, Volker; Leitner, Michael
Affiliations:
Fraunhofer IGD; vsonix GmbH
AES Convention:
125 (October 2008)
Paper Number:
7608
Publication Date:
October 1, 2008
Subject:
Innovative Audio Applications