Despite recent advancements in AI for games, non-player characters (NPCs) still do not perceive the world in a realistic manner; in particular, their sense of hearing has been limited or ignored entirely. Building on our previous work, which developed GrAF, a graph-based spatial sound framework capable of modelling the propagation of sound through complex three-dimensional virtual environments in real time, here we apply this method to NPCs, providing them with the ability to “perceive” sounds in a more realistic manner and ultimately leading to more realistic NPC behavior.
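The abstract only sketches the idea at a high level, so the following is a minimal illustrative sketch, not the authors' GrAF implementation: sound propagation modelled as shortest-path attenuation over a graph of connected spaces, with an NPC "hearing" a sound when the attenuated level at its node exceeds a threshold. All node layouts, attenuation values, and thresholds below are assumptions for illustration.

// Illustrative only: graph-based propagation via Dijkstra over accumulated
// attenuation, followed by a simple NPC hearing check. Not the GrAF API.
#include <cstdio>
#include <limits>
#include <queue>
#include <utility>
#include <vector>

struct Edge { int to; float attenuationDb; };          // cost of crossing this link
using Graph = std::vector<std::vector<Edge>>;

// Returns the minimum total attenuation (dB) from the source node to every node.
std::vector<float> minAttenuation(const Graph& g, int source) {
    const float INF = std::numeric_limits<float>::infinity();
    std::vector<float> att(g.size(), INF);
    using Item = std::pair<float, int>;                 // (attenuation so far, node)
    std::priority_queue<Item, std::vector<Item>, std::greater<Item>> pq;
    att[source] = 0.0f;
    pq.push({0.0f, source});
    while (!pq.empty()) {
        auto [a, n] = pq.top(); pq.pop();
        if (a > att[n]) continue;                       // stale queue entry
        for (const Edge& e : g[n]) {
            float next = a + e.attenuationDb;
            if (next < att[e.to]) { att[e.to] = next; pq.push({next, e.to}); }
        }
    }
    return att;
}

int main() {
    // Assumed layout: source room (0) -> doorway (1) -> hallway (2) -> NPC's room (3).
    Graph g(4);
    g[0].push_back({1, 6.0f});  g[1].push_back({0, 6.0f});
    g[1].push_back({2, 3.0f});  g[2].push_back({1, 3.0f});
    g[2].push_back({3, 6.0f});  g[3].push_back({2, 6.0f});

    const float sourceLevelDb      = 70.0f;  // loudness of the emitted sound (assumed)
    const float hearingThresholdDb = 40.0f;  // NPC hearing threshold (assumed)

    std::vector<float> att = minAttenuation(g, /*source node*/ 0);
    float levelAtNpc = sourceLevelDb - att[3];
    std::printf("Level at NPC: %.1f dB -> %s\n", levelAtNpc,
                levelAtNpc >= hearingThresholdDb ? "heard" : "not heard");
    return 0;
}

In a game, the "heard" result would feed the NPC's behavior logic (e.g. investigating the sound's source), which is the kind of more realistic NPC response the paper targets.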
Authors:
Cowan, Brent; Kapralos, Bill; Collins, KC
Affiliations:
Ontario Tech University; University of Waterloo
AES Conference:
2020 AES International Conference on Audio for Virtual and Augmented Reality (August 2020)
Paper Number:
2-6
Publication Date:
August 13, 2020