Since the early days of electronic recording, we have used artificial reverberation to place dry studio sounds into a ‘room’: sometimes a real room, sometimes richly resonant metal plates, or even complex networks of delayed sounds. These effects have since become an important part of creative workflows, well beyond the room acoustics they were originally intended to recreate. With the rise of immersive technologies such as augmented reality, the techniques used to reproduce an acoustic space are quickly evolving to adapt to these new realities and transport us inside a space. How can we recreate the perception of an existing space, in six degrees of freedom, in real time? In this presentation, we will discuss key characteristics of the new approaches being developed and, specifically, why and how they can be used. We will also present a working prototype of a new reverberation tool being developed in the Acoustic and Cognitive Spaces team at IRCAM.
Benoit Alary is a researcher in the Acoustic and Cognitive Spaces team at IRCAM. Originally from Canada, he recently obtained a PhD in acoustics and signal processing from Aalto University in Finland, and an MSc from the University of Edinburgh. Before his PhD, he worked for more than ten years in the video game industry, where he developed sound technologies for companies such as Ubisoft, Electronic Arts, and Audiokinetic.