
Keynote: Envisioning a Future Music and Audio Metaverse - Jean-Marc JOT

During the thirty years since the development of Ircam’s Spat began, professional and consumer audio technology has progressed along several parallel threads – including sensory immersion, electronic transmission, content formats and creation tools. In a not-too-distant future, the authoring, consumption or performance of a significant portion of our media and music experiences might leverage a global set of frameworks and ecosystems often referred to as the Metaverse. In time, more and more of our media experiences (currently categorized into separate content industries such as music, movies and podcasts) may be cloud-based, navigable, non-destructive, ubiquitous, interoperable and adaptable to listener conditions. In this talk, we attempt to distill elements of this vision and some of the challenges that it entails, including the adoption of a common spatial audio rendering description model, and “externalized” binaural audio reproduction for AR/VR sound.

Workshop: Binaural Externalization Processing - Jean-Marc JOT

In both entertainment and professional applications, conventionally produced stereo or multi-channel audio content is frequently delivered over headphones or earbuds. Use cases involving object-based binaural audio rendering include recently developed immersive multi-channel audio distribution formats, along with the accelerating deployment of virtual or augmented reality applications and head-mounted displays. The appreciation of these listening experiences by end users may be compromised by an unnatural perception of the localization of frontal audio objects, which are commonly heard near or inside the listener's head even when their specified position is distant. In this workshop, demonstration examples are presented to illustrate the differences between audio rendered with traditional stereo panning, binaural processing, and a recently proposed externalization processing method.
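To make the distinction between the first two rendering methods concrete, here is a minimal sketch contrasting constant-power stereo panning (level differences only) with binaural rendering by HRIR convolution (interaural time and level cues). The function names, sample rate, and the crude synthetic HRIRs (a simple interaural delay plus attenuation, not measured filters) are illustrative assumptions, not part of the demonstration described above, which uses a proprietary externalization method not reproduced here.

```python
import numpy as np

FS = 48000  # sample rate in Hz (assumed for this sketch)

def stereo_pan(mono, azimuth_deg):
    """Constant-power pan: azimuth -30..+30 deg mapped to L/R gains."""
    theta = (azimuth_deg / 30.0) * (np.pi / 4) + np.pi / 4  # 0..pi/2
    return np.stack([np.cos(theta) * mono, np.sin(theta) * mono])

def synthetic_hrir(azimuth_deg, n=64):
    """Toy HRIR pair: interaural time and level differences only."""
    itd = int(round(abs(np.sin(np.radians(azimuth_deg))) * 0.0007 * FS))
    near, far = np.zeros(n), np.zeros(n)
    near[0] = 1.0   # ipsilateral ear: direct
    far[itd] = 0.6  # contralateral ear: delayed and attenuated
    return (near, far) if azimuth_deg <= 0 else (far, near)  # (left, right)

def binaural_render(mono, azimuth_deg):
    """Binaural rendering: convolve the source with a left/right HRIR pair."""
    hl, hr = synthetic_hrir(azimuth_deg)
    return np.stack([np.convolve(mono, hl), np.convolve(mono, hr)])

sig = np.random.default_rng(0).standard_normal(FS // 10)  # 100 ms of noise
panned = stereo_pan(sig, -30.0)         # hard left: level difference only
binaural = binaural_render(sig, -30.0)  # left: adds an interaural delay cue
```

Panning carries only a level cue, which the auditory system tends to resolve inside the head over headphones; binaural processing adds the time and spectral cues needed for out-of-head localization, which externalization processing further reinforces.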