➡️ This presentation is part of IRCAM Forum Workshops Paris / Enghien-les-Bains March 2026
When improvising with electronic instruments and MIDI controllers, the mapping between gesture and sound can feel rigid, lacking the subtle nuances that acoustic instruments afford naturally.
AI-based sound re-synthesis introduces an element of unpredictability that increases variability in the performance environment. This dynamic quality can foster a more expressive and responsive playing experience.
The presentation will cover the implementation of RAVE in Max, followed by an improvised performance integrating this technology. Techniques for exploring and exploiting the system in real time will be demonstrated.
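As a rough sketch of how such a setup is commonly wired, RAVE models exported as TorchScript files can be loaded in Max via the nn~ external from IRCAM's nn_tilde project. The patch below is illustrative only: the model name `rave_model` and the one-dimensional latent manipulation are assumptions for the example, not the presenter's actual patch.

```
[adc~ 1]                        live audio input
    |
[nn~ rave_model encode]         encode audio into RAVE's latent space
    |                           (one signal outlet per latent dimension)
[+~ 0.5]                        e.g. offset a latent dimension to steer the timbre
    |
[nn~ rave_model decode]         resynthesize audio from the modified latents
    |
[dac~ 1 2]                      stereo output
```

In performance, the interesting territory is typically between the encode and decode stages: offsetting, scaling, or modulating the latent signals from a controller is one way to recover the kind of continuous, nuanced gestural control the abstract contrasts with rigid MIDI mappings.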