➡️ This presentation is part of the IRCAM Forum Workshops Paris / Enghien-les-Bains, March 2026
Somax for Live
As part of the REACH project in the Music Representation team at IRCAM, Somax for Live brings the real-time interactive capabilities of Somax2 directly into Ableton Live.
Developed by Manuel Poletti in collaboration with Marco Fiorini and Gérard Assayag, this new integration bridges advanced symbolic AI improvisation with a widely used digital audio workstation, opening new creative workflows for composers, performers, and producers.
Implemented as a collection of Max for Live devices, Somax for Live allows users to interactively co-create with the system within Live’s native environment, combining the temporal and stylistic modeling of Somax2 with the flexibility of Live’s clips, automations, and control interfaces. This tight coupling between musical intelligence and production tools encourages a fluid dialogue between human and machine musicianship, enabling adaptive accompaniment, generative composition, and exploratory performance practices within an accessible and modular setup.
This presentation will showcase the architecture, interaction paradigms, and artistic use cases of Somax for Live, illustrating how the REACH project advances hybrid human–AI co-creativity in contemporary music-making.

Prosax
Prosax_001.maxpat is a research patch, a proof of concept intended to explore prosodic profiles on segmented audio in order to generate event labels for OMax and Somax2. It is not a finished tool but a work in progress, mainly intended for speech segmentation, though it can also be applied to other audio material with varying degrees of success. Just explore.
The main ideas in this work emerged from artistic research carried out with Valérie Philippin (to whom we are indebted for her musical and literary expertise) from October 2023 to June 2024, within the REACH project, on the use of Somax2 in a spoken and sung voice context.
This patch is an adaptation of the pipo.sylseg help patch (from the MuBu for Max package) and is based on research by Nicolas Obin, François Lamare, and Axel Roebel [Obin, Lamare, Roebel 2013]. It requires a prior installation of the latest MuBu for Max package, developed by the ISMM team at IRCAM (https://ismm.ircam.fr/mubu/).
https://github.com/DYCI2/prosax

Somax2Collider
Somax2Collider is a SuperCollider-based front-end designed to control the Somax2 server, enabling the real-time creation of musical agents within a dynamic ecosystem where multiple agents can perform simultaneously. The project provides a flexible framework for spatialized multi-agent performance, live coding, and experimental co-improvisation practices, opening new perspectives for interacting with Somax-style musical agents.
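To give a concrete flavour of this workflow, here is a minimal SuperCollider sketch of the kind of interaction such a front-end enables, assuming the Somax2 server is reachable over OSC on the local machine. The port number and OSC address patterns are illustrative assumptions, not the actual Somax2 address space; see the Somax2Collider repository for the real interface.

    (
    // Minimal sketch (hypothetical addresses): driving a Somax2 server over OSC.
    ~somax = NetAddr("127.0.0.1", 1234);  // assumed server host and port

    // Create an agent and load a corpus (message names are assumptions).
    ~somax.sendMsg("/server/create_agent", "agent1");
    ~somax.sendMsg("/agent1/load_corpus", "myCorpus");

    // Forward incoming pitch analysis to the agent as an influence stream.
    OSCdef(\influence, { |msg|
        ~somax.sendMsg("/agent1/influence", msg[1]);
    }, '/analysis/pitch');
    )

From a few lines like these, agents can be scripted, scheduled, and combined with SuperCollider's synthesis and spatialization tools, which is what opens the framework to the live-coding and spatialized performance practices mentioned above.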
In this presentation, I will demonstrate the use of the latest version of Somax2Collider in an ambisonic environment, as well as within a system of autonomous networked loudspeakers. This system has already been used in several mixed-music compositions and improvised performances.