This presentation is part of the IRCAM Forum Workshops Paris / Enghien-les-Bains, March 2026.
Cognitive Feedback investigates improvisation as a dynamic interplay between neural activity, pre-composed spectral intelligence, and live sound.

The performer’s BrainLink Pro headset captures eight EEG frequency bands—Delta, Theta, Low/High Alpha, Low/High Beta, Low/High Gamma—which are processed in Python to extract cognitive descriptors reflecting attention, relaxation, and oscillatory dynamics. These parameters modulate synthesis, spatialisation, and algorithmic transformations in Max/MSP, creating a live sonic environment that responds in real time to the performer’s mental state.
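As a minimal sketch of this Python stage (not the piece's actual code), the eight band powers can be reduced to cognitive descriptors and forwarded to Max/MSP over OSC. The descriptor formulas, the OSC addresses, and the port are illustrative assumptions; acquisition from the BrainLink Pro itself is stubbed out with a fabricated frame.

```python
from pythonosc.udp_client import SimpleUDPClient

# The eight bands named above, used as keys of a hypothetical band-power frame.
BANDS = ("delta", "theta", "low_alpha", "high_alpha",
         "low_beta", "high_beta", "low_gamma", "high_gamma")


def descriptors(powers: dict[str, float]) -> dict[str, float]:
    """Reduce raw band powers to three descriptors (illustrative heuristics)."""
    alpha = powers["low_alpha"] + powers["high_alpha"]
    beta = powers["low_beta"] + powers["high_beta"]
    gamma = powers["low_gamma"] + powers["high_gamma"]
    theta = powers["theta"]
    eps = 1e-9
    return {
        # Attention: assumed beta-dominance ratio.
        "attention": beta / (alpha + theta + eps),
        # Relaxation: assumed alpha-dominance ratio.
        "relaxation": alpha / (beta + eps),
        # Oscillatory dynamics: share of high-frequency energy.
        "oscillation": gamma / (sum(powers.values()) + eps),
    }


def send_to_max(powers: dict[str, float], client: SimpleUDPClient) -> None:
    """Send raw bands and derived descriptors to Max/MSP as OSC messages."""
    for band in BANDS:
        client.send_message(f"/eeg/{band}", powers[band])
    for name, value in descriptors(powers).items():
        client.send_message(f"/cognitive/{name}", value)


if __name__ == "__main__":
    client = SimpleUDPClient("127.0.0.1", 7400)   # assumed udpreceive port in Max
    frame = {b: 1.0 for b in BANDS}               # stand-in for a live reading
    send_to_max(frame, client)
```

In Max/MSP, a `udpreceive 7400` object followed by OSC routing would then expose these descriptors to the synthesis and spatialisation patches.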

OpenMusic serves as a framework for the algorithmic generation of spectral templates, partial distributions, and structural seeds for improvisation, as well as for the sound synthesis of fixed musical layers. Partiels is used in an offline analytical phase to provide high-resolution spectral decomposition of the pre-composed materials. This analysis guides the mapping strategies and spectral vocabulary implemented in Max/MSP, ensuring that the live EEG-driven improvisation unfolds within a rigorously structured yet flexible sonic landscape.
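The offline step can be pictured with a small sketch, under stated assumptions: a partial list exported from Partiels is reduced to a ranked spectral template that Max/MSP can read, here written as a coll-style text file. The CSV column layout (time, frequency, amplitude), the file names, and the number of retained partials are illustrative assumptions, not the piece's actual format.

```python
import csv
from collections import defaultdict


def load_partials(path: str) -> list[tuple[float, float, float]]:
    """Read (time, frequency, amplitude) rows from an exported CSV file."""
    rows = []
    with open(path, newline="") as f:
        for time_s, freq_hz, amp in csv.reader(f):
            rows.append((float(time_s), float(freq_hz), float(amp)))
    return rows


def spectral_template(partials, n_partials=24, bin_hz=10.0):
    """Accumulate amplitude into frequency bins and keep the strongest ones."""
    energy = defaultdict(float)
    for _, freq_hz, amp in partials:
        energy[round(freq_hz / bin_hz) * bin_hz] += amp
    strongest = sorted(energy.items(), key=lambda kv: kv[1], reverse=True)
    return sorted(strongest[:n_partials])        # ascending frequency


def write_coll(template, path: str) -> None:
    """Write 'index, frequency amplitude;' lines readable by Max's coll object."""
    with open(path, "w") as f:
        for i, (freq_hz, amp) in enumerate(template):
            f.write(f"{i}, {freq_hz:.1f} {amp:.4f};\n")


if __name__ == "__main__":
    partials = load_partials("partiels_export.csv")       # assumed export file
    write_coll(spectral_template(partials), "template.coll.txt")
```

Such a template file can then be loaded into a `coll` in Max/MSP, where the EEG-derived descriptors select or weight entries of the pre-analysed spectral vocabulary during performance.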

By integrating offline spectral intelligence with real-time neuro-sonic feedback, the piece situates improvisation at the intersection of cognition, algorithmic reasoning, and auditory perception. The performer negotiates between intentional focus and emergent system behaviour, revealing improvisation as a neuro-sonic ecosystem in which thought, pre-analysed spectral structures, and sound co-evolve.
