Music begins and ends in the human brain.
Our bodies interface with musical instruments so that the musical mind can express itself.
The computer has become the musical instrument of our times, enabling many new ways to play.
Brain-Computer Music Interfacing (BCMI) potentially takes these ideas a step further by treating the brain itself as a musical instrument whose electroencephalography (EEG) signals can be harnessed to enable new channels and modes of expression in live performance.
Biofeedback has been explored earnestly in the arts for the past two decades as a way to interact with, or express oneself through, one's own physiological state. BCMI systems use the EEG signal, algorithmically transforming and mapping it to outputs in formats such as MIDI, OSC, or DMX that are useful for controlling media. Research on how to obtain, process, and transform the EEG signal into useful parameters for live performance has been ongoing at JVLMA since 2019.
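As a rough illustration of such a pipeline (not the project's actual code), the sketch below computes alpha-band power from a short single-channel EEG buffer and sends it as an OSC control message; the sampling rate, band limits, OSC address, and port are assumptions chosen for the example.

```python
# Minimal sketch of an EEG-to-OSC mapping, assuming a 250 Hz single-channel
# buffer and the python-osc library; band, address, and port are illustrative.
import numpy as np
from scipy.signal import welch
from pythonosc.udp_client import SimpleUDPClient

FS = 250                                     # assumed EEG sampling rate in Hz
BAND = (8.0, 12.0)                           # alpha band, chosen for illustration
client = SimpleUDPClient("127.0.0.1", 9000)  # hypothetical OSC target

def band_power(samples, fs, band):
    """Integrate the Welch power spectral density over the given band."""
    freqs, psd = welch(samples, fs=fs, nperseg=fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(np.trapz(psd[mask], freqs[mask]))

def send_control(eeg_buffer):
    """Map band power to a 0-1 control value and send it over OSC."""
    power = band_power(np.asarray(eeg_buffer), FS, BAND)
    value = float(np.clip(np.log1p(power) / 10.0, 0.0, 1.0))  # arbitrary scaling
    client.send_message("/bcmi/alpha", value)                 # hypothetical address
```

The same control value could just as easily be emitted as MIDI CC or DMX; OSC is used here only because it maps cleanly to a single floating-point message.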
The BCMI system developed within the framework of this project was based on decoding the expressive intentions of a performer in two contrasting states: high arousal and low arousal. This was done by characterising spectral power during emotionally expressive music performance relative to emotionally neutral music performance. The paradigm has been explored in a variety of live concert settings, from modular synthesis to orchestral percussion.
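Purely as an illustration of this baseline-relative spectral approach (the concrete features and thresholds used in the project are not specified here), a high/low arousal decision could be framed along these lines:

```python
# Illustrative sketch of a baseline-relative arousal index, assuming per-band
# powers have already been extracted per analysis window (e.g. as in the
# band_power sketch above); the band choices and threshold are assumptions.
import numpy as np

def arousal_index(window_powers, baseline_powers):
    """Log-ratio of beta to alpha power, each normalised by the neutral baseline.

    Positive values suggest higher arousal than the emotionally neutral
    performance; negative values suggest lower arousal.
    """
    beta_rel = window_powers["beta"] / baseline_powers["beta"]
    alpha_rel = window_powers["alpha"] / baseline_powers["alpha"]
    return float(np.log(beta_rel / alpha_rel))

def classify(index, threshold=0.5):
    """Binary high/low arousal decision with a hypothetical threshold."""
    return "high arousal" if index > threshold else "low arousal"
```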
Current efforts aim to extend these tools to multiple users, so that inter-brain dynamics during co-creative tasks can be used to manipulate immersive multimedia. This would enable shared brain activity to play a role in the creation or experience of art.
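One plausible, purely hypothetical measure of such inter-brain dynamics is the correlation between two performers' band-power time series over a sliding window; the sketch below illustrates that idea only and is not the project's method.

```python
# Hypothetical inter-brain coupling measure: Pearson correlation between two
# performers' recent band-power values, which could drive a multimedia parameter.
import numpy as np

def interbrain_coupling(power_a, power_b, win=32):
    """Correlate the most recent `win` band-power values from two performers."""
    a, b = np.asarray(power_a)[-win:], np.asarray(power_b)[-win:]
    if len(a) < 2 or len(b) < 2 or np.std(a) == 0 or np.std(b) == 0:
        return 0.0
    return float(np.corrcoef(a, b)[0, 1])
```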
This talk is part of the IRCAM Forum Workshops Hors-les-Murs 2025 Rīga-Liepāja (Latvia).