"The Demonstration of BCI Interactive Soundscape Interface for Soundscape Composition" by Yi-hsien Chen (Taiwan)

The demo presents a “BCI Interactive Soundscape Interface” using the Ultracortex, a wearable EEG headset developed by the OpenBCI team. In this demo, I explore the potential of bio-feedback systems in soundscape composition by employing EEG signals to drive electronic sound-processing modules and SPAT-based spatial parameters, transforming environmental sounds according to the user's mental state.

This demo showcases a “BCI Interactive Soundscape Interface” (BCI stands for brain-computer interface) using the Ultracortex, a wearable EEG headset developed by the OpenBCI team, integrated with Max/MSP. In this project, the Ultracortex records the user's brainwave activity while they listen to various soundscape materials. These brainwave signals are then sent to Max/MSP to control electronic sound-processing modules and spatialization in real time within a multi-channel system. The interface is designed to render sounds through a 16-channel system, transforming them into abstract sonic textures. The transformed sounds, in turn, influence the user's brain activity, establishing a continuous, bidirectional interaction between the mind and the evolving surrounding sounds. This project aims to provide an interface through which participants, including non-musicians, can play soundscape materials using their own mental states.
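A link of this kind between an EEG acquisition process and Max/MSP is typically carried over OSC messages on UDP, which Max receives with a [udpreceive] object. As a minimal sketch only (the demo itself does not specify its transport, and the port number 7400 and address /alpha below are hypothetical placeholders), the following stdlib-only Python snippet hand-encodes a one-float OSC message, such as a normalized brainwave band level, and sends it to a local Max patch:

```python
import socket
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a minimal OSC message carrying one 32-bit float.

    Per the OSC 1.0 spec: the address string and the type-tag string
    (",f") are null-terminated and padded to 4-byte boundaries, and the
    float argument is big-endian.
    """
    def pad(s: bytes) -> bytes:
        # Always append at least one null, then pad to a multiple of 4.
        return s + b"\x00" * (4 - len(s) % 4)
    return pad(address.encode("ascii")) + pad(b",f") + struct.pack(">f", value)

def send_control(sock: socket.socket, host: str, port: int,
                 address: str, value: float) -> None:
    """Fire one OSC control value at a UDP receiver (e.g. Max/MSP)."""
    sock.sendto(osc_message(address, value), (host, port))

if __name__ == "__main__":
    # Hypothetical routing: a Max patch with [udpreceive 7400] feeding
    # [route /alpha] could map this value onto a spatialization parameter.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_control(sock, "127.0.0.1", 7400, "/alpha", 0.42)
```

On the Max side, [route /alpha] would then scale the incoming value onto whatever processing or SPAT parameter the patch exposes; a real system would send such messages continuously as new EEG windows are analyzed.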