Presented by: PhD Candidate Dr Tommaso Colafiglio, Professor Tommaso Di Noia, Professor Fabrizio Festa
Abstract
This research explores a system that integrates Brain-Computer Interfaces (BCIs) with our proprietary machine learning and deep learning models. The system generates real-time images and audio textures from the emotional and cognitive states of two users within a biofeedback protocol. By employing AI models trained to classify emotional polarity and mental states such as Focus, Relaxation, Stress, and Workload, the system provides a comprehensive view of each user's cognition and emotion.
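As a rough illustration of the classification stage only (the proprietary models themselves are not described here), the sketch below trains a placeholder classifier on synthetic EEG band-power features and maps one epoch to a mental-state label. The feature layout, the synthetic data, and the scikit-learn model are all assumptions made for the example.

```python
# Illustrative sketch only: the real system uses proprietary models; this shows
# the general shape of a band-power -> mental-state classifier, not the actual one.
import numpy as np
from sklearn.linear_model import LogisticRegression

STATES = ["Focus", "Relaxation", "Stress", "Workload"]

# Hypothetical features: mean band power per EEG channel (theta, alpha, beta, gamma).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 4 * 8))        # 200 synthetic epochs, 8 channels x 4 bands
y_train = rng.integers(0, len(STATES), 200)    # synthetic labels, for illustration only

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

def classify_epoch(band_powers: np.ndarray) -> str:
    """Map one epoch's band-power feature vector to a mental-state label."""
    return STATES[int(clf.predict(band_powers.reshape(1, -1))[0])]

print(classify_epoch(rng.normal(size=4 * 8)))
```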
Applications
1) Sound and Musical Composition
The system introduces a method for sound and musical composition. It analyses brain signals in real time to produce musical textures that align with the user's mental state. During live performances, the interaction between musicians can be monitored to create dynamic auditory or visual feedback. This feedback fosters an engaging dialogue between human creativity and AI-driven responses, enriching the artistic process.
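One way such a state-to-sound mapping could look, purely as a sketch: the function below translates a detected mental state and an emotional-polarity (valence) score into synthesis parameters. The parameter names and numeric values are illustrative assumptions, not the system's actual sound engine interface.

```python
# Hypothetical mapping from a detected mental state to synthesis parameters;
# tempo_bpm, cutoff_hz and note_density are assumed names, not the real interface.
def texture_params(state: str, valence: float) -> dict:
    base = {
        "Focus":      {"tempo_bpm": 110, "cutoff_hz": 4000, "note_density": 0.7},
        "Relaxation": {"tempo_bpm": 70,  "cutoff_hz": 1200, "note_density": 0.3},
        "Stress":     {"tempo_bpm": 130, "cutoff_hz": 6000, "note_density": 0.9},
        "Workload":   {"tempo_bpm": 100, "cutoff_hz": 3000, "note_density": 0.6},
    }[state]
    # Shift timbral brightness with emotional polarity (valence in [-1, 1]).
    base["cutoff_hz"] = int(base["cutoff_hz"] * (1.0 + 0.25 * valence))
    return base

print(texture_params("Relaxation", valence=0.4))
```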
2) Visual Arts
The methodology also extends to visual arts, enabling the generation of dynamic images or videos that synchronise with specific emotional states. Such capabilities pave the way for interactive installations that evolve based on audience engagement, providing new avenues for artistic experimentation and creative expression.
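As a hedged sketch of how an emotion estimate might drive the visuals, the function below maps valence and arousal onto a colour and a motion speed. The specific mapping is an assumption for illustration, not the renderer used in the installations.

```python
# Illustrative only: how an emotional-polarity estimate might drive image parameters.
# The colour/motion mapping below is an assumption, not the system's actual renderer.
import colorsys

def visual_params(valence: float, arousal: float) -> dict:
    """valence and arousal in [-1, 1]; returns an RGB colour and a motion speed."""
    hue = 0.33 * (valence + 1) / 2              # negative -> red-ish, positive -> green-ish
    saturation = 0.4 + 0.6 * (arousal + 1) / 2  # higher arousal -> more saturated
    r, g, b = colorsys.hsv_to_rgb(hue, saturation, 1.0)
    return {"rgb": (round(r, 2), round(g, 2), round(b, 2)),
            "motion_speed": 0.5 + (arousal + 1) / 2}

print(visual_params(valence=0.6, arousal=-0.2))
```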
System Workflow
- Emotion and Mental State Recognition - BCIs capture real-time brain signals from two users. Proprietary AI algorithms analyse these signals to identify emotional and cognitive states (an end-to-end sketch of the workflow follows this list).
- Customised Image Generation - The system encodes detected emotions into personalised visual outputs that reflect the user’s emotional state.
- Multimedia Emotional Synchronisation - Visual and auditory content dynamically adapts to the user’s emotional state, providing an immersive multisensory experience.
- Dynamic Audio Textures - Real-time audio signals are generated and modulated according to the user’s emotions and mental states, enhancing the overall sensory impact.
- Interactive Scriptwriting - Collaborative narratives are shaped by users’ emotional data, allowing them to actively influence story development in real time.
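To show how these steps could chain together, the sketch below loops over two users, reads a mocked state estimate per epoch, and hands it to placeholder rendering. read_epoch and render are hypothetical stand-ins for the BCI driver, the proprietary models, and the audio/visual generation engines; none of them are the real interfaces.

```python
# End-to-end sketch of the workflow above, under stated assumptions:
# read_epoch() and render() are placeholders, not the actual system components.
import random
import time

def read_epoch(user_id: int) -> dict:
    # Placeholder for a BCI read + proprietary classification; returns a fake estimate.
    return {"state": random.choice(["Focus", "Relaxation", "Stress", "Workload"]),
            "valence": random.uniform(-1, 1),
            "arousal": random.uniform(-1, 1)}

def render(user_id: int, estimate: dict) -> None:
    # Placeholder for the audio/visual engines: just log the adapted parameters.
    print(f"user {user_id}: {estimate['state']:<10} "
          f"valence={estimate['valence']:+.2f} arousal={estimate['arousal']:+.2f}")

for _ in range(3):            # three update cycles, for illustration
    for user in (1, 2):       # the biofeedback protocol involves two users
        render(user, read_epoch(user))
    time.sleep(0.1)           # a real loop would run at the BCI epoch rate
```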