Short introduction
Concrete Motion is an experimental application for sound-based music (Landy, 2007), designed for educational and training contexts. The system combines the operational flexibility of Max/MSP with Google MediaPipe body tracking, implemented within the TouchDesigner environment. Its aim is to create an interactive digital learning environment that mediates listening, analysis, and electroacoustic music creation through the body and movement.
The core of the project
At the core of Concrete Motion lies the relationship between music and movement, and between sonic and bodily gesture. This relationship is conceived as a central pedagogical device for making the teaching and learning processes of electroacoustic music and sound-based languages both accessible and engaging. Within this perspective, the project is grounded in the framework of Embodied Music Cognition (Leman, 2008), which understands the body as an active agent in musical understanding rather than a passive interface.
The educational protocol developed for Concrete Motion is positioned at the intersection of the Jaques-Dalcroze approach and Smalley’s spectromorphology. Although these frameworks originate from different historical periods and musical contexts, they converge in assigning a key role to perceptual and sensorimotor dimensions in processes of listening, analysis, learning, and musical description. The explicit integration of movement further connects these perspectives to enactive pedagogy, reinforcing an understanding of learning as embodied and situated action.
The role of technologies
Within this framework, the interactive technologies implemented in Concrete Motion are designed to strengthen the action–perception loop that underpins the relationship between bodily gesture and sonic transformation. The learning environment is conceived as a technologically integrated ecosystem, in which students can explore the system’s expressive possibilities freely and across multiple levels of interaction.
Concrete Motion functions as a gesture-controlled digital environment for real-time sound manipulation and consists of two interdependent modules: a stand-alone application for sound playback and processing, and a free body-tracking plugin based on machine learning libraries. This architecture reflects a view of learning as a distributed cognitive system, in which body, technology, and space jointly contribute to the construction of meaning.
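To make the two-module architecture more tangible, the sketch below stands in for the control surface of the playback and processing module. The communication layer between the two modules is not specified in the text; the sketch assumes OSC over UDP as transport, and the parameter addresses are hypothetical, used purely for illustration.

```python
# Minimal stand-in for the playback/processing module's control surface.
# Assumptions (not specified in the project description): the two modules
# communicate over OSC/UDP, and the parameter addresses below are hypothetical.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_param(address, value):
    # In the real stand-alone application this would drive the DSP chain
    # (playback, delay, filtering, dynamics, dry/wet); here we only log it.
    print(f"{address} -> {value:.3f}")

dispatcher = Dispatcher()
for addr in ("/playback/rate", "/delay/time", "/filter/cutoff",
             "/dynamics/gain", "/mix/drywet"):
    dispatcher.map(addr, on_param)

# Listen for control data coming from the body-tracking module.
BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher).serve_forever()
```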
The body-tracking component, developed by integrating MediaPipe within TouchDesigner, enables the recognition of hand movements and, with lower precision, of full-body movement in three-dimensional space. Data streams generated by bodily motion are translated into continuous and discrete control parameters acting on playback, delay processes, filtering, dynamics, and dry/wet balance. In line with an enactive perspective, the body is not treated as a mere control interface, but as a constitutive element of both the cognitive and musical processes.
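As an illustration of how tracking data can become control data, the following Python sketch approximates the body-tracking side: MediaPipe hand landmarks are reduced to normalized values and streamed as OSC messages. The specific mappings, addresses, and thresholds are assumptions made for the example, not the mappings actually used in Concrete Motion.

```python
# Sketch of the body-tracking side: MediaPipe hand landmarks are reduced to
# normalized control values and streamed as OSC messages. This approximates
# the MediaPipe/TouchDesigner integration in plain Python; addresses and
# mappings are illustrative assumptions.
import cv2
import mediapipe as mp
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)   # assumed address of the playback module
hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.5)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        wrist = result.multi_hand_landmarks[0].landmark[0]   # wrist landmark
        # Continuous mappings: horizontal position -> dry/wet, vertical -> filter cutoff.
        client.send_message("/mix/drywet", float(wrist.x))
        client.send_message("/filter/cutoff", float(1.0 - wrist.y))
        # Discrete mapping: hand raised above mid-frame toggles playback.
        client.send_message("/playback/toggle", 1 if wrist.y < 0.5 else 0)
cap.release()
```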
Three interaction modes are supported: control via a graphical interface, gesture-based interaction, and collaborative interaction involving two or more users. The collaborative mode, in particular, supports processes of shared exploration and sonic co-creation within classroom settings.
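A minimal, purely hypothetical sketch of how the three modes might select the source of a control parameter, with the collaborative mode blending the contributions of several users:

```python
# Illustrative sketch (not the actual implementation) of mode-dependent
# selection of a control value: a GUI slider, one tracked user, or the
# combination of several users in collaborative co-creation.
from enum import Enum
from statistics import mean

class Mode(Enum):
    GUI = "gui"
    GESTURE = "gesture"
    COLLABORATIVE = "collaborative"

def control_value(mode, gui_value, user_values):
    """Return the value (0.0-1.0) that drives a processing parameter."""
    if mode is Mode.GUI:
        return gui_value
    if mode is Mode.GESTURE:
        return user_values[0]            # single tracked user
    return mean(user_values)             # blend all users' gestures

# Example: two students jointly shaping the dry/wet balance.
print(control_value(Mode.COLLABORATIVE, gui_value=0.3, user_values=[0.2, 0.8]))  # 0.5
```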
Methodological approach
The first phase of the research was based on a case study conducted in Italian lower secondary schools, adopting a qualitative methodology that combined video-based educational research with thematic analysis (Braun & Clarke, 2006). The current phase aims to further investigate the role of movement in the understanding of the foundational structures of electroacoustic musical language, with a specific focus on the educational potential of body-tracking technologies in relation to the adopted theoretical frameworks.
Current perspectives
Future developments include the integration of Concrete Motion with IRCAM technologies based on the MuBu – Multi-Buffer system, with particular attention to the development of applications such as Live Motion and Granular Motion. This integration will also allow MuBu to be used as a tool for collecting and analysing movement data, supporting a deeper investigation of the relationship between music and the body and improving the system’s accessibility, responsiveness, and expressive potential within a user-centred design perspective.
Research questions
The research addresses whether electroacoustic music can be considered a foundational language for innovative approaches to music education; how these teaching and learning experiences can be effectively mediated through movement and the body using interactive music systems; and which principles may support the development of new educational models grounded in sound-based musical languages.
Objectives
The project aims to experiment with innovative methodologies and tools in music education; to promote electroacoustic music as a shared cultural and educational resource; to foster critical reflection on the relationship between technology and learning from an ecological and systemic perspective; to contribute to teacher education; and to support the design of highly accessible musical technologies, including Wearable Musical Instruments and Assistive & Adaptive Music Technologies.