I have proposed a novel paradigm of pianists’ interaction with complex music notation under the name embodied navigation. Its novelty lies in rethinking the classic notion of interpretation as interaction, and performance itself as a dynamic system. The paradigm’s central features are the primacy of performers’ embodied experience and the inherent plasticity of music notation: embodiment is shown to continuously shape the performer’s comprehension of the notation and to transform the notation in real time.
The GesTCom (Gesture Cutting through Textual Complexity), developed at IRCAM since 2014 in collaboration with the Interaction-Son-Musique-Mouvement team, materializes the embodied navigation paradigm in a dedicated interactive system. It is a modular, sensor-based environment for the analysis, processing and real-time control of complex piano notation through multimodal recordings. In terms of hardware, it comprises systems for the capture of movement, audio, video, MIDI and capacitive data from sensors on the piano keys. In terms of software, it is equipped with modules for the capture, analysis and control of the multimodal data, and with modules for the augmentation and interactive control of music notation. Each of these systems functions both as a stand-alone tool and as an integrated component of the general methodology of embodied navigation.
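The modular organization described above can be illustrated with a minimal sketch. The names below (`ModalStream`, `CaptureSession`) are purely hypothetical and are not drawn from the actual GesTCom implementation; the sketch only shows the general idea of aggregating synchronized streams for the five capture modalities mentioned in the text.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical sketch: class and method names are illustrative only,
# not part of the actual GesTCom software.

@dataclass
class ModalStream:
    """One recording stream (e.g. MIDI, motion capture)."""
    modality: str
    samples: List[float] = field(default_factory=list)

    def record(self, value: float) -> None:
        """Append one captured sample to the stream."""
        self.samples.append(value)

@dataclass
class CaptureSession:
    """Aggregates the multimodal streams captured during one performance."""
    streams: Dict[str, ModalStream] = field(default_factory=dict)

    def add_stream(self, modality: str) -> ModalStream:
        stream = ModalStream(modality)
        self.streams[modality] = stream
        return stream

# The five modalities named in the text, each usable on its own
# or combined within a single session.
session = CaptureSession()
for modality in ("movement", "audio", "video", "midi", "key_capacitance"):
    session.add_stream(modality)

session.streams["midi"].record(60.0)  # e.g. a MIDI note number
print(sorted(session.streams))
```

The point of the sketch is the design choice it mirrors: each stream can be recorded and analyzed independently (stand-alone use), while the session object integrates them for joint processing.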