➡️ This presentation is part of IRCAM Forum Workshops Paris / Enghien-les-Bains March 2026
MDIF - Musical Descriptor Intuition Field
This demo presents an easy-to-use, practical composer-in-the-loop machine learning system.
It learns a composer's subjective preferences from a handful of musical examples and turns them into immediate, interactive control as part of a typical composer's workflow.
The main focus is musical structure at a symbolic level (notes, rhythms, register, texture).
- MDIF - example creative workflow
A typical composition workflow in OpenMusic using this system:
- set up a patch generating variants of a structure
- generate a handful of candidates
- place them interactively within your personal, subjective cognitive space in a GUI
- train the MLP
- immediately use the trained model to generate, compare, edit, filter, and interpolate new "similar" structures
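The train-and-query step of this loop can be sketched in plain Python/NumPy. Everything here is an illustrative assumption, not the actual MDIF code: the parameter vectors, the 2-D placements, and the tiny one-hidden-layer MLP are stand-ins for whatever the real patch and GUI produce.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 8 candidate structures from the generating patch,
# each reduced to a 5-dimensional parameter vector, and the 2-D position
# where the composer placed each candidate in the GUI.
X = rng.normal(size=(8, 5))
Y = rng.uniform(-1.0, 1.0, size=(8, 2))

# One-hidden-layer MLP, 5 -> 16 -> 2, trained by plain gradient descent.
W1 = rng.normal(scale=0.3, size=(5, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.3, size=(16, 2)); b2 = np.zeros(2)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

initial_mse = np.mean((forward(X)[1] - Y) ** 2)

lr = 0.05
for _ in range(2000):
    h, pred = forward(X)
    err = (pred - Y) / len(X)           # gradient of the mean-squared error
    gW2, gb2 = h.T @ err, err.sum(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)    # backpropagate through tanh
    gW1, gb1 = X.T @ dh, dh.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

final_mse = np.mean((forward(X)[1] - Y) ** 2)

# The trained model immediately places any new candidate in the field.
_, position = forward(rng.normal(size=(1, 5)))
```

A network this small trains in a fraction of a second, which is what makes the "place a few examples, train, use immediately" loop feel interactive rather than batch-like.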
- Easy and intuitive control of complex algorithms
- What the MLP buys you is a simple and intuitive handle on what can otherwise be complex and chaotic: musically meaningful changes that emerge from awkward, hard-to-steer parameter combinations, typical for the OM composer.
Instead of controlling dozens of algorithmic parameters directly, you interact with a learned notion of degrees of subjective similarity between your own samples (or class membership), where "similar" means similar in a subjective artistic sense, not according to grammars or rules. Manoeuvring in the trained model can be done interactively through the provided GUI - a 2D 'joystick', a set of sliders or a radar plot - by searching and filtering in OM, or by other means.
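One simple way such a 2D 'joystick' could steer generation is by nearest-neighbour lookup in the learned field: predict each candidate's position, then pick the one closest to the joystick coordinates. The sketch below assumes a hypothetical `predict` function standing in for the trained model; none of these names come from MDIF itself.

```python
import numpy as np

def nearest_candidate(joystick_xy, candidates, predict):
    """Return the candidate whose predicted position in the descriptor
    field lies closest to the 2-D joystick coordinates, plus the distance."""
    positions = np.array([predict(c) for c in candidates])
    dists = np.linalg.norm(positions - np.asarray(joystick_xy), axis=1)
    return candidates[int(np.argmin(dists))], float(dists.min())

# Toy stand-in for the trained model: a fixed linear projection to 2-D.
proj = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, -0.5]])
predict = lambda c: c @ proj

pool = [np.array([0.1, 0.2, 0.0]),
        np.array([0.8, -0.3, 0.4]),
        np.array([-0.5, 0.9, 0.1])]

best, d = nearest_candidate((0.0, 0.25), pool, predict)
```

The same idea extends to sliders or a radar plot: each control sets a target point in descriptor space, and candidates are ranked by distance to it.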

- "MDIF - Music Descriptor Intuition Field" - ad-hoc, composer-defined, personal, phenomenological descriptors
User-defined ad-hoc subjective descriptors, meaningful in the personal creative context. E.g. "entropy", "complexity", "texture", "thickness", "sharpness"...
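Concretely, such ad-hoc descriptors could be represented as nothing more than named values attached to each example, with the descriptor names doubling as axes of the intuition field. This data layout is an illustrative assumption, not the MDIF format:

```python
# Hypothetical few-shot labelling: each example carries composer-defined,
# purely subjective descriptor values in [0, 1]. Names are made up here.
examples = [
    {"id": "candidate-01", "entropy": 0.8, "thickness": 0.2},
    {"id": "candidate-02", "entropy": 0.3, "thickness": 0.9},
]

# Descriptor names become the axes of the personal descriptor space.
axes = sorted(k for k in examples[0] if k != "id")
vectors = [[e[a] for a in axes] for e in examples]
```

Because the descriptors are defined per project, nothing forces them to be acoustically measurable - they only need to be consistent in the composer's own judgement.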
- Architecture
- The system is implemented as a portable Python core (GUI, interaction, MLP training and inference) with Common Lisp wrappers for OpenMusic. The result is an interactive, intuitive ML system that stays small and editable.
- The ML system is general and can be useful in many kinds of compositional tasks, and probably in other domains as well.
- This demo at the FORUM will show an integration with OpenMusic.
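A Lisp-to-Python bridge of this kind often reduces to a small message protocol. The sketch below shows one plausible shape - one JSON object per message, dispatched by an `op` field - purely as an illustration; the message names, fields, and the echo-style `predict` placeholder are all invented, not the actual MDIF wire format.

```python
import json

# Hypothetical messages the OpenMusic (Lisp) wrapper might send.
train_msg = {"op": "train",
             "examples": [{"params": [0.1, 0.2], "position": [0.5, -0.5]}]}
predict_msg = {"op": "predict", "params": [0.15, 0.25]}

def handle(line, state):
    """Dispatch one JSON message against mutable session state."""
    msg = json.loads(line)
    if msg["op"] == "train":
        state["examples"] = msg["examples"]   # a real core would fit the MLP
        return {"ok": True, "n": len(msg["examples"])}
    if msg["op"] == "predict":
        # Placeholder inference: mean position of the training examples.
        xs = [e["position"] for e in state["examples"]]
        mean = [sum(c) / len(c) for c in zip(*xs)]
        return {"ok": True, "position": mean}
    return {"ok": False, "error": "unknown op"}

state = {}
ack = handle(json.dumps(train_msg), state)
reply = handle(json.dumps(predict_msg), state)
```

Keeping the protocol this small is what lets the Python core stay portable: any host environment that can read and write JSON lines can drive it.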
- Keywords:
- composer-in-the-loop interactive ML, MLP, few-shot/ad-hoc ML training, perceptual distance, classification, OpenMusic, Python/Common Lisp integration, phenomenological descriptors