Posted by: Mikhail Malt, 1 year and 8 months ago
Somax 2.5 is an application and a library for live, co-creative interaction with musicians in improvisation, composition, or installation scenarios.
It is based on a machine-listening reactive engine and a generative model that provide stylistically coherent improvisation while continuously adapting to the external audio or MIDI musical context. It uses a cognitive memory model built on music corpora that it analyzes and learns as stylistic bases, renders the result through a process similar to concatenative synthesis, and relies on a globally learned harmonic and textural knowledge-representation space built with machine-learning techniques.
Somax2 is a complete rewrite of Somax, one of the many descendants of the well-known Omax developed in the Music Representation team over the years, and now offers a powerful and reliable environment for co-improvisation, composition, installations, and more. Written in Max and Python, it features a modular, multithreaded implementation, multiple wirelessly interacting players (AI agents), a new UI design with tutorials and documentation, as well as a number of new interaction flavors and parameters.
In the new 2.5 version, Somax2 is also designed as a Max library, allowing users to program custom Somax2 patches and to design their own environments and processing, involving as many sources, players, influencers, and renderers as needed. With these abstractions, implemented to support a complete Max-style programming workflow, users can achieve the same results as the Somax2 application; thanks to the modular architecture, it is also possible to build custom patches and unlock previously unseen interaction and control behaviors.
Somax2 is developed by the Music Representation team at IRCAM and is part of the ANR project MERCI (Mixed Musical Reality with Creative Instruments) and the ERC project REACH (Raising Co-creativity in Cyber-Human Musicianship).
More at repmus.ircam.fr/somax2