Posted by: Philippe Esling, 2 years, 10 months ago
Abstract:
The research project led by the ACIDS group at IRCAM aims to model musical creativity by extending probabilistic learning approaches to multivariate and multimodal time series. Our main object of study lies in the properties and perception of musical synthesis and artificial creativity. In this context, we experiment with deep AI models applied to creative materials, with the goal of developing artificial creative intelligence. Our work aims both to decipher complex temporal relationships and to analyze musical information located at the exact intersection between symbolic (musical writing) and signal (audio recording) representations. Our team has produced many prototypes of innovative instruments and musical pieces in collaboration with renowned composers. Notably, we will demonstrate at the Sonar Festival two of our most groundbreaking research prototypes, namely
1/ Neurorack // the first deep AI-based Eurorack synthesizer
2/ FlowSynth // a learning-based device that lets you explore the auditory space of a synthesizer simply by moving your hand
Bio:
Philippe Esling received a B.Sc. in mathematics and computer science in 2007, an M.Sc. in acoustics and signal processing in 2009, and a PhD in data mining and machine learning in 2012. He was a post-doctoral fellow in the Department of Genetics and Evolution at the University of Geneva in 2012. He has been a tenured associate professor at the IRCAM laboratory and Sorbonne Université since 2013. In this short time span, he has authored or co-authored over 20 papers in prestigious peer-reviewed journals. He received a young researcher award for his work on audio querying in 2011, a PhD award for his work on multiobjective time series data mining in 2013, and several best paper awards since 2014. In applied research, he developed and released the first computer-aided orchestration software, Orchids, commercialized in fall 2014, which already has a worldwide community of thousands of users and has led to musical pieces by renowned composers performed at international venues. He is the lead investigator on machine learning applied to music generation and orchestration, and directs the recently created Artificial Creative Intelligence and Data Science (ACIDS) group at IRCAM.