Job offers: visits, internships, PhDs, and positions
Feel free to contact us at contact@algomus.fr if you are interested in joining our team, even if your subject is not detailed below. You can also have a look at job offers from previous years to see the kind of subjects we like to work on.
Visits by PhD students or post-docs (between 2-3 weeks and 2-3 months) can also be arranged through mobility funding. Please get in touch several months in advance. (open)
Internship positions for 2026 (2-6 months) will be available, this year including the following themes:
- Co-creativity, Machine learning: Machine Listening for Reinforcement Learning with Agent-based Music Performance Systems (K. Déguernel, C. Panariello) (open)
- Music structure: Similarities across scales and musical dimensions (F. Levé, in Amiens, Y. Teytaud) (open)
- Corpus / Web development: Music corpora on the Dezrann web platform (E. Leguy, M. Giraud) (soon)
- Pedagogy / Music training: Generation of high-quality melodic and rhythmic exercises (M. Giraud, F. Levé, with Solfy) (closed)
- Video Game Music / Corpus / Music Perception, Loop in Video Game Music: building and analyzing the UFO-50 corpus (Y. Teytaut, F. Levé) (soon)
Some other detailed subjects may be posted in November 2025.
PhD positions (2026-2029) will be published soon.
2026 research and/or development internships in music computing
Machine Listening for Reinforcement Learning with Agent-based Music Performance Systems
- Final-year Master's internship
- Duration: 4-6 months, with legal “gratification” (550€-600€/month)
- Location: Lille (Villeneuve d’Ascq, Laboratoire CRIStAL, mĂ©tro 4 Cantons); partial remote work is possible
- Supervisors and contacts: Ken Déguernel (CR CNRS) & Claudio Panariello (postdoc Univ. Lille)
- Full offer, and links: https://www.algomus.fr/jobs
- Applications open
Context
This internship takes place within the scope of the MICCDroP project, which aims to integrate continual-learning mechanisms for long-term partnerships in human-AI musical interaction with agent-based music performance systems. In particular, one aspect of this project is the use of methods based on reinforcement learning and curiosity-driven learning to equip an AI agent with mechanisms for adaptive engagement in creative processes across multiple practice sessions.
This project sits at the intersection of several key areas, primarily Music Information Retrieval (MIR), Lifelong Learning for Long-Term Human-AI Interaction (LEAP-HRI), and New Interfaces for Musical Expression (NIME). It will be supervised by Ken Déguernel (CNRS Researcher) and Claudio Panariello (Univ. Lille postdoc and composer).
Objective
When using reinforcement learning, user feedback can easily be gathered during the interaction through external means (button, pedal, gesture) indicating whether the interaction is good or not, or a posteriori during reflective sessions. The goal of this internship, however, is to develop machine listening methods in order to gather the feedback through the musical interaction itself. Several dynamics of interaction will be tested: level of engagement, consistency of play, (a)synchronicity…
Tasks:
- Explore and implement different machine listening methodologies.
- Integrate the developed machine listening system into the existing MICCDroP musical agent.
- Test the system in situ with professional performers.
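To make the idea concrete, here is a minimal, self-contained sketch (in Python with NumPy; the function names, tolerance, and reward formulation are invented for illustration and are not the MICCDroP implementation) of how synchronicity and engagement could be turned into scalar feedback signals for a reinforcement-learning agent:

```python
import numpy as np

def synchronicity_reward(human_onsets, agent_onsets, tolerance=0.05):
    """Toy reward: fraction of agent onsets falling within `tolerance`
    seconds of some human onset. 1.0 = perfectly synchronous playing."""
    human = np.asarray(human_onsets, dtype=float)
    agent = np.asarray(agent_onsets, dtype=float)
    if len(agent) == 0 or len(human) == 0:
        return 0.0
    # distance from each agent onset to the closest human onset
    dists = np.abs(agent[:, None] - human[None, :]).min(axis=1)
    return float((dists <= tolerance).mean())

def engagement_proxy(onsets, window=10.0):
    """Toy engagement proxy: onset density (events per second)."""
    return len(onsets) / window

# Synthetic example: agent mostly locked to a steady human pulse
human = np.arange(0.0, 10.0, 0.5)                       # pulse at 120 BPM
agent = human + np.random.uniform(-0.02, 0.02, human.size)
r = synchronicity_reward(human, agent)
```

In a real setting, the onset lists would come from a machine-listening front end (e.g. onset detection on the live audio streams), and such scalar signals would feed the agent's reward function.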
Qualifications
Needed:
- Final year of a Master's degree in Machine Learning or Music Computing
- Strong background in Signal Processing for Audio
Preferred:
- Experience with music programming languages (Max/MSP, SuperCollider, …)
- Personal music practice
References
- Jordanous (2017). Co-creativity and perceptions of computational agents in co-creativity. International Conference on Computational Creativity.
- Nika et al. (2017). DYCI2 agents: merging the “free”, “reactive” and “scenario-based” music generation paradigms. International Computer Music Conference.
- Collins, N. (2014). Automatic Composition of Electroacoustic Art Music Utilizing Machine Listening. Computer Music Journal 36(3).
- Tremblay, P.A. et al. (2021). Enabling Programmatic Data Mining as Musicking: The Fluid Corpus Manipulation Toolkit. Computer Music Journal, 45(2).
- Scurto et al. (2021). Designing deep reinforcement learning for human parameter exploration. ACM Transactions on Computer-Human Interaction, 28(1).
- Parisi et al. (2019). Continual lifelong learning with neural networks: A review. Neural networks, 113.
- Small, C. (1998). Musicking: The meanings of performing and listening. Wesleyan University Press.
2026 research internship (M2): Similarities across scales and musical dimensions
- Final-year Master's internship
- Duration: 5-6 months, with legal “gratification” (550€-600€/month)
- Themes: MIR, symbolic music
- Location: Amiens (MIS laboratory, UPJV, in collaboration with Algomus)
- Supervisors and contacts: Florence Levé (MIS) and Yann Teytaut (CRIStAL)
- Full offer and links: https://www.algomus.fr/jobs
- Applications open
Context
Musical data represent a considerable amount of information of different natures. However, despite the impressive results of recent generative models, the structural properties of music remain underexploited, owing to the lack of a generic paradigm that can account for the similarity relationships between elements, temporal sequences, and sections of one or several pieces, at various levels of representation and across different musical dimensions (sound objects, melody, harmony, texture, etc.). Current tools fail to render this multiscale structure in musical data; in particular, they don't allow real control over the parameters of the music being composed (or generated). The goal of the ANR project MUSISCALE is to develop methods and software tools that account for the relationships between similar elements at different scales, finely enough to recompose a complete musical object from these elements, or to create variations of it.
Objectives
The goal of this internship is:
- to explore different notions of similarity between symbolic objects (not only pitch-based: rhythm, texture, and other similarity criteria at different scale levels will be considered);
- to model and implement algorithms to automatically analyze music scores;
- to study the relation between the resulting segmentations and the global form of the pieces;
- to propose transformations of these objects for creative purposes, and to study the impact of those transformations on the global form.
Although this subject mostly concerns symbolic music, timbre or audio analysis/transformation could be considered as extensions.
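As a toy illustration of the first objective (pure Python; the function names and the choice of metric are ours for illustration, not part of MUSISCALE), two melodies can be compared through a transposition-invariant edit distance on their interval profiles:

```python
def interval_profile(pitches):
    """Melodic intervals in semitones: a transposition-invariant view."""
    return [b - a for a, b in zip(pitches, pitches[1:])]

def edit_distance(a, b):
    """Classic Levenshtein distance between two interval sequences."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]

# A motif and its transposition up a whole tone: distance 0
theme = [60, 62, 64, 65, 67]        # MIDI pitches
transposed = [62, 64, 66, 67, 69]
dist = edit_distance(interval_profile(theme), interval_profile(transposed))
```

Analogous distances could be defined on rhythmic or textural descriptions, and then compared across scale levels, from motifs up to sections.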
Qualifications
- Final year of a Master's in Computer Science or Music Computing
- Knowledge of music theory is recommended
- Ideally, musical practice
- Candidates at ease both with symbolic and audio analysis are welcome
Opportunities
The ANR funding includes opportunities to pursue a PhD in our lab on this topic or related topics.
References
- ALLEGRAUD P., BIGO L., FEISTHAUER L., GIRAUD M., GROULT R., LEGUY E., LEVÉ F. “Learning Sonata Form Structure on Mozart's String Quartets”. Transactions of the International Society for Music Information Retrieval (TISMIR), 2(1):82-96, 2019.
- BHANDARI K., COLTON S. “Motifs, phrases, and beyond: The modelling of structure in symbolic music generation”. In International Conference on Computational Intelligence in Music, Sound, Art and Design (Part of EvoStar), pp. 33-51, Springer, 2024.
- BUISSON M., McFEE B., ESSID S., CRAYENCOUR H.-C. “Learning Multi-level Representations for Hierarchical Music Structure Analysis”. In Proceedings of the International Society for Music Information Retrieval Conference (ISMIR), 2022.
- CALANDRA J., CHOUVEL J.-M., DESAINTE-CATHERINE M. “Hierarchisation algorithm for MORFOS: a music analysis software”. In Proceedings of the International Computer Music Conference (ICMC), 2025.
- COUTURIER L., BIGO L., LEVÉ F. “Comparing Texture in Piano Scores”. In Proceedings of the International Society for Music Information Retrieval Conference (ISMIR), 2023.
- NIETO O., MYSORE G.J., WANG C. et al. “Audio-Based Music Structure Analysis: Current Trends, Open Challenges, and Applications”. Transactions of the International Society for Music Information Retrieval (TISMIR), 3(1):246-263, 2020.
Research internship: Generation of high-quality melodic and rhythmic exercises
- Xiaobo Wang, M2 internship 2026, 5 months, 15 February - 15 July 2026
- ThÚmes: informatique musicale, pédagogie, solfÚge, mélodie, rythme
- Lieu: Villeneuve d’Ascq, Laboratoire CRIStAL, mĂ©tro 4 Cantons
- Encadrement: Mathieu Giraud, Algomus, Florence Levé, UPJV, Anicet Bart, Solfy,
Context
This internship takes place within the Algomus music computing team, in collaboration with the company Solfy. Since early 2025, this collaboration has focused on the generation of solfĂšge exercises via the open-source platform Ur. The goal of the internship is to achieve generation of note-reading and rhythm-reading exercises of high musical and pedagogical quality. In particular, we will aim for melodic, harmonic, and rhythmic coherence while meeting given difficulty criteria (notes, rhythmic figures).
Concretely, the internship will begin with a state of the art in music computing, focusing on difficulty estimation and on the generation of melodies and rhythms, along with learning the Ur platform and the previously developed code, as well as exchanges with Solfy and with music teachers to better model the issues at stake. It will also be relevant to review a range of music training methods.
The internship will consolidate and propose new methods for generating and/or filtering melodies, and will test these methods through a prototype within the Ur platform. The internship will also involve software engineering on this platform. The methods developed during the internship will be released as open source.
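As a hypothetical illustration of such a generate-and-filter approach (all names, the scale, and the difficulty score below are invented for illustration; this is not the Ur platform code), one can sample constrained random melodies and keep only those under a difficulty threshold:

```python
import random

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # one octave, MIDI numbers

def generate_melody(length=8, max_step=1, seed=None):
    """Random walk over scale degrees with a bounded step size."""
    rng = random.Random(seed)
    idx = rng.randrange(len(C_MAJOR))
    melody = [C_MAJOR[idx]]
    for _ in range(length - 1):
        step = rng.randint(-max_step, max_step)
        idx = min(max(idx + step, 0), len(C_MAJOR) - 1)
        melody.append(C_MAJOR[idx])
    return melody

def difficulty(melody):
    """Toy difficulty score: largest leap (in semitones) plus ambitus."""
    leaps = [abs(b - a) for a, b in zip(melody, melody[1:])]
    return max(leaps, default=0) + (max(melody) - min(melody))

def generate_easy(threshold=10, tries=100, seed=0):
    """Rejection sampling: keep only melodies under the threshold."""
    rng = random.Random(seed)
    for _ in range(tries):
        m = generate_melody(seed=rng.random())
        if difficulty(m) <= threshold:
            return m
    return None
```

A real difficulty model would also weigh rhythmic figures, key, and pedagogical progression, and filtering could be replaced by constraint-based or learned generation.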
Expected profile
Master's in computer science, with skills in programming and algorithmics. Musical knowledge and practice appreciated.
Opportunities
Des opportunitĂ©s de poursuite en thĂšse Ă CRIStAL pourraient aussi ĂȘtre envisagĂ©es sur ce sujet via une thĂšse CIFRE avec l’entreprise Solfy, ou d’autres opportunitĂ©s sur d’autres sujets avec l’Ă©quipe ou ses collaborateurs
References and resources
- The open-source Ur platform
- Solfy: student app, teacher app