PhD Thesis 2025-2028 – Curiosity-driven and lifelong learning for mixed-initiative musical co-creativity

Context

Current Human-AI co-creative systems, or Mixed-Initiative Co-Creative systems (MICC), have enabled new forms of musicking [1] by expanding creative possibilities and facilitating novel modes of interaction [2]. However, these systems still lack robust mechanisms for long-term adaptation between user and machine. Typically, adaptation occurs either through unilateral learning—where the user progressively adjusts to the AI’s affordances [3]—or through self-engineered modifications, where musician-programmers iteratively refine their own systems [4]. The objective of this PhD project is to address this limitation by exploring curiosity-driven learning techniques for lifelong adaptation in MICC musicking, allowing AI agents to develop dynamically evolving creative relationships with human users.

This research is positioned at the convergence of four areas: Music Information Retrieval (MIR), Lifelong Learning and Personalisation in Long-Term Human-AI Interaction (LEAP-HRI), New Interfaces for Musical Expression (NIME), and Computational Creativity (CC). MIR has driven advances in music generation, from autonomous neural network-based systems [5] to interactive AI-assisted composition [6] and co-improvisation [7]. Lifelong learning research in Human-Robot Interaction (HRI) has focused on enabling dynamic, adaptive relationships between humans and machines through mechanisms such as reinforcement learning [10] and curiosity-driven learning [11], in order to personalise interactions with users [12]. On the interaction side, NIME explores innovative interfaces and interaction methods in musical settings, providing expressive, intuitive ways for musicians to engage with AI systems [13] and audiences [14] in real time, and creating rich contexts for human-machine co-creativity. Finally, within CC, frameworks such as Rhodes’ Four Ps (Person, Process, Product, Press) [15] and Lin et al.’s design space [16] offer criteria for evaluating the dynamics and outcomes of co-creative interactions, helping to assess and refine the creative synergy between human and AI in musical applications.

Work Plan

The central objective of this research is to apply curiosity-driven learning approaches in AI-based music generation models that can evolve over time, facilitating co-creative musical interactions through sustained adaptation. Here, musical co-creativity is conceptualized as an exploration problem, where the AI agent actively expands its musical landscape by seeking novelty and information gain. The agent will leverage intrinsic motivation, tracking signals such as novelty, information gain, or learning progress, to encourage generative exploration.
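As a concrete illustration of such an intrinsic signal, the sketch below scores a generated output by its distance to previously generated material, a standard novelty-search formulation. It is a minimal, hypothetical example: the `intrinsic_reward` function and the use of a random vector as a stand-in for a musical-feature embedding are assumptions for illustration, not part of the project's actual design.

```python
import numpy as np

def intrinsic_reward(embedding, archive, k=5):
    """Novelty-based intrinsic reward: mean Euclidean distance to the
    k nearest previously generated outputs in the archive."""
    if len(archive) == 0:
        return 1.0  # everything is maximally novel at first
    dists = np.linalg.norm(np.stack(archive) - embedding, axis=1)
    return float(np.sort(dists)[:min(k, len(dists))].mean())

# Toy usage: the reward shrinks as the agent revisits familiar regions.
rng = np.random.default_rng(0)
archive = []
for step in range(3):
    x = rng.normal(size=4)  # stand-in for a musical-feature embedding
    r = intrinsic_reward(x, archive)
    archive.append(x)
```

In practice the embedding would come from a learned representation of the generated music, and the archive could be subsampled or summarized to keep the computation tractable over long interactions.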

First, curiosity-driven learning techniques will be implemented and iteratively tested, using an intrinsic reward system based on objective metrics such as novelty search or diversity maximization. Then, we will combine intrinsic motivation with a supervised component informed by user feedback, creating an inductive bias that guides exploration. In this way, user feedback actively shapes the contextual aesthetics and creative values that guide the agent’s exploration.
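The combination of intrinsic motivation with user feedback described above could be sketched as a shaped reward: a preference model trained online from like/dislike feedback, blended with the novelty signal. This is a minimal illustrative sketch under simplifying assumptions (a linear preference model over feature embeddings, binary feedback, and a fixed blending weight `beta`); all names here are hypothetical.

```python
import numpy as np

class PreferenceModel:
    """Minimal online logistic model of user feedback: predicts a
    preference score in [0, 1] from a musical-feature embedding."""
    def __init__(self, dim, lr=0.1):
        self.w = np.zeros(dim)
        self.lr = lr

    def predict(self, x):
        return 1.0 / (1.0 + np.exp(-self.w @ x))

    def update(self, x, liked):
        # one step of logistic-regression gradient ascent on the feedback
        self.w += self.lr * (float(liked) - self.predict(x)) * x

def shaped_reward(novelty, preference, beta=0.5):
    """Blend intrinsic novelty with predicted user preference;
    beta trades pure exploration against user-aligned exploitation."""
    return beta * novelty + (1.0 - beta) * preference
```

The supervised term acts as the inductive bias mentioned above: regions the user consistently rejects receive low shaped reward even when they are novel, steering curiosity toward the user's contextual aesthetics.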

To deepen co-creativity, personalisation strategies will allow the system to develop abstract user models based on individual aesthetic preferences. These models will compactly encode each user’s “novelty frontiers,” directing the agent’s curiosity-driven exploration towards areas aligned with the user’s interests and avoiding creative redundancies. In parallel, the project will investigate multi-agent capabilities that support interactions within collective musical contexts, where a single AI agent or multiple collaborative agents can simultaneously engage and adapt to multiple musicians, facilitating both solo and ensemble co-creativity.
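One simple way to operationalize a per-user "novelty frontier" is to keep a compact, bounded archive of material each listener has already heard, and to score exploration relative to that listener rather than globally. The sketch below is a hypothetical illustration (class name, rolling-archive policy, and k-nearest-neighbour novelty are all assumptions), not the project's committed design.

```python
import numpy as np

class UserNoveltyModel:
    """Per-user novelty frontier: a bounded rolling archive of feature
    embeddings the user has heard, so novelty is user-relative."""
    def __init__(self, max_size=256):
        self.archive = []
        self.max_size = max_size

    def add(self, x):
        self.archive.append(np.asarray(x))
        if len(self.archive) > self.max_size:
            self.archive.pop(0)  # forget oldest, keeping the model compact

    def novelty(self, x, k=5):
        """Mean distance to the k nearest archived embeddings."""
        if not self.archive:
            return 1.0
        d = np.linalg.norm(np.stack(self.archive) - x, axis=1)
        return float(np.sort(d)[:min(k, len(d))].mean())

# In an ensemble setting, each musician gets their own frontier,
# so the same output can be novel for one listener and familiar to another.
models = {"musician_a": UserNoveltyModel(), "musician_b": UserNoveltyModel()}
```

A multi-agent extension could then let one agent query several such models at once, balancing novelty across the ensemble rather than optimizing for a single listener.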

The project will place a strong emphasis on the development of interfaces and multi-agent frameworks designed for Human-Machine Interaction (HMI) in musical settings. In collaboration with musicians, it will explore interactive musical interfaces that encourage creative interventions and support multi-modal engagement (e.g., through sound, visuals, or tactile feedback). Ethnographic methods will enrich this process by capturing musician perspectives through interviews, experimental sessions, and listening activities, gathering qualitative feedback to refine and validate the AI models and interactive systems. This iterative evaluation will aim to capture the evolution of the user-machine creative relationship over time.

Deliverables will include publications within the MIR, ML, and HMI communities, as well as open-source code releases to support ongoing innovation in MICC musicking.

Candidate profile

Candidates should have a Master’s degree in Computer Science, with experience in machine learning, human-computer interaction, or computer music. A strong interest in music and an active musical practice are highly recommended. We particularly welcome applications from women and under-represented groups in music and computer science research.

Work environment

The Algomus team is a friendly group of 10+ scientists and students who meet regularly to share and discuss music and science. The team regularly publishes at major conferences and journals in the field. The PhD student will be guided in writing and submitting papers to these venues, whether on their own work or on collaborative work inside or outside the team. They will regularly participate in conferences and other events in the field, and will be encouraged to collaborate with other scientists and artists. Each PhD student in the Algomus team also undertakes an international research stay of 2-3 months during the course of the thesis.

References

  • [1] Small, C. (1998). Musicking: The meanings of performing and listening. Wesleyan University Press.
  • [2] Jordanous (2017). Co-creativity and perceptions of computational agents in co-creativity. International Conference on Computational Creativity.
  • [3] Nika et al. (2017). DYCI2 agents: merging the ‘free’, ‘reactive’ and ‘scenario-based’ music generation paradigms. International Computer Music Conference.
  • [4] Lewis (2021). Co-creation: Early steps and future prospects. Artisticiel/Cyber-Improvisations.
  • [5] Herremans et al. (2017). A functional taxonomy of music generation systems. ACM Computing Surveys, 50(5).
  • [6] Déguernel et al. (2022). Personalizing AI for co-creative music composition, from melody to structure. Sound and Music Computing.
  • [7] Déguernel et al. (2018). Probabilistic factor oracles for multidimensional machine improvisation. Computer Music Journal, 43:2.
  • [8] Parmentier et al. (2021). A modular tool for automatic Soundpainting query recognition and music generation in Max/MSP. Sound and Music Computing.
  • [9] Parisi et al. (2019). Continual lifelong learning with neural networks: A review. Neural networks, 113.
  • [10] Scurto et al. (2021). Designing deep reinforcement learning for human parameter exploration. ACM Transactions on Computer-Human Interaction, 28(1).
  • [11] Colas et al. (2020). Language as a cognitive tool to imagine goals in curiosity driven exploration. NeurIPS.
  • [12] Irfan et al. (2022). Personalised socially assistive robot for cardiac rehabilitation: Critical reflections on long-term interactions in the real world. User Modeling and User-Adapted Interaction.
  • [13] Pelinski et al. (2022). Embedded AI for NIME: Challenges and opportunities. Workshop at the International Conference on New Interfaces for Musical Expression.
  • [14] Capra et al. (2020). All you need is LOD: Levels of detail in visual augmentations for the audience. In The 20th International Conference of New Interfaces for Musical Expression.
  • [15] Jordanous (2016). Four PPPPerspectives on computational creativity in theory and in practice. Connection Science, 28(2).
  • [16] Lin et al. (2023). Beyond prompts: Exploring the design space of Mixed-Initiative Co-Creativity Systems. International Conference on Computational Creativity.
  • [17] Burda et al. (2019). Large-scale study of curiosity-driven learning. International Conference on Learning Representations.
  • [18] Ten et al. (2022). Curiosity-driven exploration. The Drive for Knowledge: The Science of Human Information Seeking, 53.