AI helps create soundtrack for breathtaking Mercury flyby

An extraordinary new film showing the planet Mercury emerging from the shadows is soundtracked with the help of state-of-the-art artificial intelligence (AI) tools developed at the University of Sheffield.

A video of BepiColombo's third planet Mercury flyby

The film shows breathtaking footage captured by the European Space Agency (ESA)/Japan Aerospace Exploration Agency (JAXA) BepiColombo spacecraft during its third flyby of Mercury, part of Europe's first mission to the closest planet to the Sun.

The ESA commissioned acclaimed artist ILĀ to compose a fitting soundtrack for the flybys, with the latest offering a rare glimpse of Mercury's night side.

ILĀ composed the music for the remarkable sequence with the assistance of the Artificial Musical Intelligence (AMI) tool developed by the University's Machine Intelligence for Musical Audio (MIMA) research group.

Dr Ning Ma, from the Department of Computer Science, explains how the technology works: "AMI uses AI to discover patterns in musical structures such as melodies, chords and rhythm from tens of thousands of musical compositions.

"It encodes music data in a way that is similar to reading a music score, enabling the technology to better capture musical structures. The learning of musical structures is enhanced by adding phrasing embeddings - expressive nuances and rhythmic patterns captured numerically - at different time scales. As a result, AMI is able to generate compositions for various musical instruments and different musical styles with a coherent structure."
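The idea of discovering melodic patterns from training data can be illustrated with a toy sketch. This is not AMI itself (whose deep model and phrasing embeddings are not detailed here); it is a minimal first-order Markov example, assuming melodies are given as lists of MIDI note numbers, that shows what "learning transitions from existing compositions, then generating new material" means in the simplest possible form:

```python
import random

# Toy illustration only: learn first-order pitch transitions from a few
# training melodies, then sample a new melody from the learned table.
# AMI uses far richer models with phrasing embeddings at multiple time
# scales; this sketch shows only the basic pattern-learning idea.

def learn_transitions(melodies):
    """Record which pitch tends to follow which, across all melodies."""
    table = {}
    for melody in melodies:
        for a, b in zip(melody, melody[1:]):
            table.setdefault(a, []).append(b)
    return table

def generate(table, start, length, seed=0):
    """Sample a new pitch sequence by walking the transition table."""
    rng = random.Random(seed)
    melody = [start]
    for _ in range(length - 1):
        options = table.get(melody[-1])
        if not options:          # dead end: no observed continuation
            break
        melody.append(rng.choice(options))
    return melody

# Two simple training phrases in C major (MIDI note numbers)
training = [[60, 62, 64, 65, 67], [67, 65, 64, 62, 60]]
table = learn_transitions(training)
print(generate(table, start=60, length=8, seed=42))
```

Every note the sketch emits was observed somewhere in the training phrases, which mirrors, in miniature, how a generative system can only recombine structure it has learned.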

ILĀ, MIMA collaborator and Creative Director of sonic branding agency Maison Mercury Jones, composed the music for the first two Mercury flyby movies with artist Ingmar Kamalagharan. The two compositions formed the basis of the third.

"We wanted to know what would happen if we fed AMI the ingredients from the first two compositions," said ILĀ.

"This gave us the seeds for the new composition, which we then carefully selected, edited and weaved together with new elements.

"Using this technology almost acts as a metaphor for the Mercury mission; there's a sense of excitement but there's also trepidation, much like with AI."

An image showing part of the region of Mercury covered by the ESA flyover sequence, reconstructed as a 3D anaglyph

For ILĀ, the early adoption of AI in their music follows a career of breaking boundaries.

"I've had a lifelong fascination with technology and consciousness, particularly what makes us human," ILĀ continued.

"So the idea of AI in general and the possibilities it raises have fascinated me ever since I watched the film Metropolis as a child.

"I've always been into music technology, and in some ways generative composition, like with AMI, feels like an almost natural progression.

"For me there are two strands to it - one is the questioning of our usefulness as human creators, and this can create a lot of fear among artists. There's also another fear at play - the fear of missing out or being left behind.

"I'm finding that the more you work with this type of technology, the more it can teach you. In that sense I'm interested in what it can teach us about ourselves as human creators. I think it's one of those things that once we see what it can do, it changes things forever.

"It's a little bit like when humans first saw the waveform. As soon as you see that representation of an audio wave, you can't unsee it and it changes how you hear and experience music. I think we're at the dawn of that with AI as well."

From a user perspective, AMI is an assistive compositional tool. Its interface allows composers and creators to upload a MIDI file to provide a starting point for a new composition. AMI then generates new musical material based on this musical seed. Composers can adjust musical parameters such as instrumentation, musical mode and metre, through to more abstract qualities such as 'adventurousness'.
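That seed-plus-parameters workflow can be sketched as a small request object. AMI's actual interface is not published in this article, so everything here is an assumption for illustration: the class name `GenerationRequest`, the parameter names, and the 0.0-1.0 range for adventurousness are all hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of AMI's described workflow: a MIDI seed plus
# adjustable parameters, from concrete (instrumentation, mode, metre)
# to abstract ('adventurousness'). All names and ranges are assumed,
# not taken from AMI's real interface.

@dataclass
class GenerationRequest:
    seed_midi_path: str                  # MIDI file providing the musical seed
    instrumentation: list = field(default_factory=lambda: ["piano"])
    mode: str = "major"                  # e.g. "major", "minor", "dorian"
    metre: str = "4/4"
    adventurousness: float = 0.5         # 0.0 = conservative, 1.0 = exploratory

    def validate(self):
        """Reject out-of-range abstract parameters before generation."""
        if not 0.0 <= self.adventurousness <= 1.0:
            raise ValueError("adventurousness must be between 0.0 and 1.0")
        return self

# Example: a request echoing the flyby brief, with assumed values
request = GenerationRequest(
    seed_midi_path="flyby_seed.mid",
    instrumentation=["strings", "synth pad"],
    mode="dorian",
    adventurousness=0.8,
).validate()
print(request.metre, request.adventurousness)
```

Separating the seed (what the model starts from) from the parameters (how far it may roam) matches the article's description of composers steering generation rather than accepting one fixed output.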

"A lot of the time, people's initial reaction is: 'Isn't using AI cutting corners and making life easy for yourself?'" ILĀ added.

"In reality it takes longer. It takes more time because one of the things that you come up against is a sort of choice paralysis, because you have dozens, if not hundreds, more options at every single juncture. A lot of composition is about making choices. With AMI you make every decision, but with a wider range of possibilities at your disposal."

Professor Guy Brown, Head of the University's Department of Computer Science, said: "We really value the collaborative research that we are doing with ILĀ, and it's so gratifying to see our AI tools being used to make such beautiful music. Our goal is very much to work with musicians to develop AI systems that support and extend their creative endeavours, rather than replace them at the touch of a button. ILĀ's music for the BepiColombo mission is a perfect example of human composer and AI system working in perfect harmony."

Professor Nicola Dibben, from the University's Department of Music, said: "AI is about to have a huge impact on music making and dissemination. We're proud to be working with ILĀ to understand the implications of AI music generation and to create a future for Music AI together which is fair and inclusive."
