
Lord Watson writes about the challenges and opportunities that AI presents to the UK music industry.

The music industry has been using AI for years as an assistive tool for a range of tasks, from helping producers clean up music to detecting copyright breaches and predicting consumer trends. However, the music business, like many other sectors, is grappling with the explosion in potential uses of AI, which presents huge opportunities alongside great challenges.

As the chair of UK Music, the body that champions the UK music industry, I want to see our sector continue to produce the music professionals that are the envy of the world and generate even more than the £6.7 billion it currently contributes annually to our economy. 

There is no doubt that AI can play a part in that success, particularly when it comes to its use in a supportive role. We saw this recently when Sir Paul McCartney used AI as an assistive tool on the final Beatles song that included vocals from the late John Lennon. Importantly, McCartney swiftly clarified he was not using AI to generate a new recording of Lennon’s voice, but using AI to clean up an old recording made by the band using a process called ‘stem separation.’

However, the rapid development and implications of generative AI technologies, where AI actually generates music, raise many challenges and hard questions for legislators, music industry leaders and the 210,000 talented people who work in the UK music sector.

It is vital to distinguish between AI generating and creating new music; it is capable of the former, but not the latter. AI-generated works rely entirely on ingesting music made by human creators. The AI copies thousands of pieces of music and then analyses patterns and structures to generate a composition based on that computation.

The key point is that this music is being copied. More often than not, that music is copyrighted, and therefore the express permission of the copyright holder is needed and compensation is required. If copyright is not properly upheld, both the creator and the UK sector lose out. 

Unfortunately, we know some overseas businesses are using copyrighted music to train AI technologies without the consent of the human creators and also without payment – and with a flagrant disregard for the UK’s successful copyright laws which are a cornerstone of our world-beating music industry. 

For individual creators, not being paid for their work is not only damaging to their income; it also greatly hampers the ability of music businesses to invest in new projects and artists, which could seriously damage the industry's talent pipeline.

Instead of allowing AI to pilfer the work of talented artists, we should be investing in that talent pipeline to help develop new acts and ensure we are creating the stars of tomorrow, especially with increasing global competition from fast-growing markets in South America and South Korea, as well as persistently strong competition from Europe and the USA.

Music Business Worldwide has estimated that over 40 million ‘new’ music audio files were added to streaming services in 2023. It is difficult to tell how many are AI-generated. This is in large part because the labelling of AI songs is not currently required by law — without such a requirement, it will be extremely difficult to gain a real understanding of the impact AI-generated music is having on various aspects of the music industry. Change is clearly needed.  

Consumers must know when the material they are listening to is AI-generated — AI is already able to create an almost exact likeness of an individual creator. It is alarming that, currently, an artist's voice or image could be used in a deepfake to sell a product without their knowledge or consent.

Artists like Drake, Nick Cave and Johnny Marr have already spoken about their concerns regarding AI-generated fakes which deprive genuine creators of income. Taylor Swift is also reportedly deeply unhappy with these AI fakes and is demanding action to stop such exploitative abuses. She is right. We need to stop the bots.

Thankfully, UK and US artists have some redress, in particular in the form of protection against false endorsement. However, further clarity on the use of an artist’s image or voice by an algorithm should be provided by law to protect artists against this kind of misappropriation.

In the 1990s, the development of MP3s caused a boom in illegal downloading. The music industry knows from experience the damage that not getting ahead of emerging technology can cause to incomes. That is why we are continuing to talk to the Government to find enduring and practical solutions that both benefit individual creators and the UK more broadly.

These practical solutions include protecting the unassailable right of creators to decide if and how their work can be used, underpinned by existing copyright rules; proper record keeping so creators who have given consent know how and where music ingested by AI is used; proper labelling so everyone knows where music has been generated via AI; and protections for the personality and image rights of songwriters and artists.

Without such protections, it is not just the music industry that could suffer but also other creative industries, such as publishing, journalism, film, television and illustration.

Creative industries are a jewel in the UK's crown. We need the Government to ensure AI technologies have the appropriate guard rails to allow their development in a positive way that does not undermine artistic talent or erode successful UK businesses, but instead helps them grow.

 

Lord Watson of Wyre Forest is Chair of the industry umbrella body UK Music. He is the former Deputy Leader of the Labour Party.

This article was published in the latest edition of Centre Write. Views expressed in this article are those of the author, and not necessarily those of Bright Blue. 

Read more from our June 2024 Centre Write magazine, ‘Generation AI?’ here.