Inside the Music Industry's Battle with the UK Government over AI Song Generators
Universal Music Group has been asking music streaming services like Spotify to stop developers from scraping its material to train AI bots to make new songs. The label, which controls about a third of the recorded music industry, has also been issuing a substantial number of takedown requests against AI-generated uploads appearing online.
It is the latest move in the music industry’s growing battle to prevent AIs from using its songs without licensing them. Behind these efforts to enforce copyright lies a bigger worry: how governments will balance the interests of AI developers against human creativity.
In particular, the UK government is threatening to water down copyright law to benefit tech companies, at the expense not only of the music industry but also of other creative sectors such as literature, film and photography. So what’s going on?
AI music and copyright
On a “royalty free music generator” like Mubert, it’s already possible to type in a prompt and the program will use AI to search a catalogue of music for patterns. Tell it to play a “fast voodoo rhythm in the style of a nursery rhyme with some pretty electronics”, and it will copy parts of songs that correspond and generate music to match. You can also generate music that sounds like a particular artist, and whatever tracks you create are downloadable.
Mubert claims to be “on a global mission to empower creators”. It is unclear how that squares with not paying human creators royalties for the use of their music. Mubert even emphasises that its audio material is made “from real musicians and producers”, recognising that the value in the music comes from human creators.
Music is protected by copyright law, which means that anyone wanting to use a song has to pay for a licence. This ensures that rightsholders and creators are properly paid for their creativity. For example, Spotify pays licence fees to record labels and artists to put music on its platform. The same is true of everyone from bars, cafes and pubs playing records for their customers to artists sampling someone else’s song in their new track.
If AI programs are using labels’ music catalogues without permission, they could be infringing music rights in at least two ways: by using the music to train the AIs, and by reproducing parts of that music in the output the AI generates from the training data.
If the streaming platforms were seen to have facilitated such illegal activity, they could be held liable for secondary copyright infringement, comparable to an illegal downloading platform like The Pirate Bay.
Unfortunately for the music industry, the UK government has been muddying the waters with proposals to change the copyright rules to benefit tech companies. A few months ago, it floated the idea of making an exception for the first type of infringement: using music catalogues as training data. This would also apply to other artistic works like videos and photographs.
There are already copyright exceptions in the UK where permission for reuse is unnecessary, such as “criticism, review or quotation”, though there are limitations to make sure this is done fairly.
When governments want to create a new exception, they must satisfy the three-step test set out in the Berne Convention: the exception must be confined to certain special cases, must not conflict with the normal exploitation of the work, and must not unreasonably prejudice the legitimate interests of the rightsholder. In my view, the UK proposal fails all three steps and would be contrary to international law.
The battle for the UK
The proposed exception was met with widespread objections, with only 13 of the 88 responses to the consultation in favour. The House of Lords Communications and Digital Committee said the proposal was “misguided” and should be scrapped. The government then appeared to backtrack in February, with science minister George Freeman saying it would not take the exception forward.
In March, however, it published a white paper, A Pro-Innovation Approach to AI Regulation, which raised the prospect that it might revive its previous approach. The white paper prioritises making the UK a tech-friendly environment, emphasising “the role of regulation in creating the environment for AI to flourish”. It mentions risks to things like mental health, privacy rights and human rights, but says nothing about threats to intellectual property (IP).
This comes at a time when governments around the world and international organisations such as the World Intellectual Property Organization are considering how laws need to adapt to AI. Japan and Singapore are already introducing copyright exceptions along similar lines to those being discussed in the UK. Those moves are also a major concern for the creative industries, but not to the same extent as the UK proposal, because the UK tends to be particularly influential in IP law around the world.
There are no proposals for copyright exceptions in the US or the EU. Indeed, US IP law is currently being tested by the photography giant Getty Images against an AI operator called Stability AI, which has been scraping its images to generate new ones. US copyright has a “fair use” exception which could potentially be a defence for such operators, so Getty wants confirmation that this is not the case. It has also filed a case along the same lines in the UK, which is at an earlier stage.
This all boils down to whether we still believe human creativity deserves greater protection than machine creativity. Courting the tech sector might seem like a good strategy for the UK, but the creative industries contribute hugely to the economy – £109 billion in 2021, or nearly 6% of total GDP.
The value of music also goes beyond raw economics, offering emotional comfort and health benefits, and even inspiring social, political and economic change. The creators should arguably be rewarded for this too, whether they are responsible for composing music directly or providing the material that AI repurposes.
Copyright law is supposed to ensure that creators are fairly remunerated for their work. When that work brings such value to the world, there is a strong argument for protecting it.