
MU Briefing: Artificial Intelligence, AI Training and AI Generated Music

A briefing from the MU on the urgent need to regulate AI, highlighting why government plans to rely on opt-outs are unfair and impractical. It explains how copyright is vital to the creative industries, why fair pay for creators has strong public backing, and outlines our key demands to government.

 

The Musicians’ Union (MU) is a Labour Party affiliated trade union representing over 36,000 musicians across the UK, working in all sectors of the music industry.

The MU is campaigning for specific measures to protect musicians and creators working in the UK's multi-billion pound creative industries from the impacts of generative artificial intelligence (AI).

Why AI needs to be regulated

Generative AI directly competes with the creative works it is trained on.

A 2024 Harvard Business Review study showed that the introduction of ChatGPT reduced writing jobs by 30% and coding jobs by 20%, while the introduction of AI image generators reduced image creation jobs by 17%.

New research by Queen Mary University of London, the Institute for the Future of Work and The Turing Institute reveals this is already having an impact on pay, job security and finding work:

  • 55% of creative workers report diminished financial compensation for their work as a result of generative AI
  • 68% feel their job security is diminished as a result of generative AI
  • 61% report that the value placed on their work by others has diminished as a result of generative AI.

Allowing AI companies to train models on copyrighted work without fair compensation undermines both the rights and the livelihoods of creators.

Government’s proposal to rely on opt-outs is impractical and deeply unfair to creators

Evidence suggests that most people eligible for opt-out schemes don't realise they can opt out and miss the chance to do so, meaning their work is used without permission or compensation.

In fact, two years into the mass adoption of generative AI, 60% of artists haven’t even heard of robots.txt, the most widely used opt-out scheme.
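For illustration, opting out via robots.txt means a creator adding entries like the sketch below to the robots.txt file on their own website. The crawler names shown (GPTBot, Google-Extended, CCBot) are examples of tokens used by AI-related crawlers, not an exhaustive list; compliance is voluntary on the crawler's part, and the mechanism only covers future scraping of that site, not works already copied into training datasets or hosted on platforms the creator does not control.

  # Illustrative robots.txt entries asking named AI training crawlers
  # not to collect content from this site
  User-agent: GPTBot
  Disallow: /

  User-agent: Google-Extended
  Disallow: /

  User-agent: CCBot
  Disallow: /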

Only 5% of creative workers both know that their work has been used to train AI models and have given permission for that use.

Relying on opt-outs would disproportionately harm small creators. They are the least likely to reserve their rights and the most vulnerable to being out-competed by AI models trained on their work. Evidence from Cloudflare supports this: analysing the top 1 million websites by number of visits, it found that the percentage blocking AI crawlers increased with the number of visits a site receives.

Copyright underpins the UK’s multi-billion pound creative industries

Weakening copyright protections as outlined in the government's proposal would jeopardise not only the livelihoods of countless creators but also the country’s broader economic interests.

The UK music industry is worth £7.6 billion to the UK economy and supports over 200,000 jobs.

Music is part of the wider creative industries, which are worth over £124bn to the economy - or over £14m an hour - and support 2.4m jobs.

Businesses in the creative industries make up close to 10% of all UK-registered businesses. Many of these are small and medium-sized businesses; 78.2% have a turnover of less than £250,000.

Paying creators for their work has broad public support

Recent data from YouGov shows that 72% of the public believe creators should be paid by AI companies for the use of their work in training datasets.

This aligns with previous research from the AI Policy Institute in the US that shows that 74% of the public think AI companies should compensate creators for training, vs. 9% who think they shouldn't.

37,000 people, including many of the UK's most respected authors, actors and musicians, recently signed a Statement on AI Training that rejected unlicensed generative AI training - which is, somewhat inconceivably, precisely what the government is now proposing.

Five key things the government can do to protect musicians and creators

The Musicians’ Union is calling on the government to implement the following measures:

  1. Consent: tech firms should be legally required to uphold copyright and get explicit consent from human creators to use their works to train AI models
  2. Labelling: AI-generated works should be clearly labelled so people can choose what they listen to
  3. Fair remuneration: composers and performers should be guaranteed a fair share of the revenue from AI-generated music
  4. AI laws: AI firms should be required to keep a published and regularly updated record of the works they have used for training so that composers and performers can find out if their work has been used and take action; this must be combined with a clear obligation on AI firms to be receptive, fast-acting and proactive in confirming that action has been taken
  5. Publicity, personality and personal data rights: these rights must be strengthened to ensure the UK remains competitive on a global stage, and must belong to individuals so that they cannot be exploited by AI companies, major record labels or other rightsholders without the explicit consent of the individual.