2026-03-19 · AI copyright · UK government · AI regulation · creators · music industry

UK drops AI copyright plan after 1,000 artists fought back

The UK government abandoned its plan to let AI companies train on copyrighted work without permission, after a massive protest from Kate Bush, Elton John, and 1,000+ musicians.


The UK government just reversed course on one of the most controversial AI policies in Europe. On March 18, Technology Secretary Liz Kendall confirmed the government no longer supports its plan to let AI companies train on copyrighted creative works by default, with creators having to opt out.

The original plan would have shifted the burden to individual artists: you'd have to tell every AI company, one by one, not to use your work. If you didn't? Fair game. After 11,500 public responses and a wave of celebrity-led protests, the government said it has "no preferred option" and won't change copyright law "until we are confident" it meets its goals.

How artists killed the opt-out plan

The backlash was extraordinary. In February 2025, over 1,000 musicians — including Kate Bush, Annie Lennox, Hans Zimmer, Damon Albarn, Cat Stevens, and Imogen Heap — released a silent album called "Is This What We Want?" The album featured recordings of empty studios and performance spaces. The tracklisting itself spelled out a message: "The British government must not legalise music theft to benefit AI companies."

The star power was immense: Sir Paul McCartney, Lord Lloyd-Webber, Stephen Fry, Ed Sheeran, Dua Lipa, and Sting all signed an open letter to The Times warning the government was letting "big tech raid the creative sectors." Sir Elton John went on the BBC and called the government "losers," comparing AI training on copyrighted works to "thievery on a high scale."

The protest was organised by Ed Newton-Rex, who called the proposals "disastrous for musicians" and "totally unnecessary." Profits from the silent album went to the charity Help Musicians.

What was the original plan?

In December 2024, the UK government proposed a new rule: AI companies could scrape and use any copyrighted content they found online for training purposes. Creators who didn't want their work used would need to "opt out" using machine-readable formats — technical tags that tell AI crawlers to skip their content.
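In practice, "machine-readable opt-out" mostly means ad-hoc crawler directives. A minimal sketch of how this works today, assuming a site owner wants to turn away a few well-known AI training crawlers (GPTBot is OpenAI's crawler, CCBot is Common Crawl's, Google-Extended covers Google's AI training; honouring the file is entirely voluntary on the crawler's side):

```
# robots.txt — request that named AI training crawlers skip this site.
# Each crawler must be listed individually; any crawler not named here
# is unaffected, and compliance is not enforced by any law or standard.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```

The list of crawlers is open-ended and growing, which is precisely the burden critics objected to: under the proposed rule, anything a creator failed to name would have been fair game.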

The problem? It's practically impossible for an individual writer, musician, or artist to contact thousands of different AI companies and monitor whether their work is being used across the entire internet. Critics likened it to asking someone to lock every door in a city, one by one, instead of making burglary illegal.

The government argued this would balance the UK's "world-leading creative industries" with its AI sector, which it said was growing 23 times faster than the rest of the economy.

What the creative industry is saying

Tom Kiehl, CEO of UK Music: Called it "a major victory for campaigners" and said industry workers should be able to work "without the constant fear that the fruits of their labour could effectively be taken by AI firms without payment or permission."

Mandy Hill, Publishers Association: Called it a victory "over the self-interest of a handful of large corporations" — but warned the government has not entirely ruled out allowing AI companies to use copyrighted content without a license in the future.

Anthony Walker, TechUK: Disappointed, saying "the UK cannot afford for this to remain unresolved" and that international competitors are "moving ahead."

Why this matters beyond the UK

This isn't just a British issue. No country has found a stable solution to the question of whether AI companies should be allowed to train on copyrighted work without paying for it.

The EU has a similar opt-out approach through its text and data mining rules, but implementation has been inconsistent. In the US, major court cases like The New York Times v. OpenAI are still unresolved. Just last week, Anthropic agreed to a $1.5 billion copyright settlement after a judge ruled that copying pirated books didn't qualify as fair use.

The UK's reversal sends a clear signal: the opt-out model is politically toxic. Creative industry coalitions — with their celebrity firepower and public sympathy — proved more powerful than tech lobbyists' arguments about economic competitiveness. Other governments are watching.

What happens next

For now, the UK is in limbo. The government says it needs more time to find a solution that works for everyone. The Musicians' Union is pushing for collective licensing schemes (think of it like a music royalty system, but for AI training) that would protect individual artists, not just major labels.

UK music contributed a record £7.6 billion to the economy in 2023. The creative sector isn't going away — and neither is the question of how AI companies should pay for the human creativity that makes their products work.

