2026-03-20 · AI regulation · deepfake · UK · AI ethics · advertising

UK bans AI ad that implied users could undress women

The UK's ad regulator banned a YouTube AI app ad for implying it could digitally remove women's clothing — part of a broader crackdown on AI deepfake tools.


The UK's advertising watchdog just banned a YouTube ad for an AI video-editing app after it implied users could digitally remove a woman's clothing. The ruling marks one of the first formal enforcement actions against AI apps marketing deepfake-style capabilities — and it comes as the UK government moves to criminalize such tools entirely.


'Erase anything' — the ad that crossed the line

The ad, which ran on YouTube in January 2026, promoted PixVideo – AI Video Maker, an app made by a company called Saeta Tech. It showed a before-and-after image of a young woman: in the "before" shot, red scribble covered her midriff. In the "after" shot, the scribble was gone — revealing bare skin, including underneath her shorts.

The text at the bottom read "Erase anything" alongside a heart-eyes emoji. The app's tagline: "No creative boundaries."

Eight people reported the ad to the Advertising Standards Authority (ASA) — the UK's independent advertising regulator — complaining that it sexualized and objectified women.

The regulator's verdict: 'irresponsible and offensive'

The ASA agreed. In its ruling published March 18, 2026, the authority said the ad "reduced the woman to a sexual object" and "condoned digitally altering and exposing women's bodies without their consent."

The ASA found the ad breached three rules of the CAP Code, the UK's non-broadcast advertising code:

Rule 1.3 — Social responsibility

Rule 4.1 — Harm and offence

Rule 4.9 — Harmful gender stereotypes

Here's the key detail: PixVideo doesn't actually let users create nude images. The ASA acknowledged this. But the regulator ruled that the impression the ad created was enough — because it suggested such use was possible and desirable.

Saeta Tech pulled all ads — and launched an internal review

Saeta Tech told the ASA it had already removed the ad and voluntarily suspended all advertising across every platform while it conducts an internal audit of its marketing materials. The company acknowledged the ad "risked implying uses they did not support or allow."

The ad must not appear again in its current form.

Why this matters: the UK is criminalizing AI 'nudification'

This ruling didn't happen in a vacuum. The UK government has been escalating its crackdown on AI-generated deepfakes — especially tools that digitally undress real people.

In February 2026, new criminal offences under the Data Use and Access Act made it illegal to create explicit deepfakes (AI-generated sexual images of a real person) without consent. These rules were fast-tracked after the Grok AI controversy, when X's AI chatbot was caught generating non-consensual intimate images.

On top of that, the Crime and Policing Bill — currently in the House of Lords — will make it a criminal offence to create or distribute AI "nudification" tools (apps specifically designed to remove clothing from photos of real people). This targets the providers, not just individual users.

And new rules will require social media platforms to take down abusive AI-generated images within 48 hours of a single report.

What AI app makers should learn from this

The ASA's ruling sends a clear signal: even if your AI app can't actually create explicit content, marketing it in a way that suggests it can is enough to get banned.

For anyone building or marketing AI image and video tools, this case sets a precedent. The "creative freedom" angle — letting users imagine anything is possible — is exactly what regulators are watching for. Perception matters as much as capability.

For everyday users, the message is simpler: governments are starting to draw hard lines around what AI can and can't do with people's images. The UK is leading on enforcement, but the EU's AI Act and proposed US state laws are moving in the same direction.

