AI for Automation
2026-05-12 · AI age verification · GUARD Act · RightsCon 2026 · digital rights · AI regulation · EFF · China censorship · online privacy

AI Age Verification Bill Advances as China Kills RightsCon

RightsCon 2026 cancelled under Chinese pressure. Congress's GUARD Act now mandates real ID for AI companions — EFF warns it's a surveillance system.


AI regulation and digital rights converged in a single week of crises: the world's largest digital rights conference was cancelled by foreign government pressure, while U.S. Congress advanced the GUARD Act requiring your real-world identity as a prerequisite for chatting with an AI companion. Separated by 7,000 miles, these two events share the same underlying logic: restrict access to open conversation by controlling who is allowed to participate — and under what conditions.

RightsCon 2026, scheduled for Lusaka, Zambia, was cancelled just days before its opening. Participants had already arrived. Congress, meanwhile, narrowed the GUARD Act — once a sweeping regulation touching nearly every AI chatbot — to target only "AI companions." The narrowing sounds like progress. The EFF (Electronic Frontier Foundation, the largest digital civil liberties organization in the world) and more than 18 allied groups argue it is not.

RightsCon 2026: The Conference China Cancelled


RightsCon has operated since 2011 — fifteen years as of 2026. It functions as one of the only global forums where human rights defenders, technology researchers, investigative journalists, and policy advocates from every region of the world can meet face-to-face without government intermediaries. Civil society organizations (non-governmental groups working on human rights, press freedom, and public interest technology) treat it as critical infrastructure for cross-border coordination that cannot be replicated through encrypted messaging apps or virtual summits.

The 2026 edition was scheduled for Lusaka, Zambia. Days before it was set to open — with participants already on the ground — the Zambian government cancelled it. The government acted under direct pressure from China, which had demanded two conditions that organizers considered incompatible with the conference's core mission:

  • Taiwanese participants be excluded from attending
  • Conference organizers moderate or restrict discussions that the Chinese government considers politically sensitive

RightsCon organizers refused. The conference was cancelled outright — not rescheduled, not relocated. The response from the global digital rights community was immediate and unanimous:

  • Access Now called it "the far reach of transnational repression targeting civil society"
  • Index on Censorship warned of "a dangerous escalation in attempts to suppress open dialogue"
  • IFEX — the global freedom of expression network — called it "a blow not just to one conference, but to freedom of expression and assembly everywhere"
  • The U.N.'s World Press Freedom Day ceremony was scaled down during the same period, and its press freedom prize was postponed

Diplomatic Leverage as a Censorship Tool

What distinguishes this situation is the mechanism. Transnational repression (when one country uses another government's systems to silence dissent beyond its own borders) typically involves surveillance, threats against family members, or legal systems weaponized against diaspora communities. This case used a blunter instrument: direct economic and diplomatic pressure on a host government.

Zambia maintains significant financial relationships with Chinese lenders and infrastructure investors — relationships that create structural leverage without requiring a single law or arrest. China communicated that hosting an unrestricted RightsCon carried costs. Zambia's government complied. No website was blocked. No participant was detained.

For digital rights advocates, this is especially damaging because RightsCon specifically serves communities already living under digital repression. Activists from countries with state surveillance infrastructure, journalists covering authoritarian regimes, and researchers documenting government censorship all rely on in-person coordination that cannot be monitored through home-country digital channels. Remove the conference, and you remove one of the few infrastructure-independent ways these communities can organize globally.

The EFF and allied organizations plan to continue work through the Global Gathering and FIFAfrica events — regional alternatives operating in smaller, distributed formats that are harder to cancel through single-host-country pressure. But as the EFF notes, these are not substitutes for a 15-year institution with established global participation across every region and issue area.

The GUARD Act: Narrower in Scope, Unchanged in Method


Across the Atlantic, the GUARD Act was moving through revision. In its original form, it would have applied to nearly every AI-powered conversational tool — chatbots, AI search engines, writing assistants. After sustained criticism from the digital rights community, Congress narrowed it to target only "AI companions": systems that maintain a persistent identity (a consistent named persona rather than a fresh session each time) and simulate emotional or social interactions with users.

The revised scope removes millions of tools from compliance requirements. But the enforcement mechanism — mandatory age verification (a process requiring users to prove their real age using real-world identity documents before accessing a service) tied to financial records, mobile OS accounts, or app store credentials — remains entirely unchanged. The EFF and more than 18 partner organizations argue that this mechanism is itself the core problem, regardless of how narrowly "AI companion" is defined.

AI Age Verification Is Surveillance Architecture — By Design

Age verification systems require identity-linked data. There is no privacy-preserving method to confirm someone's age without either trusting a credentialed third party or collecting identity data yourself. The GUARD Act as written requires verification through financial records, mobile OS accounts (such as Apple ID or Google account, both tied to a legal name and payment method), or app store credentials — all of which permanently link a person's online activity to their real-world identity.

The EFF's documented position: age verification systems are surveillance systems (databases that record who you are and which services you access, when). Once built, such databases become high-value targets for data breaches, government subpoenas, and commercial data aggregation. The question is not whether they will be compromised, but when.
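The structural point, that the act of verifying age through identity documents itself creates a durable record linking a person to a service, can be sketched in a few lines. This is an illustrative model only: the names (`VerificationRecord`, `verify_age`) are hypothetical and do not correspond to any real provider's system or the GUARD Act's actual text.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class VerificationRecord:
    """Hypothetical minimum data an identity-based age check retains."""
    legal_name: str   # taken from the ID document or platform account
    birth_year: int
    service: str      # which AI service requested the check
    verified_at: str  # when the check happened

def verify_age(legal_name: str, birth_year: int, service: str,
               log: list[VerificationRecord], minimum_age: int = 18) -> bool:
    """Confirm age, with the side effect the EFF warns about:
    the check produces a record tying a real identity to a service."""
    log.append(VerificationRecord(
        legal_name=legal_name,
        birth_year=birth_year,
        service=service,
        verified_at=datetime.now(timezone.utc).isoformat(),
    ))
    # The boolean answer is trivial; the appended log entry is the
    # surveillance artifact, and it persists regardless of the answer.
    return datetime.now(timezone.utc).year - birth_year >= minimum_age

log: list[VerificationRecord] = []
allowed = verify_age("Jane Doe", 1990, "example-ai-companion", log)
print(allowed, len(log), log[0].service)
```

Even in this toy version, answering a yes/no question leaves behind a who-used-what-when record — exactly the kind of database the EFF describes as a target for breaches, subpoenas, and aggregation.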

The practical exclusions fall hardest on communities that already face the most barriers:

  • Millions of Americans lack current government-issued photo ID, disproportionately including elderly people, low-income households, and highly mobile populations
  • Unbanked households — an estimated 5.9 million U.S. households with no bank account — cannot use financial records for verification
  • Undocumented residents cannot engage legal identity systems without exposure risk
  • Privacy-conscious teenagers — the exact population the bill aims to protect — become the most exposed to centralized databases logging their AI service usage
  • Parents who want to allow their children to use these tools still face mandatory compliance requirements, removing family-level choice

A second problem is definitional. The bill's definition of "AI companion" remains vague at the margins. Customer service bots with conversation memory, mental health apps, and educational tutoring systems could all fall within scope depending on how regulators interpret "persistent identity simulating emotional interactions." Congress simultaneously left the definition unclear and sharply increased penalties for companies that misinterpret it — a combination that incentivizes over-restriction. Businesses facing that trade-off will restrict access far beyond what legislators intended rather than risk liability.

The depth of concern in technical communities is visible in the numbers: the EFF's Age Verification Hub post generated 375 upvotes and 355 comments on Hacker News, a platform heavily used by software developers and security researchers who understand the systemic implications of building identity-linked access infrastructure at scale.

Two Stories, One Pattern

RightsCon's cancellation and the GUARD Act's advancement are not the same kind of event. One is geopolitical pressure closing a physical meeting space; the other is domestic legislation creating identity requirements for digital access. But both use protective framing — diplomatic courtesy in Zambia, child safety in Washington — to restrict who can participate in open conversation. And both disproportionately affect the communities with the least institutional power to push back.

The EFF frames both as part of a global pattern of shrinking civic space: the gradual reduction of physical and digital forums where civil society can organize freely, through mechanisms that are technically legal and rarely visible until the space is already gone. Whether the mechanism is Chinese diplomatic pressure on a Zambian conference venue or congressional age verification requirements on an AI chat service, the operational result is the same — fewer people have access to open conversation, and the ones excluded are rarely the ones who can afford to appeal the decision.

You can track the GUARD Act's next legislative steps and the digital rights community's response at the EFF's age verification resource hub. To understand how AI regulation may affect the tools you use daily, explore the AI regulation guides at AI for Automation.

