A jury just found Meta designed Instagram to addict children
A Los Angeles jury found Meta and YouTube liable for deliberately designing addictive platforms targeting children. The $6M verdict could reshape 2,300 pending lawsuits, in what may be the biggest corporate accountability moment since Big Tobacco.
A Los Angeles jury just delivered the first verdict in U.S. history finding social media platforms are defective products — the same legal category as faulty car brakes or contaminated food. After 9 days of trial and 43 hours of deliberation, the jury found Meta and YouTube liable on every single count, concluding both companies acted with "malice, oppression, or fraud."
The total damages: $6 million — $3 million in compensatory damages plus $3 million in punitive damages. Meta was assigned 70% of the blame, YouTube 30%. The money is pocket change for companies worth a combined $3 trillion. The precedent is not.
"Bring them in as tweens" — what the jury saw
The plaintiff, identified as K.G.M. ("Kaley"), is now 20 years old. She started watching YouTube at age 6 and opened Instagram at age 9 — neither platform stopped her. Her longest recorded Instagram session lasted 16 hours straight. She developed anxiety, depression, body dysmorphia, and suicidal thoughts.
The jury saw internal Meta documents that revealed:
"If we wanna win big with teens, we must bring them in as tweens." — Internal Meta memo
4 million users under age 13 were on Instagram, roughly 30% of all American 10-to-12-year-olds, despite a stated minimum age of 13.
Meta employees described Instagram as "like a drug" and referred to the company as "basically pushers."
Meta's own research found that 11-year-olds were 4x more likely to return to Instagram than to competing apps.
Mark Zuckerberg himself testified during the trial. When asked about underage users, he said he "always wished" Meta had moved faster on identifying them — but claimed the company reached "the right place over time."
The features the jury called "defective"
This case broke new legal ground by targeting platform design, not user content. That distinction is critical — it means Section 230 (the law that normally shields tech companies from liability for what users post) doesn't apply here. The jury evaluated specific features as if they were physical product components:
Infinite scroll — a feed that never ends, eliminating natural stopping points
Autoplay — videos that automatically start playing the next clip without asking
Algorithmic feeds — content ranked by what keeps you watching longest, not what's most relevant (see the sketch after this list)
Notification bombardment — constant alerts designed to pull you back into the app
Beauty filters — face-altering tools that distort self-image, especially harmful to young girls
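The "algorithmic feeds" item is the easiest one to make concrete. Here is a deliberately simplified Python sketch; the post titles, scores, and field names are hypothetical, invented purely for illustration, since no company's actual ranking code is public. It shows only the design choice at issue: the same three posts reorder completely when the sort key switches from relevance to predicted watch time.

```python
# Illustrative sketch only -- hypothetical data, not any platform's real code.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    relevance: float              # how useful/on-topic the post is (0-1)
    predicted_watch_time: float   # seconds a model expects the viewer to spend

posts = [
    Post("Homework help video", relevance=0.9, predicted_watch_time=40),
    Post("Outrage-bait clip",   relevance=0.2, predicted_watch_time=300),
    Post("Friend's photo",      relevance=0.7, predicted_watch_time=15),
]

# Relevance-first feed: surface what is most useful to the viewer.
relevance_feed = sorted(posts, key=lambda p: p.relevance, reverse=True)

# Engagement-first feed: surface whatever keeps the viewer watching longest,
# the ranking objective described at trial.
engagement_feed = sorted(posts, key=lambda p: p.predicted_watch_time, reverse=True)

print([p.title for p in relevance_feed])
# ['Homework help video', "Friend's photo", 'Outrage-bait clip']
print([p.title for p in engagement_feed])
# ['Outrage-bait clip', 'Homework help video', "Friend's photo"]
```

Real ranking systems are enormously more complex, but the question of what the sort key optimizes for is exactly what the jury treated as a product-design decision rather than protected speech.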
Lead attorney Mark Lanier put it bluntly: "How do you make a child never put down the phone? That's called the engineering of addiction."
Why $6 million could cost them billions
The dollar amount is almost irrelevant. What matters is the legal precedent. This was the first case to reach a verdict out of 2,300+ pending federal cases and 1,600+ consolidated plaintiffs, a group that includes school districts and state attorneys general from across the country.
Legal experts are comparing this to the 1990s tobacco litigation, which started with small verdicts and ended with a $206 billion industry settlement that fundamentally changed how cigarettes were marketed, sold, and regulated in America.
The timeline is accelerating. Just one day before this verdict, a separate New Mexico jury ordered Meta to pay $375 million for failing to protect children from sexual predators on its platforms. Another federal trial is already scheduled for June 2026.
Snap and TikTok, which were originally also defendants in this case, settled before trial. Meta and Google chose to fight. The jury decided they were wrong.
Both companies plan to appeal — here's what to watch
Meta said it "respectfully disagrees with the verdict" and is "evaluating legal options." Google called the case a "mischaracterization of YouTube," arguing it's "a responsibly built streaming platform, not a social media site."
But appeals take years, and the floodgates are now open. Plaintiffs' lawyers in every future trial will know that a jury has already found these companies liable, and that dramatically changes the calculus for settlement negotiations.
One juror, Victoria, said after the verdict: "We wanted them to feel it... this was unacceptable."
What you can check right now
If you have children or teenagers using social media, here are concrete steps based on the specific features the jury flagged:
Instagram: Go to Settings → Time Spent → Set daily time reminders. Turn off autoplay in Settings → Account → Cellular Data Use.
YouTube: Open Settings → General → Turn off "Autoplay next video." Enable "Remind me to take a break."
Both platforms: Turn off all non-essential notifications. This removes the single biggest "pull back" mechanism the jury identified.
For parents: Meta's Family Center and Google's Family Link offer parental controls. The jury heard evidence that these were added reactively, but they do work.
The bigger picture: a jury of ordinary people looked at the internal documents, heard the testimony, and concluded that Instagram and YouTube were engineered to be addictive — and that the companies knew it, and did it anyway. Whether or not the appeals succeed, that conclusion is now part of the public record.