He used AI to make thousands of songs — and stole $8M from Spotify
A North Carolina man just pleaded guilty in the first-ever U.S. criminal case for AI music streaming fraud — 661,000 fake streams per day, billions total, $8M stolen.
A man from North Carolina just admitted to using AI to generate hundreds of thousands of fake songs, streaming them billions of times through bot accounts, and pocketing $8 million in stolen royalties. It's the first criminal case of its kind in the United States — and it shows how AI is being weaponized against the music industry.
Michael Smith, 54, of Cornelius, North Carolina, pleaded guilty on March 19, 2026, to conspiracy to commit wire fraud in a New York federal court. He faces up to five years in prison and must forfeit the entire $8 million. Sentencing is set for July 29, 2026.
How the scam worked — in plain English
Here's what Smith did, step by step:
Step 1: He used AI music generators to create hundreds of thousands of songs. These weren't hit records — just enough to pass as real music on streaming platforms.
Step 2: He uploaded them to Spotify, Apple Music, Amazon Music, and YouTube Music using digital distribution services.
Step 3: He created thousands of fake bot accounts on each platform — automated listeners that streamed his AI songs 24/7.
Step 4: By spreading the fake streams across thousands of accounts and songs, he stayed under each platform's fraud detection radar.
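The evasion math behind Step 4 is straightforward: divide a large total across enough accounts and songs and every individual number looks unremarkable. Here's a rough sketch — the account and catalog sizes are assumptions for illustration, since the filings say only "thousands" of accounts and "hundreds of thousands" of songs:

```python
# Hypothetical sizes; court filings give only rough magnitudes,
# not exact account or catalog counts.
DAILY_STREAMS = 661_440   # peak daily total reported in the case
BOT_ACCOUNTS = 10_000     # assumption: "thousands" of accounts
CATALOG_SIZE = 200_000    # assumption: "hundreds of thousands" of songs

# Divide the total evenly to see what each account/song looks like
# to a fraud-detection system.
streams_per_account = DAILY_STREAMS / BOT_ACCOUNTS  # ~66 plays/day per bot
streams_per_song = DAILY_STREAMS / CATALOG_SIZE     # ~3.3 plays/day per song

print(f"Per account: {streams_per_account:.0f} streams/day")
print(f"Per song:    {streams_per_song:.1f} streams/day")
```

A single account playing 66 songs a day, or a song getting 3 plays a day, is indistinguishable from an ordinary listener or an obscure indie track — which is exactly the point.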
At peak operation, Smith's bot army was generating approximately 661,440 streams per day — earning him over $1.2 million per year in royalties. The money came directly from the same pool that pays real artists, meaning every dollar Smith stole was a dollar taken from actual musicians.
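The article's own figures imply the going rate per stream. A quick back-of-the-envelope check:

```python
# Figures reported in the case.
DAILY_STREAMS = 661_440          # peak daily bot streams
ANNUAL_ROYALTIES = 1_200_000     # reported yearly payout, USD

yearly_streams = DAILY_STREAMS * 365            # ~241.4 million streams/year
per_stream = ANNUAL_ROYALTIES / yearly_streams  # implied payout per stream

print(f"{yearly_streams:,} streams/year")
print(f"${per_stream:.4f} per stream")
```

That works out to roughly half a cent per stream — consistent with typical streaming royalty rates, and a reminder of why the scheme needed such enormous volume to be profitable.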
How he got caught
The Mechanical Licensing Collective (MLC) — the U.S. organization that collects and distributes digital royalties for songwriters — spotted the fraud first. Smith's artists were "digital ghosts": they had massive streaming numbers but zero social media presence, no press coverage, no concert history, and no fans.
Streaming platforms then confirmed the suspicious patterns: sudden spikes in plays with no corresponding real-world activity, streams that didn't match any promotional campaigns, and listening patterns that looked robotic rather than human.
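The "digital ghost" signal described above can be sketched as a simple heuristic. This is an illustrative toy, not the MLC's actual detection logic; the field names and threshold are assumptions:

```python
# Toy "digital ghost" check in the spirit of the signals described
# above. Fields and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ArtistProfile:
    monthly_streams: int
    social_followers: int
    press_mentions: int
    concerts_played: int

def looks_like_digital_ghost(a: ArtistProfile,
                             stream_threshold: int = 1_000_000) -> bool:
    """Flag artists with huge stream counts but no real-world footprint."""
    has_big_numbers = a.monthly_streams >= stream_threshold
    has_no_footprint = (a.social_followers == 0
                        and a.press_mentions == 0
                        and a.concerts_played == 0)
    return has_big_numbers and has_no_footprint

# A profile like Smith's fake artists: massive plays, zero footprint.
ghost = ArtistProfile(monthly_streams=5_000_000, social_followers=0,
                      press_mentions=0, concerts_played=0)
print(looks_like_digital_ghost(ghost))  # prints True
```

Real detection systems weigh many more signals (listening-session timing, device fingerprints, geographic spread), but the core idea is the same: fraud shows up as a mismatch between streaming numbers and everything else.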
Why this matters for creators and listeners
If you're a musician or podcaster, this case is a warning: AI makes it trivially easy to flood streaming platforms with fake content, and every fake stream dilutes the royalty pool — meaning real artists get paid less.
If you're a Spotify or Apple Music subscriber, your monthly fee is partially funding these payouts. When fraudsters game the system, your subscription money goes to bots instead of the artists you actually listen to.
For AI developers and policymakers, this is the first criminal conviction, but it won't be the last. Court documents reference unnamed co-conspirators, including an AI music company CEO and a music promoter — suggesting a broader network that prosecutors may still be investigating.
The first domino in a bigger crackdown
Smith originally faced three felony counts — wire fraud, wire fraud conspiracy, and money laundering conspiracy — each carrying up to 20 years. His plea deal reduced the charges to a single conspiracy count with a five-year maximum.
But prosecutors made it clear: this is the beginning, not the end. As AI music tools become more sophisticated, the barrier to pulling off this kind of fraud drops to nearly zero. Anyone with an AI music generator and some basic programming knowledge could theoretically replicate Smith's scheme.
The music industry is now racing to build better detection systems. But with AI-generated songs becoming increasingly indistinguishable from human-made music, the question isn't whether this will happen again — it's how many times it's already happening right now.