2026-03-19 · AI regulation · federal AI bill · copyright · Section 230 · AI policy

The first federal AI bill just dropped — here's what it bans

Senator Blackburn's draft AI bill says training on copyrighted work isn't fair use, kills Section 230, and requires quarterly job loss reports.


Senator Marsha Blackburn (R-Tenn.) just unveiled the first discussion draft of a comprehensive federal AI bill — and it could reshape how every AI company in America operates. The bill establishes a national "duty of care" requiring AI developers to prevent and mitigate foreseeable harm to users.

If this becomes law, it won't just affect OpenAI or Anthropic. It will change how you interact with every AI tool you use.


AI training on copyrighted work? No longer "fair use"

The single biggest bombshell in this draft: using copyrighted content to train AI models is explicitly not fair use under the Copyright Act. That one sentence could upend the entire foundation that companies like OpenAI, Google, and Meta have built their models on.

Right now, most AI companies argue that scraping books, articles, images, and music to train their models counts as "fair use" — a legal doctrine that allows limited use of copyrighted material without the rights holder's permission. This bill says: no, it doesn't.

For creators — writers, artists, musicians, photographers — this is potentially the biggest legal win in the AI era. For AI companies, it means licensing deals or lawsuits.

Section 230 is gone

The draft also proposes eliminating Section 230 protections for tech platforms. Section 230 is the law that currently shields platforms like Facebook, X, and YouTube from being sued over content their users post. Without it, platforms could be held legally responsible for AI-generated content that causes harm.

This is a massive shift. If a chatbot gives dangerous medical advice or generates defamatory content, the company behind it could face lawsuits — with no legal shield.

Your voice and face are now protected

Key protections in the bill:

  • Biometric defense: It's illegal to create digital replicas of someone's voice or face without their consent
  • Child safety: Platforms must implement safeguards protecting users under 17 from AI-generated harm
  • Transparency: Federal guidelines for marking, authenticating, and detecting AI-generated content
  • Job tracking: Companies must file quarterly reports with the Department of Labor on AI-related job displacement
  • Bias audits: Third-party audits to prevent discrimination — including based on political affiliation

Quarterly job displacement reports

Here's a provision that hasn't gotten enough attention: companies would have to report to the Department of Labor every three months how many jobs their AI systems have displaced. This is the first time any federal legislation has proposed mandatory, recurring transparency on AI's impact on employment.

This comes at a particularly sensitive moment. A recent poll found that 60% of Americans want AI companies to pay for lost jobs. Senator Blackburn's bill doesn't go that far — but forcing companies to publicly report displacement numbers is a significant first step.

Who wins and who loses

Winners:

Content creators, artists, musicians, and writers who've argued their work was used without permission or compensation. This bill hands them the legal ammunition they've been waiting for.

Losers:

AI companies that trained models on massive datasets of copyrighted content without licensing it. If this passes, they'll either need to negotiate deals retroactively or retrain from scratch on licensed data.

Don't celebrate (or panic) yet

This is a discussion draft — the very first step in a long legislative process. It was created in response to President Trump's December executive order directing Congress to establish AI guardrails. Significant negotiations, amendments, and compromises will follow before anything becomes law.

But the direction is clear: Washington is moving from "let AI innovate freely" to "AI companies have responsibilities." The EU already passed its AI Act. Now the U.S. is catching up.

For everyday AI users, the practical impact will take time. But if you're a creator whose work has been used to train AI models, this bill is the first serious sign that the law might finally be on your side.

