Tesla's AI can't see in fog — 3.2M vehicles face recall
NHTSA escalated its Tesla FSD investigation to 3.2 million vehicles after finding the AI driving system fails to detect fog and sun glare — the step before a recall.
The U.S. government's auto safety agency just took the last step before it can force Tesla to recall 3.2 million vehicles. The reason: Tesla's "Full Self-Driving" AI system can't reliably see the road when the sun is in its eyes or the weather turns foggy.
What NHTSA found — and why it's escalating
On March 18, 2026, the National Highway Traffic Safety Administration (NHTSA) — the U.S. agency responsible for vehicle safety — upgraded its Tesla FSD investigation from a "Preliminary Evaluation" to an "Engineering Analysis." In plain terms: this is the last stage before the government orders a recall.
The core finding is alarming. Tesla's FSD system relies entirely on cameras to see the road (no radar, no lidar). When those cameras are impaired by sun glare, fog, dust, or low light, the system is supposed to warn the driver and hand back control. But according to NHTSA, it doesn't — or at least, not in time.
NHTSA's key finding:
"The system did not detect common roadway conditions that impaired camera visibility and/or provide alerts when camera performance had deteriorated until immediately before the crash occurred."
Translation: by the time the car told drivers there was a problem, it was already too late.
The crashes that triggered the probe
The investigation now covers 9 crashes in which FSD was active in poor visibility conditions. The numbers:
• 9 total crashes identified in reduced visibility
• 1 fatality — a pedestrian struck on November 28, 2023
• 1 injury
• 6 more incidents still being examined
• 3.2 million vehicles potentially affected — including Model S, X, 3, Y, and Cybertruck
The investigation launched in October 2024 with just 4 crashes. NHTSA has since identified 5 more, plus 6 additional incidents that may also be related.
Tesla's response raised more questions
Two details stand out from Tesla's interactions with the safety agency.
First, a 7-month reporting delay. The fatal pedestrian crash happened in November 2023. Tesla didn't file the required safety report until June 27, 2024. The very next day, June 28, Tesla began developing a fix for the visibility detection system — suggesting the company knew the problem was serious.
Second, Tesla admits its own fix wouldn't help most of the crashes. Tesla's internal analysis conceded that its updated detection system "may have affected" only 3 of the 9 identified crashes. That means even Tesla's own engineers acknowledge their fix wouldn't have prevented 6 of the incidents.
NHTSA also flagged that Tesla may be under-reporting related crashes, citing "data and labeling limitations" in Tesla's records.
This is the third active federal probe into FSD
This isn't the only investigation Tesla's self-driving system is facing. The new Engineering Analysis (EA26002) joins at least two other open federal probes into FSD behavior — making it one of the most scrutinized AI systems in any consumer product.
Engineering Analyses typically conclude within 18 months and end in one of two ways: the case is closed, or a mandatory recall is issued. Given the fatality, the expanded scope, and Tesla's own admission that its fix is limited, many analysts expect a recall.
The bigger picture: AI driving vs. human driving
The irony is that while Tesla's AI struggles with visibility, Waymo's self-driving cars — which use cameras, radar, and lidar together — recently reported 92% fewer serious-injury crashes compared to human drivers across 170.7 million miles of rider-only driving.
The contrast highlights a fundamental design choice. Tesla bet on a camera-only approach (called "vision-only") to keep costs down. But cameras, like human eyes, struggle with glare and fog. Radar and lidar (a laser-based sensor that measures distance) work in conditions cameras can't.
For the 3.2 million Tesla owners affected, the question is now simple: how much do you trust your car's AI when conditions get tough? Until this investigation concludes, NHTSA's answer is clear — not enough.