AI for Automation
2026-03-21 · AI surveillance · data privacy · Strava · OSINT · military security · fitness apps

A 36-minute jog just exposed a nuclear aircraft carrier

A French sailor's Strava run revealed the Charles de Gaulle's exact position. AI-powered spy tools make this kind of leak far more dangerous than it was in 2018.


On March 13, a young French Navy officer laced up his running shoes aboard the Charles de Gaulle — France's only nuclear-powered aircraft carrier — and went for a 7.23-kilometer jog around the flight deck. He tracked it on Strava, the popular fitness app. His profile was set to public.

Within hours, the entire world could see exactly where France's most sensitive warship was sitting: northwest of Cyprus, roughly 100 kilometers from the Turkish coast, heading toward the Middle East on a mission President Macron had announced just 10 days earlier.

[Image: French aircraft carrier Charles de Gaulle at sea]

How a jog became a national security breach

The French newspaper Le Monde broke the story. Journalists spotted the officer's activity — a back-and-forth route barely 300 meters wide, tracing the outline of a flight deck, floating in the middle of the sea. They cross-checked it with satellite imagery taken 90 minutes later. The carrier and its escort vessels were right there.
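The tell the journalists relied on — kilometers of running distance confined to a deck-sized box over open water — is simple to express in code. Here is a minimal, illustrative sketch (the function name and the 350-meter threshold are assumptions for this example, not Le Monde's actual tooling):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    R = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def looks_like_shipboard_run(track, max_extent_m=350):
    """Flag a GPS track confined to a deck-sized bounding box.

    `track` is a list of (lat, lon) tuples. A multi-kilometer run whose
    entire bounding box fits inside ~350 m means tight back-and-forth
    laps — typical of a vessel's deck (or a stadium on land)."""
    lats = [p[0] for p in track]
    lons = [p[1] for p in track]
    # North-south and east-west extent of the bounding box, in meters
    ns_extent = haversine_m(min(lats), min(lons), max(lats), min(lons))
    ew_extent = haversine_m(min(lats), min(lons), min(lats), max(lons))
    # Total distance actually run along the track
    total = sum(haversine_m(*a, *b) for a, b in zip(track, track[1:]))
    return total > 2_000 and max(ns_extent, ew_extent) <= max_extent_m
```

A real pipeline would also check the bounding box against a coastline dataset to rule out stadium laps on land; the geometry test alone is the core signal the journalists exploited.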

But it gets worse. By scrolling through the officer's public Strava history, Le Monde tracked the entire carrier group's movements for months: off the coast of Cherbourg in February, docked in Copenhagen in late February, and into the eastern Mediterranean by mid-March. A single fitness profile mapped a nuclear warship's entire deployment route.

The French Armed Forces General Staff called it a breach of operational security rules and said it had identified "several service members" who shared location data online.

Why AI makes this far more dangerous than in 2018

In 2018, researchers first discovered that Strava's global heatmap exposed U.S. military bases in Iraq, Syria, and Afghanistan. That discovery required a human analyst to notice glowing jogging paths in the desert.

Today, AI-powered OSINT tools (open-source intelligence — software that automatically collects and analyzes publicly available data) do this at industrial scale. State intelligence agencies, particularly Russian and Chinese cyber units, systematically scrape fitness app data and cross-reference it with satellite imagery, ship-tracking databases, and social media posts — automatically, around the clock.
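At its core, that cross-referencing is a spatio-temporal join: match each scraped activity against known vessel positions that are close in both space and time. A hedged sketch of that join (the `ShipFix` record and the distance/time thresholds are invented for illustration; real systems join against AIS feeds and satellite-derived detections):

```python
import math
from dataclasses import dataclass
from datetime import datetime, timedelta

def haversine_km(a, b):
    """Great-circle distance in kilometers between two (lat, lon) pairs."""
    lat1, lon1 = a
    lat2, lon2 = b
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    h = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(h))

@dataclass
class ShipFix:
    """One known vessel position at a point in time (e.g. from AIS)."""
    name: str
    lat: float
    lon: float
    time: datetime

def correlate(activity_pos, activity_time, ship_fixes,
              max_km=5.0, max_dt=timedelta(hours=2)):
    """Return ships whose recorded position is near a fitness activity
    in both space and time — the join an automated OSINT pipeline runs
    continuously over scraped public data."""
    hits = []
    for fix in ship_fixes:
        if abs(fix.time - activity_time) > max_dt:
            continue  # too far apart in time
        if haversine_km(activity_pos, (fix.lat, fix.lon)) <= max_km:
            hits.append(fix.name)
    return hits
```

Run at scale over millions of public activities, a join like this turns each one into a query; the humans never have to go looking.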

The real-world cost: In 2024, a former Russian submarine commander was assassinated, reportedly with the help of information obtained from his public Strava profile. AI tools can correlate a single fitness profile with thousands of other data points — turning a casual jog into a targeting package.

It's not just soldiers — it's you too

If you use Strava, Garmin Connect, Apple Fitness, or any GPS-based workout tracker, you're generating the same kind of data. AI-powered advertising systems already use it to target you with location-based ads. Data brokers sell aggregated fitness data to anyone willing to pay — and the FBI has admitted to being among the buyers.

If you're a runner, cyclist, or gym-goer who uses fitness apps, here's what you should do right now:

  • Set your Strava profile to private (Settings → Privacy Controls → Profile Page → "Followers Only")
  • Enable hidden start/end points to mask where you live and work
  • Turn off Flyby so strangers can't see your real-time location
  • Review which apps have access to your location data and revoke anything unnecessary

Eight years of warnings — and militaries still can't fix it

The U.S. Department of Defense banned fitness trackers in operational areas in 2018. France has issued repeated security reminders. Sweden's Prime Minister's bodyguard detail was compromised through Strava in 2024. And yet, here we are in 2026, with a nuclear aircraft carrier exposed by a morning run.

The uncomfortable truth: no military has figured out how to enforce a fitness app ban across hundreds of thousands of service members who carry internet-connected devices everywhere. And as AI surveillance tools get smarter, every public data point becomes more valuable — and more dangerous.

