This app runs AI on your phone — offline, no cloud, 95% smaller
Multiverse Computing's CompactifAI shrinks AI models by 95% using quantum math so they run on your phone — fully offline, with only 2-3% accuracy loss.
What if you could run ChatGPT-level AI on your phone — without internet, without sending your data anywhere, and without a $20/month subscription? CompactifAI, a new app from Spanish startup Multiverse Computing, just made that possible.
Quantum math makes giant AI models phone-sized
The core breakthrough is model compression — making AI models dramatically smaller without losing their intelligence. Most compression techniques lose 20-30% accuracy when they shrink a model significantly. CompactifAI uses quantum-inspired mathematics — tensor networks, a family of techniques borrowed from quantum computing research — to compress AI models by up to 95% while keeping accuracy within a 2-3% margin of the original.
To put that in perspective: imagine taking a 500-page encyclopedia and condensing it into a 25-page booklet — and the booklet still gets 97% of the answers right.
- Up to 95% compression — models become 20x smaller
- Only 2-3% accuracy loss — vs. industry standard of 20-30% loss
- Real-world proof: built a road hazard detection system with 83% compression and zero accuracy loss
- Works with models from OpenAI, Meta, DeepSeek, and Mistral
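Multiverse hasn't published CompactifAI's exact algorithm, but tensor-network compression builds on the same core idea as low-rank factorization: replace one huge weight matrix with two thin factors. A toy sketch of that idea (the matrix sizes, rank budget, and use of plain SVD here are illustrative assumptions, not the product's method):

```python
import numpy as np

# Toy illustration of low-rank weight compression, the simplest member of
# the factorization family that tensor-network methods generalize.
rng = np.random.default_rng(0)

# Stand-in for one weight matrix from a model layer (1024 x 1024).
W = rng.standard_normal((1024, 1024))

# Truncated SVD: keep only the top-k singular values and vectors.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
k = 25  # rank budget

# Store two thin factors instead of the full matrix.
A = U[:, :k] * s[:k]  # shape (1024, k)
B = Vt[:k, :]         # shape (k, 1024)

original_params = W.size              # 1,048,576
compressed_params = A.size + B.size   # 51,200
print(f"compression: {1 - compressed_params / original_params:.1%}")
# prints "compression: 95.1%"
```

Note the caveat: a random matrix like this one has no low-rank structure, so aggressive truncation would wreck its accuracy. Trained model weights do have exploitable structure, which is why techniques in this family can cut parameters drastically with only a small accuracy hit.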
How the app actually works
The CompactifAI app looks and feels like ChatGPT or Claude — you type a question, you get an answer. The difference is what happens behind the scenes:
- Everyday questions → handled entirely on-device by the built-in model, no internet required
- Complex questions ("Analyze this 10-page document", "Write a detailed business plan") → routed to cloud models via API for more processing power
The app's built-in model is called Gilda — small enough to run on a phone, smart enough for most daily tasks. When it hits its limits, it seamlessly switches to more powerful cloud models.
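The routing described above can be sketched as a simple dispatcher. Everything here is an illustrative assumption — the function names, the word-count heuristic, and the threshold are invented for the sketch; the real app's complexity check is not public:

```python
# Hypothetical sketch of local-vs-cloud routing, as the article describes it.
# "Gilda" is the app's real on-device model name; the code around it is
# purely illustrative.

def is_complex(prompt: str, max_local_words: int = 200) -> bool:
    """Crude stand-in heuristic: long prompts go to the cloud."""
    return len(prompt.split()) > max_local_words

def local_model(prompt: str) -> str:
    # Placeholder for the on-device model (runs fully offline).
    return "[Gilda, on-device] " + prompt[:40]

def cloud_model(prompt: str) -> str:
    # Placeholder for a more powerful cloud model behind an API.
    return "[cloud API] " + prompt[:40]

def answer(prompt: str) -> str:
    if is_complex(prompt):
        return cloud_model(prompt)  # needs internet, more capable
    return local_model(prompt)      # private, offline, free to run

print(answer("What's the capital of Spain?"))
# prints "[Gilda, on-device] What's the capital of Spain?"
```

The design point the app is making: default to the private, offline path, and escalate to the cloud only when the local model genuinely can't handle the task.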
Who should care about offline AI?
Privacy-conscious professionals
Lawyers, doctors, and consultants who handle sensitive information can now use AI without any data leaving their device. No cloud servers, no data logging, no third-party access.
People in low-connectivity environments
Field workers, travelers, or anyone in areas with spotty internet can use AI without a connection. The model lives on your phone.
Businesses in regulated industries
Healthcare, finance, and government organizations with strict data sovereignty requirements can deploy AI without compliance headaches.
Try the compressed models yourself
Multiverse Computing has published its compressed open-source models on Hugging Face. On March 19, the company also launched a self-serve API portal that gives developers access to compressed versions of models from OpenAI, Meta, DeepSeek, and Mistral. As TechCrunch noted in its coverage of the launch, the portal is a signal that what started as niche research is becoming a real product.
The big picture
Right now, almost all AI runs in massive data centers owned by a few companies. CompactifAI represents a different future — one where powerful AI runs locally on the devices you already own, your data never leaves your pocket, and you don't need a subscription to use it.