Arm just built its own chip for the first time — after 35 years
Arm's 136-core AGI CPU is its first in-house chip ever. Meta, OpenAI, and Cloudflare are already customers. It delivers 2x the performance-per-watt of Intel and AMD.
For 35 years, Arm designed chip blueprints and let companies like Apple, Nvidia, and Amazon build the actual hardware. That era just ended.
Arm has unveiled the AGI CPU — a 136-core data center processor that the company designed, built, and will sell as finished silicon. It's the first time in the company's history that Arm has shipped its own production chip.
Meta co-developed the chip and is the lead customer. OpenAI, Cloudflare, SAP, Cerebras, SK Telecom, and Lenovo have also signed on.

What the AGI CPU actually does
The "AGI" name reflects the chip's purpose: powering the infrastructure behind AI agents. Think of it as the traffic controller in a data center — it coordinates AI accelerators (the specialized chips that run AI models), manages data flow, and handles the orchestration work that keeps everything running smoothly.
This is the CPU side of AI, not the GPU side. While Nvidia's chips do the heavy mathematical lifting, Arm's AGI CPU handles everything around it.
Key specs at a glance
- 136 cores across two dies, running up to 3.7 GHz
- TSMC 3nm process — one of the most advanced chip manufacturing processes in volume production
- 300 watts — about half the power draw of competing x86 server chips
- 12 DDR5 memory channels delivering 825 GB/s bandwidth
- 96 lanes of PCIe 6.0 — the fastest data connection standard in shipping hardware
- 2x performance-per-watt compared to Intel and AMD x86 racks
Why this matters beyond server rooms
Arm's pitch is simple: you can fit twice as much computing power in the same space and power budget as an Intel or AMD rack. In a liquid-cooled configuration, a single rack can hold 45,696 cores.
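The rack-density figure can be sanity-checked against the spec list above with some quick back-of-envelope arithmetic (the per-chip power total below counts CPUs only — it is an estimate that ignores memory, networking, and cooling overhead):

```python
# Back-of-envelope check of the rack-density claim, using only the
# published per-chip specs: 136 cores and 300 W per CPU.

CORES_PER_CHIP = 136
RACK_CORES = 45_696
CHIP_POWER_W = 300

chips_per_rack = RACK_CORES // CORES_PER_CHIP
print(chips_per_rack)  # 336 CPUs per rack

# CPU power alone, in kW -- real rack draw would be higher once
# memory, networking, and cooling are included.
print(chips_per_rack * CHIP_POWER_W / 1000)  # 100.8 kW

# Per-channel DDR5 bandwidth implied by the spec list:
print(825 / 12)  # 68.75 GB/s per channel
```

So the 45,696-core rack works out to exactly 336 of the 136-core chips, which is why the figure is not a round number.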
For anyone using AI tools — ChatGPT, Claude, Gemini — this is the kind of hardware that makes those services faster and cheaper to run. When data centers can do more with less power, the cost savings eventually reach consumers.
Mohamed Awad, Arm's EVP of Cloud AI, called it "a clean sheet design" — meaning the chip carries no legacy baggage from older architectures. Arm started from scratch specifically for AI workloads.
A company reinvented
This is a seismic shift for Arm's business model. Until now, Arm made money by licensing its chip designs to other companies. Apple uses Arm designs in every iPhone. Amazon's AWS Graviton chips are Arm-based. So are Qualcomm's laptop processors.
Now Arm is competing directly with its own customers. It's a risky move — but with Meta, OpenAI, and Cloudflare already committed, the gamble appears to be paying off before the chip even ships.
Commercial systems from ASRock Rack, Lenovo, and Supermicro are already available to order, with broader availability expected in the second half of 2026.
What this means if you run a business
If you're evaluating cloud providers or on-premises servers, Arm-based options are about to become much more compelling. The 2x power efficiency translates directly to lower operating costs. Watch for AWS, Azure, and Google Cloud to offer AGI CPU instances — they'll likely be priced competitively against existing options.
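To make the "lower operating costs" point concrete, here is a rough illustration of what the 2x performance-per-watt claim implies for a power bill. The electricity rate and rack draw below are hypothetical placeholders, not figures from Arm:

```python
# Hypothetical operating-cost comparison. The 2x performance-per-watt
# claim means the same workload needs half the energy; the $/kWh rate
# and rack draw are illustrative assumptions, not quoted figures.

PRICE_PER_KWH = 0.10           # assumed industrial electricity rate, USD
X86_RACK_KW = 40               # hypothetical x86 rack draw for a workload
ARM_RACK_KW = X86_RACK_KW / 2  # same work at 2x performance-per-watt

hours_per_year = 24 * 365
x86_cost = X86_RACK_KW * hours_per_year * PRICE_PER_KWH
arm_cost = ARM_RACK_KW * hours_per_year * PRICE_PER_KWH
print(round(x86_cost - arm_cost))  # annual savings per rack, USD
```

Under these assumptions, halving the energy for the same work saves about $17,520 per rack per year — the actual number depends entirely on local power prices and utilization.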