Micron's revenue nearly tripled — and it still can't make enough AI chips
Micron reported $23.86B in quarterly revenue — nearly 3x last year — but says it can only supply half of what AI customers need. Memory chips are now the bottleneck.
The company that makes the memory chips inside every AI server just posted one of the most jaw-dropping earnings reports in semiconductor history — and the takeaway isn't how much money it made. It's that even tripling production isn't enough to keep up with AI demand.
Micron Technology reported $23.86 billion in quarterly revenue, nearly three times the $8.71 billion it made in the same quarter last year. Adjusted earnings per share hit $12.20, beating Wall Street expectations by over 30%. And yet, the stock dropped — because Micron announced it needs to spend $5 billion more than planned just to keep up.
The numbers behind the AI memory boom
Every AI model — from ChatGPT to Claude to Gemini — runs on specialized chips called GPUs (the processors that handle AI calculations). But GPUs are useless without HBM (High Bandwidth Memory), the ultra-fast memory chips stacked directly on top of GPU processors. Think of it like this: the GPU is the engine, but HBM is the fuel tank. No memory, no AI.
Micron is one of only three companies on Earth that makes HBM — alongside Samsung and SK Hynix in South Korea. All three are completely sold out through the end of 2026.
Micron Q2 2026 — by the numbers
Revenue: $23.86B (vs $8.71B a year ago — up 174%)
Earnings per share: $12.20 (beat estimates by 30%+)
Gross margin: 74.9% (next quarter projected: 81%)
Cloud memory revenue: $7.75B (up 160%+)
Mobile/PC memory: $7.71B (from $2.24B a year ago)
Next quarter forecast: $33.5B (Wall Street expected $24.29B)
New capex plan: $25B (raised from $20B)
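The year-over-year growth percentages in the list above can be sanity-checked with a quick calculation (a minimal sketch; the dollar figures are the reported results quoted above):

```python
# Sanity-check the year-over-year growth figures reported above.
# Dollar amounts are in billions, taken from Micron's reported results.

def yoy_growth(current, prior):
    """Percent change from the prior-year quarter to the current quarter."""
    return (current - prior) / prior * 100

revenue_growth = yoy_growth(23.86, 8.71)    # total revenue
mobile_pc_growth = yoy_growth(7.71, 2.24)   # mobile/PC memory

print(f"Revenue growth: {revenue_growth:.0f}%")             # ~174%
print(f"Mobile/PC memory growth: {mobile_pc_growth:.0f}%")  # ~244%
```

Note that "up 174%" and "nearly three times last year's revenue" describe the same number: growth of 174% means revenue is 2.74x what it was a year ago.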
Why AI companies can't get enough memory
Micron's CEO Sanjay Mehrotra delivered a striking admission: the company can supply only half to two-thirds of what its key customers are asking for. In other words, the biggest AI companies — the ones building data centers for OpenAI, Google, Meta, and Microsoft — are placing orders Micron physically cannot fill.
The bottleneck is HBM manufacturing. Micron's latest HBM4 chips achieve bandwidth exceeding 2.8 terabytes per second — 2.3 times faster than the previous generation — while using 20% less power. These chips are designed specifically for Nvidia's newest AI platform, codenamed Vera Rubin.
Industry analysts estimate the HBM market will grow from $35 billion in 2025 to $100 billion by 2028. That's nearly a 3x increase in three years — and it explains why Micron raised its capital spending plan by $5 billion to build new factories in Idaho and New York.
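That projection implies a steep compound annual growth rate. A back-of-envelope calculation, assuming the analyst figures above ($35B in 2025 growing to $100B in 2028, i.e. over three years):

```python
# Implied compound annual growth rate (CAGR) of the HBM market,
# using the analyst estimates quoted above ($35B in 2025 -> $100B in 2028).
start, end, years = 35.0, 100.0, 3
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 42% per year
```

Sustaining ~42% annual growth for three years is what justifies the new fab spending: memory capacity takes years to build, so the capex has to land before the demand does.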
The stock dropped anyway — here's why
Despite the blockbuster results, Micron's stock fell after the earnings call. The reason? Investors were spooked by the $25 billion capital expenditure plan. Building chip factories (called "fabs") is extraordinarily expensive, and spending $5 billion more than expected eats into the profits investors were counting on.
Still, the stock is up more than 350% over the past year — one of the best-performing tech stocks of 2025-2026. The company's gross margin of 74.9% is extraordinary for a hardware manufacturer (Apple's is about 46%), and it's projected to hit 81% next quarter.
What this signals about the AI industry
Micron's results reveal something important: the AI boom is no longer just a software story. The physical infrastructure — chips, memory, power, cooling — is becoming the real constraint. When the CEO of a $200+ billion company says he can only deliver half of what customers want, it means the entire AI industry is supply-limited, not demand-limited.
For investors and job seekers: Memory chip manufacturing is one of the fastest-growing sectors in tech. Micron alone is building two major new facilities — one in Boise, Idaho (production expected 2027-2028) and one in New York (completion by 2030). These projects will create thousands of jobs.
For AI users: The memory shortage is one reason AI services occasionally slow down or limit usage. As companies race to build more data centers, the physical components are the gating factor — not the software.
Samsung plans to triple its HBM production capacity by Q4 2026, and SK Hynix claims to have secured orders for its entire 2026 HBM output. But with AI demand growing faster than anyone predicted, the "sold out" sign on AI memory chips isn't coming down anytime soon.