Building AI chips now produces more CO2 than some countries
Manufacturing emissions from AI GPU production will rise 12x by 2030, reaching 21.6 million metric tons of CO2, and the memory chips inside those GPUs are the biggest culprit.
The AI boom has a pollution problem. Chip manufacturing as a whole is on track to produce 247 million metric tons of CO2 by 2030, more than the entire country of Algeria emits in a year. That's a one-third increase from current levels, driven almost entirely by demand for AI hardware.
The worst offender isn't the processor itself. It's the memory chips stacked on top of it.
12x more emissions in six years
According to research from TechInsights, manufacturing emissions from AI GPU (graphics processing unit, the type of chip that powers AI models) production will rise from 1.8 million metric tons of CO2 in 2024 to a projected 21.6 million metric tons by 2030, a 64.5% annual growth rate.
By 2029, a single AI chip will carry an average embodied carbon footprint of over 1 metric ton of CO2. That's nearly 7 times the footprint of an NVIDIA H100 (the chip that powers most current AI systems).
Key numbers at a glance:
- 247 million metric tons of CO2 from chip manufacturing by 2030, more than Algeria emits in a year
- AI GPU manufacturing emissions: from 1.8 million metric tons in 2024 to 21.6 million by 2030, a 12x rise
- Over 1 metric ton of embodied CO2 per AI chip by 2029, nearly 7 times an NVIDIA H100
- HBM memory takes up to 5x more energy per gigabyte to manufacture than standard memory
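Those headline figures are easy to sanity-check with a few lines of arithmetic. Here is a back-of-envelope sketch in Python using only the numbers quoted above; the H100 figure at the end is merely what the article's "nearly 7 times" ratio implies, not an independently published value:

```python
# Back-of-envelope check of the reported AI GPU emissions growth.
emissions_2024_mt = 1.8    # million metric tons of CO2 (2024)
emissions_2030_mt = 21.6   # million metric tons of CO2 (2030, projected)
years = 2030 - 2024

multiplier = emissions_2030_mt / emissions_2024_mt
cagr = multiplier ** (1 / years) - 1

print(f"Growth multiplier: {multiplier:.0f}x")   # 12x
print(f"Implied annual growth: {cagr:.1%}")      # ~51.3% over six years
# TechInsights' 64.5% figure matches 12x growth compounded over five
# years rather than six: 12 ** (1 / 5) - 1 is roughly 0.644.

# What the per-chip comparison implies about an H100's footprint.
per_chip_2029_kg = 1000    # "over 1 metric ton" average embodied CO2 by 2029
h100_ratio = 7             # "nearly 7 times" an H100
print(f"Implied H100 footprint: ~{per_chip_2029_kg / h100_ratio:.0f} kg CO2")
```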
Why memory chips are the hidden emissions driver
When people think about AI's environmental cost, they usually think about the electricity used to run data centres. But making the chips is becoming an equally serious problem.
The biggest culprit is HBM (High-Bandwidth Memory), a special type of memory chip that AI processors need to work fast. Manufacturing these chips takes up to 5x more energy per gigabyte than standard memory. And future AI chips will need roughly 250 memory dies each, six times more than current designs.
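To see how those two multipliers stack up, here is a rough sketch. The 5x energy factor and the 250-die figure come from the article; the 40-die baseline is only implied by "six times more than current designs", and multiplying the two factors gives an illustrative worst case, not a TechInsights estimate:

```python
# Illustrative upper bound on how the HBM multipliers compound.
# ASSUMPTION: the 40-die baseline is inferred from "six times more than
# current designs"; only the 5x and 250-die figures are from the article.
baseline_dies_per_chip = 40    # inferred current design
future_dies_per_chip = 250     # "roughly 250 memory dies each"
hbm_energy_factor = 5          # up to 5x more energy per GB than standard memory

die_growth = future_dies_per_chip / baseline_dies_per_chip
worst_case = die_growth * hbm_energy_factor

print(f"Die count per chip grows {die_growth:.2f}x")                 # 6.25x
print(f"Worst-case manufacturing-energy factor: {worst_case:.0f}x")  # ~31x
```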
The manufacturing process involves fluorinated gases (potent greenhouse gases used to etch circuits), complex multi-die packaging (stacking many chips together), and silicon interposers (connective layers between chips). Each step adds emissions that simpler chip designs avoid.
Who's responsible — and who's trying to fix it
The three major memory chip manufacturers — Samsung, SK hynix, and Micron — supply virtually all the HBM chips used in AI. Meanwhile, the demand comes from big tech companies building massive AI data centres: Alphabet, Amazon, Meta, and Microsoft.
Some progress is being made:
- SK hynix reduced its emissions intensity (pollution per gigabyte) by one-third between 2021 and 2024
- Micron has a target to cut Scope 1 emissions (direct factory pollution) by 42% by 2030
But these efficiency gains are being overwhelmed by sheer volume growth. Micron alone is raising its capital spending by $5 billion, to more than $25 billion, in 2026 to meet AI demand.
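The tug-of-war between falling intensity and rising volume is simple multiplication. In the sketch below, SK hynix's one-third intensity cut comes from the article, while the 4x volume growth is a purely illustrative assumption:

```python
# Why a real per-gigabyte improvement can still mean higher total emissions.
# ASSUMPTION: the 4x volume growth is an illustrative placeholder; only the
# one-third intensity reduction is SK hynix's reported figure.
intensity_cut = 1 / 3     # emissions per gigabyte, down one-third (2021-2024)
volume_growth = 4.0       # hypothetical: gigabytes shipped grow 4x

total_change = (1 - intensity_cut) * volume_growth
print(f"Total emissions change: {total_change:.1f}x")   # ~2.7x higher

# Breakeven: any volume growth above 1 / (1 - 1/3) = 1.5x erases the gain.
```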
What this means for AI's future
Every time you use ChatGPT, Claude, or Gemini, the response is generated on hardware that took enormous resources to build. As AI models get larger and more companies deploy them, the manufacturing footprint will keep growing.
This doesn't mean AI is "bad for the environment" — a well-designed AI system can save energy in other areas. But it does mean the industry needs to account for the full lifecycle cost of AI, not just the electricity bill of running it.
For companies making sustainability claims about their AI products, these numbers are a reality check. The carbon is baked into the silicon before the first query is ever processed.