RockstarMarkets
Markets · Narrative · Updated 1h ago
Part of: AI Capex

AI Memory Crisis: MSFT, AMZN, GOOGL Earnings Warn Supply Cannot Keep Up

The CEOs of Microsoft, Meta, Google, Amazon and Apple all highlighted on recent earnings calls that memory (DRAM and HBM) is severely constrained and will remain a bottleneck through 2026. The market prices Micron (MU) at only 7x earnings despite it being the core beneficiary of this scarcity.

Rocky · RockstarMarkets desk
Synthesised from 8 wires · 49 mentions in the last 24h
Sentiment +50 · Momentum 70 · Mentions (24h): 49 · Articles (24h): 40

Key facts

  • CEOs of MSFT, META, GOOGL, AMZN, AAPL cited memory constraints within a two-day span in early May
  • Memory shortages expected to persist through 2026 per executive guidance
  • Micron (MU) trades at 7x earnings despite being primary US-listed DRAM/HBM producer
  • HBM and DRAM are critical bottlenecks for AI accelerator deployment
  • Samsung, SK Hynix, Micron control supply; no near-term capacity relief signaled

What's happening

Within a two-day span in early May, the chief executives of five of the world's largest tech firms sent the same message on their earnings calls: memory is becoming the critical constraint on AI buildout, and relief is not imminent. MSFT CEO Satya Nadella, META CEO Mark Zuckerberg, GOOGL CEO Sundar Pichai, AMZN CEO Andy Jassy, and AAPL CEO Tim Cook each referenced memory shortages explicitly, and in some cases warned that memory costs are inflating faster than they'd anticipated.

The specifics are revealing. AI infrastructure requires two distinct types of memory: conventional DRAM (used in CPUs and general-purpose compute) and HBM (high-bandwidth memory), stacked DRAM mounted alongside AI accelerators like NVIDIA's GPUs. Both are produced by a handful of suppliers, chiefly Samsung, SK Hynix, and Micron. Demand from cloud hyperscalers building out inference infrastructure has far outpaced production capacity.

Micron (MU) is the only US-listed pure-play DRAM and HBM manufacturer with meaningful capacity. Yet the stock trades at roughly 7x trailing earnings, a valuation that implies either consensus skepticism about the durability of AI capex or concern about structural overcapacity. By contrast, NVIDIA, which consumes memory but does not produce it, trades at a large premium to Micron despite depending for supply on the very memory makers that control Micron's upside.
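To make the multiple gap concrete, a trailing P/E can be inverted into an earnings yield. The sketch below uses the article's 7x figure for Micron; the premium multiple for the comparison peer is a placeholder assumption, not a quoted fact.

```python
# Convert trailing P/E multiples into earnings yields for comparison.
# 7x is the Micron multiple cited in the article; 30x is a hypothetical
# premium multiple used only for illustration.

def earnings_yield(pe_multiple: float) -> float:
    """Earnings yield is the inverse of the P/E multiple."""
    return 1.0 / pe_multiple

mu_pe = 7.0        # Micron trailing multiple, per the article
peer_pe = 30.0     # placeholder premium multiple (assumption)

print(f"MU earnings yield:   {earnings_yield(mu_pe):.1%}")    # 14.3%
print(f"Peer earnings yield: {earnings_yield(peer_pe):.1%}")  # 3.3%
```

At 7x, buyers are paid roughly a 14% trailing earnings yield to hold the supply bottleneck, which is the disconnect the narrative turns on.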

The narrative tension is significant: if AI capex continues (and all five CEOs signaled it will), memory is the limiting reagent, and Micron should be priced like a gatekeeper. Yet short interest and analyst downgrades suggest the Street believes memory supply will normalize faster than executives are guiding, or that hyperscalers will eventually shift to custom in-house memory (as they have with chips). The arbitrage between memory demand signals in mega-cap earnings and Micron's depressed multiple is one of the year's most glaring disconnects.

What to watch next

  • Micron Q3 2026 earnings guidance on memory demand: late June
  • Samsung, SK Hynix production updates: next quarter
  • Hyperscaler capex commentary in the next earnings cycle: June-August
Topic hub
AI Capex: Who's Spending, Who's Earning, and What's at Risk

Tracking AI infrastructure capex — hyperscaler spend, data center buildouts, memory demand and the margin compression risk.