RockstarMarkets
Markets · Narrative · Updated just now
Part of: AI Capex

AI Memory Shortage Persists; MSFT, GOOGL, NVDA CEOs Highlight Ongoing Constraints

Tech giants MSFT, GOOGL, NVDA, AMZN, and AAPL all flagged severe memory constraints on recent earnings calls, signaling prolonged AI infrastructure bottlenecks that could persist well into 2026. Market remains skeptical, pricing memory chipmaker MU at just 7x earnings despite supply tightness.

Rocky AI · RockstarMarkets desk
Synthesized from 8 wires · 49 mentions in the last 24h
Sentiment +60 · Momentum 70 · Mentions (24h) 49 · Articles (24h) 82

Key facts

  • MSFT, GOOGL, NVDA, AMZN, AAPL all cited memory constraints on May earnings calls
  • MU trading at 7x earnings despite AI memory bottleneck signals
  • Hon Hai reports stronger-than-expected profit on AI server demand

What's happening

The consensus among Silicon Valley's largest operators is crystallizing around a singular bottleneck: memory is not just tight today; it will remain constrained for months ahead. Within a two-day window in May, the CEOs of MSFT, GOOGL, NVDA, AMZN, and AAPL each made materially identical points on their earnings calls, all underscoring that memory availability is limiting their ability to scale AI infrastructure spending.

This converging messaging reflects a real physical constraint. HBM (high-bandwidth memory) production, which feeds the most demanding AI accelerators, remains a chokepoint. While NVDA's Jensen Huang was busy in Beijing with Trump, his company's supply-chain partners such as Hon Hai reported stronger-than-expected profit from AI server assembly, a sign that demand for memory-intensive hardware shows no signs of abating.

The puzzle for traders is the valuation disconnect. Despite years of AI capex euphoria and clear confirmation that memory scarcity will keep prices elevated, the market values MU at a mere 7x P/E multiple, suggesting either deep skepticism about the durability of memory demand or expectations of a supply ramp that is not yet public. AI chip supplier AVGO and semiconductor equipment maker LRCX, both beneficiaries of the push to expand memory fabs and AI infrastructure, are moving higher on the demand confirmation, but MU itself remains a relative laggard in the chipmaking narrative.

Skeptics point to the possibility that Samsung and SK Hynix (and TSMC on the advanced-packaging side) are already ramping capacity more aggressively than CEOs' public statements suggest, and that the memory-constraint narrative may be deployed tactically to manage customer expectations and reinforce pricing power. The risk is that if supply catches up faster than consensus believes, the entire AI capex multiplier effect could face a reset.

What to watch next

  • TSMC, Samsung Q2 guidance: memory capacity build-outs
  • MU earnings guidance for memory supply outlook: next call
  • AVGO, LRCX capex commentary: semiconductor equipment demand validation
Topic hub
AI Capex: Who's Spending, Who's Earning, and What's at Risk

Tracking AI infrastructure capex — hyperscaler spend, data center buildouts, memory demand and the margin compression risk.