RockstarMarkets
All news
Markets · Narrative · Updated 1h ago
Part of: AI Capex

Nvidia Approved for H200 China Sales; Stock Rallies 20% in Seven Days to $6T Market Cap

The US government greenlit Nvidia's H200 chip sales to 10 Chinese companies as trade tensions ease. NVDA gained 20% in seven days, nearing a $6 trillion market value on AI buildout conviction and tailwinds from the Trump-Xi Beijing summit. Broadcom and AMD are also benefiting from AI capex cycle momentum.

Rocky · RockstarMarkets desk
Synthesised from 8 wires · 40 mentions in the last 24h
Sentiment: +75 · Momentum: 85 · Mentions (24h): 40 · Articles (24h): 107

Key facts

  • US government approved Nvidia H200 sales to 10 Chinese companies following the Trump-Xi summit
  • NVDA rallied 20% in seven days, reaching near $6 trillion market capitalization
  • Cisco reported strong AI networking demand; CoreWeave signals inference capex expansion
  • Meta locked in $21B multi-year CoreWeave agreement for inference capacity
  • Broadcom and AMD gaining on AI buildout breadth beyond pure GPU demand

What's happening

Nvidia's stock surged 20% over seven days, carrying it to within striking distance of $6 trillion market capitalization, after the US government approved export sales of its H200 AI accelerator chips to ten Chinese companies. The approval is a nuanced win: it signals that Washington is willing to permit commercial AI infrastructure exports provided they serve non-military applications, while maintaining restrictions on the most advanced military-grade processors. For Nvidia, this translates into preserving access to one of the world's largest AI markets without triggering additional geopolitical friction that might invite retaliatory sanctions.

The catalyst for the surge extends beyond the chip approval itself. Trump's Beijing summit with Xi Jinping has reduced geopolitical uncertainty around US-China trade relations, lifting risk appetite across technology and semiconductor stocks. Investors are repricing the probability of sustained AI capex spending across both US and Chinese enterprises, with less tail risk of sudden policy reversals. On the same day the H200 approval was announced, Cisco reported strong AI networking demand, indicating that the AI buildout is accelerating beyond GPUs into entire networking and datacenter infrastructure stacks. This breadth of demand strengthens the narrative that AI spending is structural, not cyclical.

Broadcom and AMD have also benefited from the momentum, with traders positioning ahead of Nvidia earnings and broader AI capex visibility. Cisco's own rally suggests that customers are willing to spend on the full stack: chips, networking, storage, and inference hardware. Meta's $21 billion CoreWeave agreement underscores that inference capacity, the steady-state operation of trained models, is becoming a larger capex bucket than pure training. This shift matters because it extends the AI capex cycle beyond the initial GPU gold rush, potentially sustaining demand across multiple semiconductor suppliers.

The risk to the narrative is valuation: at near $6 trillion, Nvidia is pricing in flawless execution on H200 ramps, sustained Chinese demand post-trade-deal, and no major shifts in data center purchasing behavior. Analysts tracking AI capex maturation cycles warn that if spending growth rates decelerate from 50% YoY to 20% YoY within the next two quarters, multiples could correct sharply. For now, market participants are riding momentum, but the burden of proof on earnings growth is rising.

What to watch next

  1. Nvidia earnings call for H200 order book guidance: next quarter
  2. Meta, Google capex guidance on inference spending ramps: next earnings
  3. China AI demand validation via server order flows: next 30 days
Topic hub
AI Capex: Who's Spending, Who's Earning, and What's at Risk

Tracking AI infrastructure capex — hyperscaler spend, data center buildouts, memory demand and the margin compression risk.