Nvidia Crushes Earnings as AI Demand Hits Another Gear
- Sara Montes de Oca
- 3 days ago
- 2 min read
Nvidia once again reset expectations for the AI era, delivering a blowout quarter that underscores just how dominant the company has become at the center of global compute demand.
The chip giant reported fiscal fourth-quarter results well ahead of Wall Street estimates: revenue surged 73% year-over-year to $68.13 billion, and adjusted earnings per share came in at $1.62. Shares moved higher in extended trading, continuing Nvidia’s outperformance relative to other megacap tech names in 2026.
But the real story behind the headline number sits inside one segment.
Nvidia’s data center business, which powers AI workloads across hyperscalers and enterprises, grew 75% year-over-year to $62.3 billion, now accounting for over 90% of total revenue.
That concentration is staggering — and telling. AI is no longer an incremental growth driver. It is the business.
The company’s largest customers remain the hyperscalers — Alphabet, Amazon, Meta, and Microsoft — which collectively are expected to spend hundreds of billions this year building out AI infrastructure.
Nvidia sits directly in the flow of that capital.
Guidance Signals No Slowdown
If there were any lingering concerns about an AI slowdown, Nvidia’s guidance likely erased them.
The company forecast $78 billion in revenue for the next quarter, far ahead of expectations. Importantly, that outlook does not assume any contribution from China, highlighting just how strong demand remains even amid geopolitical constraints.
Net income nearly doubled year-over-year to $43 billion, reinforcing the company’s ability to convert AI demand into real profitability — a key differentiator as investors increasingly scrutinize fundamentals across the sector.
The Infrastructure Arms Race Intensifies
One of the more underappreciated signals in the report is how quickly the AI infrastructure stack is expanding beyond GPUs.
Nvidia’s networking business — critical for linking massive GPU clusters — grew 263% year-over-year, driven by adoption of its NVLink and Spectrum-X platforms. This reflects a broader shift toward rack-scale AI systems, where compute, networking, and software are increasingly bundled.
The company also confirmed progress on its next-generation Vera Rubin systems, which promise up to 10x performance per watt, a crucial metric as data centers hit power and cooling limits.
At the same time, Nvidia is aggressively reshaping its supply chain — expanding manufacturing into the U.S. and Latin America — to keep pace with demand and reduce geographic concentration risk.
Despite the strength, there are early signs of tension beneath the surface of the AI boom.
Memory shortages are beginning to impact parts of the business, particularly gaming, where Nvidia signaled potential headwinds as resources are reallocated toward higher-margin AI chips.
This dynamic is becoming a recurring theme across the industry: AI demand is so strong that it is crowding out other segments.
The Bigger Picture
Nvidia’s results reinforce a reality the market is still digesting: This isn’t a typical tech cycle — it’s an infrastructure supercycle.
The combination of hyperscaler spending, enterprise adoption, and emerging use cases is creating a feedback loop where demand begets more demand. Nvidia is not just participating in that cycle — it is defining it.
The key question now isn’t whether AI demand is real.
It’s whether anything — supply chains, power constraints, or capital discipline — can slow it down.