Nvidia's AI Chip Dominance: Soaring Revenues and Strategic Partnerships Propel Growth

At the start of last week, OpenAI’s technology chief personally thanked Nvidia CEO Jensen Huang for providing the advanced chips needed to demo OpenAI's latest artificial intelligence models.


A day later, at Google’s annual developer conference, Alphabet CEO Sundar Pichai emphasized his company’s long-standing partnership with Nvidia and noted that Google Cloud will utilize the chipmaker’s Blackwell graphics processing units (GPUs) in early 2025.


This week, Microsoft, which supplies servers to OpenAI, will announce new AI advancements and features developed on the company’s massive clusters of Nvidia GPUs. The announcement will be part of Microsoft's Build conference in Redmond, Washington.


Heading into its quarterly earnings report on Wednesday, Nvidia is at the forefront of technological advancements. The 31-year-old company, whose market cap has surpassed $2 trillion this year, is expected to report year-over-year revenue growth exceeding 200% for the third consecutive quarter. Analysts project a fiscal first-quarter increase of 243% to $24.6 billion, with more than $21 billion expected from Nvidia’s data center business, which includes advanced processors sold to Google, Microsoft, Meta, Amazon, OpenAI, and others.


Nvidia is generating substantial profits from its AI product suite, with net income expected to increase more than fivefold from a year earlier to $13.9 billion. The stock has surged 91% this year after more than tripling in 2023.


Dan Niles, founder of Niles Investment Management, compared Nvidia’s position in the AI boom to the "internet buildout" of the 1990s and Cisco's central role during that time. He noted that despite several dramatic pullbacks over a three-year period, Cisco ultimately increased by 4,000% up to its peak in 2000. He predicts Nvidia will experience similar cycles.


"We’re still really early in the AI build," Niles said on CNBC’s “Money Matters” on Monday. "I think the revenue will go up three to four times from current levels over the next three to four years, and I think the stock goes with it."


Google, Amazon, Microsoft, Meta, and Apple are expected to spend a combined $200 billion on capital expenditures this year, with a significant portion allocated to AI-specific infrastructure like Nvidia chips.


OpenAI is relying on Nvidia’s technology for GPT-4, the large language model behind its latest chatbot. Meta announced plans in March to buy and build out computers that will include 350,000 Nvidia GPUs, costing billions of dollars. CEO Mark Zuckerberg even swapped jackets with Huang and posed for a picture with the Nvidia CEO.


"If you look at today for the AI build out, who’s really driving that?" Niles said. "It’s the most profitable companies on the planet — it’s Microsoft, it’s Google, it’s Meta, and they’re driving this."


Before the recent AI boom, Nvidia was primarily known for making chips used in 3D gaming. About a year ago, the company hinted at historic growth, indicating it would generate about 50% more in sales than analysts expected in the July 2023 quarter.


Growth rates have since accelerated, but starting in the second quarter, expansion is expected to slow, with analysts anticipating significant deceleration in each of the next three periods.


"We just don’t know how long this investment cycle lasts and just how much excess capacity will be created over that time in case this AI thing doesn’t materialize as quickly as expected," Bernstein analysts wrote earlier this month.


Despite potential challenges, Nvidia is not at risk of losing a significant portion of the AI chip business to rivals. Piper Sandler analysts expect Nvidia to retain at least 75% of the AI accelerator market, even as companies like Google develop custom chips.


"We view the percentage of hyperscaler spend that is dedicated towards compute further rising in 2024 and 2025," Piper Sandler analyst Harsh Kumar wrote.


A key question for Nvidia is how well the transition to its next generation of AI chips, called Blackwell, will go. Some worry there could be a lull as clients delay purchasing older Hopper GPUs like the H100 in favor of Blackwell-based products such as the GB200.


"To some degree, the setup has shifted," wrote Morgan Stanley analyst Joseph Moore on Monday. "Six months ago, short-term expectations were very strong but there was anxiety about durability. Now, fresh on the back of hyperscalers talking up longer-term spending expectations for AI, those longer-term views are more positive, but there is anxiety about a pause in front of Blackwell."

