Y’all know this is going to be about Nvidia again, right?

Our guest today is the lead writer of one of the most engaging newsletters at the intersection of:

Technology

Macroeconomics

Culture


😎 The Pragmatic Optimist 🤩

Amrita is the author of The Pragmatic Optimist, where she connects the dots ⚫ in macroeconomics, technology and culture to help her readers understand the “big picture”, identify great businesses and improve their financial and mental wellbeing.

Subscribe to TPO

Earlier in May, Amrita’s partner wrote this:

AI’s Bubble Talk Takes a Bite Out Of The Euphoria 🌊



I knew at that point I had to get a guest post from the other side of this awesome duo. Grateful that Amrita agreed. If you enjoy this post, you may also enjoy:

Links to Her Best Articles – TPO Hall of Fame 🙏

The Great Social Media Reset – We Scroll More Than We Post.

Risk a banking crisis or reignite inflation? The Fed’s worst dilemma is here.

“Don’t bet against America”- Will this statement hold true in this decade? (substack.com)

The internet is about to be flooded with AI Assistants. (substack.com)


By Amrita, June 2024.

First things first. Let’s just all agree that Nvidia’s Jensen Huang is an official rockstar. 

No, I am not referring to the company’s March reveal of its latest Blackwell chip platform, which the Wall Street Journal called an ‘AI Woodstock.’ Nor am I referring to everybody’s “favorite investor,” Jim Cramer, naming his dog Nvidia. Nor am I referring to Nvidia surpassing Apple to become the second-most-valuable company in the world by market cap.

I am referring to this moment from the latest Computex event where Nvidia’s Huang autographed a woman’s top at the exhibition booth. 

Here’s a video of the moment in case you haven’t done a double take yet. If this doesn’t change your mind about Huang’s rockstar status, I don’t know what will. 

As the thread above suggested, it feels like 2021 all over again. Except it isn’t. Business activity and construction spending refuse to spring back, interest rates remain elevated, and key engines of U.S. consumer spending are losing steam all at once. Even sentiment among software companies appears to be at one of its lowest levels, per Altimeter Capital’s Jamin Ball.

But that has not stopped Nvidia’s profit engine from smashing through the lofty expectations that analysts and enthusiasts have set for the company since last year.

Nvidia’s path to AI and cloud computing dominance was decades in the making

If Nvidia’s Huang has one person to thank, it would probably be Andrew Ng, a highly respected AI researcher. Ng pioneered the use of Nvidia’s GPUs to efficiently train AI models, as described in his path-breaking whitepaper in 2008.

Between 2011 and 2015, Ng led AI research at several large technology firms, Google included, where he deployed similar data center architectures built on Nvidia’s GPUs. During those years, Nvidia, interestingly, reorganized its business segments to reorient the company around the tailwinds it expected in then-emerging areas such as HPC (high-performance computing) and the data center. The company had already undergone a business reorganization in 2010 to take advantage of the advances in mobile computing led by the iPhone. But the data center seemed too lucrative for Huang to pass up, especially after researchers, led by Ng, had started to use Nvidia’s GPUs for AI model training.

Here is a slide from Nvidia’s 2014 presentation to investors that showcased the strength of its HPC & Data Center segment, which grew at a compound annual growth rate of 64% between 2010 and 2014, far outpacing Nvidia’s other segments.

This period between 2010 and 2014 was one of the defining transitions that set Nvidia on course to dominate AI as we see it today. These reorganization strategies proved Huang correct in anticipating forward trends, as can be seen in Nvidia’s revenue segments: Data Center revenues grew from 12% of total revenue in 2016 to 78% in 2023, a compound annual growth rate of 78% over that period.
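The compounding arithmetic here can be sanity-checked with a quick sketch. The 78% growth rate is the article’s figure; the revenue values below are hypothetical placeholders, used only to show what such a rate implies:

```python
def cagr(begin_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate: the constant yearly rate that
    turns begin_value into end_value over the given number of years."""
    return (end_value / begin_value) ** (1 / years) - 1

# A 78% CAGR sustained from 2016 to 2023 (7 years) multiplies
# revenue roughly 57-fold, since 1.78 ** 7 is about 56.6.
growth_multiple = 1.78 ** 7

# Recovering the rate from the endpoints gives back 0.78.
rate = cagr(1.0, growth_multiple, 7)
```

In other words, a segment compounding at 78% for seven years ends up more than fifty times its starting size, which is why the Data Center line came to dwarf everything else.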

Over time, Nvidia’s GPUs became extremely popular for unlocking efficiency gains in data centers. Nvidia’s ultimate calling came in 2023, as seen in the chart above: following the launch of ChatGPT in late 2022, the company became the biggest beneficiary of demand for model-training-specific chips. Research by the equity team at Wells Fargo points to even further gains in the data center GPU market, where Nvidia captured a near-100% market share in 2023.

How Nvidia is trying to reposition for the future

If you haven’t noticed already, go back to the previous chart illustrating Nvidia’s near-100% share of the data center GPU market in 2023, and you’ll see it also includes projected market share for the next few years. Of course, these are all projections that must be taken with a grain of salt, but they indicate that 2024 could be the first year Huang’s Nvidia cedes some market share to AMD, another major chip company, expertly run by Huang’s cousin, Lisa Su. Intel is also expected to take back a small slice of the market. How is that possible?

Both AMD and Intel have launched their own AI chip products for the data center, aimed at clawing back share from Nvidia’s data-center-focused H100/H200 chips: Intel with its Gaudi 3 AI accelerator and AMD with its MI300 series.

But Nvidia’s Huang is already looking ahead, yet again. He believes the company must now turn its attention to the next leg of AI: model inference. Some industry experts believe that 20% of the demand for AI chips next year will come from model inference needs. On a conference call with market analysts last month, Nvidia’s CFO revealed that model inference workloads accounted for 40% of Nvidia’s data center revenue, something that was virtually non-existent a year earlier.

At the same time, Huang revealed on the call that Nvidia will design and launch new chips annually, twice the pace at which it has rolled them out in the past. The chip giant had traditionally produced a new platform every two years: Ampere in 2020, Hopper in 2022, and now Blackwell in 2024.

Huang also mentioned on last month’s conference call that the “Blackwell GPU architecture delivers up to 4x faster training and 30x faster inference than the H100,” while inference costs are expected to drop at least 3x. Then at Computex 2024 last week, Huang pre-announced Blackwell’s architectural successor, Rubin, scheduled for release in 2026. The rapid pace of development and product releases coming out of Huang’s Nvidia is clearly aimed at everyone in the AI world involved in model development and deployment, customers and competitors alike.


The incentive to move quickly on AI is clear. Capex is just one side of the story.

So far, companies from hyperscalers to AI startups and even global sovereign entities have responded with a mad dash for Nvidia’s GPUs.

The four hyperscalers alone have pledged at least $200 billion in AI capex for building, training, and deploying GenAI models, according to the Economist. That is 45% higher than AI capex spending in 2023. Both equity pricing and analyst forecasts reflect expectations of a material rise in AI-related investment from here, especially as we evolve from the early innings of model development toward building and deploying applications that can add value across a wide range of industries and translate into economic output.
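As a quick back-of-the-envelope check on those two figures: if the pledged ~$200 billion is 45% above the 2023 level, the implied 2023 baseline is about $138 billion. That baseline is my arithmetic, not a number from the Economist:

```python
pledged_capex_bn = 200.0   # at least $200B pledged by the four hyperscalers
growth_over_2023 = 0.45    # 45% higher than 2023 AI capex

# Implied 2023 AI capex baseline: 200 / 1.45, roughly $138B
baseline_2023_bn = pledged_capex_bn / (1 + growth_over_2023)
```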

“Why not?” argues Ray Dalio’s Bridgewater Associates, when these companies are sitting on one of the largest stockpiles of cash in history as well as a “once in a generation” opportunity called GenAI.

So, naturally, companies like Nvidia and others in the AI infrastructure ecosystem have benefited, and will continue to benefit, from this boom in AI capex spending.

As can be seen in the chart below, alpha returns, i.e., the differential in returns relative to the S&P 500, have been strongest among chip companies, with Nvidia leading the way. That is because the chip complex of semiconductor stocks saw the strongest growth in sales and earnings, which has led market analysts to keep raising their expectations for the earnings per share that chip companies could deliver two years out. As it stands, analysts expect chip companies to double their earnings over the next two years.
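To put that doubling expectation in annual terms: earnings doubling over two years implies roughly 41% growth per year. This is my own calculation, under the simplifying assumption of a constant growth rate:

```python
# If earnings 2x over 2 years at a constant rate g, then (1 + g)^2 = 2,
# so g = 2^(1/2) - 1, about 0.414, i.e. ~41% growth per year.
implied_annual_growth = 2 ** (1 / 2) - 1
```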

How concerned should we be about “overbuild”?

One observation from the chart that I found quite telling is the alpha that the cloud sector has delivered since November 2022. Second only to the ~60% alpha that chip stocks delivered, the cloud complex of stocks has returned roughly 43% over and above the S&P 500 since November 2022.

But unlike chip stocks, which have seen combined sales surge 50% since November 2022, cloud stocks have seen their revenues grow in paltry single digits, roughly on par with the S&P 500. Chip companies like Nvidia have seen their stock market valuations soar, justified by surging revenue and earnings growth thanks to increased spending by Nvidia’s cloud customers and hyperscalers. But with sales at most cloud companies yet to pick up, I wonder whether these FOMO-like conditions, demonstrated by cloud companies’ capex splurges and the resulting valuation expansion, have reached the level of “irrational exuberance.”

The truth is that outside of semiconductors and hyperscalers, AI revenue hasn’t yet shown up. 

Over the last two weeks or so especially, we have seen carnage in the software space, where companies such as Salesforce, MongoDB, and others, as represented by the iShares Expanded Tech-Software Sector ETF, have been severely punished for failing to demonstrate the expected revenue growth, especially while ramping up AI capex and buying GPUs and other hardware from chip companies.

While there are hopes of revenue reacceleration in the future, I believe it will be trickier than expected. With macroeconomic pressures persisting and the ZIRP (zero interest rate policy) era likely behind us, businesses will continue to consolidate their software vendors around those that offer fully integrated platform solutions, in order to gain pricing and efficiency advantages. This leaves other cloud players that have spent capex dollars acquiring Nvidia’s chips with no sales to support the spending.

In closing…

I want to be clear: I am not bearish on AI based on what I’ve seen. I just think it’s wise to be reasonably pragmatic about expectations given the current state of the macroeconomy. For lack of better words, I will borrow from legendary investor Stanley Druckenmiller’s recent interview, in which he said he believes AI may be a “little overhyped in the short term.”

But, for now, Nvidia’s Huang stays relentlessly focused on demonstrating outperformance: launching new platforms and releasing faster chips while accruing sales and compounding earnings. With the capex tap continuing to flow, the rockstar in Nvidia’s Huang will continue to run the show. I leave you with this quote from Huang’s Computex 2024 remarks about buying Nvidia’s latest GPUs:

“The more you buy, the more you save,” Huang said. “This is CEO’s math. It is not accurate, but it is correct!”

Biography

Amrita is the author of The Pragmatic Optimist, where she connects the dots ⚫ in macroeconomics, technology and culture to help her readers understand the “big picture”, identify great businesses and improve their financial and mental wellbeing. 

A global citizen who speaks seven languages, she worked for several years at startups in the Bay Area, where she successfully led growth programs to scale customer acquisition globally. In 2021, after falling in love with Ray Dalio’s book “Principles for Dealing with the Changing World Order,” she embarked on a quest to study and disentangle how the world’s financial markets and economies work.

Since then, she has launched her family fund and her award-winning newsletter, where she breaks down complex financial and technological concepts and explores businesses that will enable or benefit from key technological innovations and cultural evolutions in the coming years.

⭕️⭕️_____/⭕️⭕️ If you liked this post, feel free to also check out: 👀

The Great Social Media Reset – We Scroll More Than We Post.

Risk a banking crisis or reignite inflation? The Fed’s worst dilemma is here.

“Don’t bet against America”- Will this statement hold true in this decade? (substack.com)

The internet is about to be flooded with AI Assistants. (substack.com)

More on Nvidia

🔥 Will Nvidia be overtaken by the new AI players?

NVIDIA’s chips are the tastiest AI can find. Its stock still has ways to go.

Nvidia’s New A.I. Chip Times the Generative AI Gold Rush

Why Nvidia and not OpenAI is Generative AI’s Symbolic Company

References

Nvidia signing moment: Threads video, Reddit

https://www.investors.com/news/technology/nvidia-ceo-jensen-huang-future-ai-chips/

Read More in AI Supremacy