How Nvidia's Q4 earnings set the stage for further dominance

Shares of Nvidia (NVDA) rose after the market close on Wednesday after the company reported fourth-quarter revenue of $22.10 billion versus an expected $20.62 billion. The company also posted adjusted earnings per share of $5.16 versus an expected $4.64 per share.

Antoine Chkaiban, Technology Infrastructure Analyst at New Street Research, and Dan Morgan, Senior Portfolio Manager at Synovus Trust, join Yahoo Finance to discuss the results.

Chkaiban elaborates on Nvidia's outlook and the trajectory of AI chip making as a whole: "Global spending on data center AI chips tripled in 2023 from $15 billion to $45 billion and is now set to double again to $90 billion in 2024. And if we look out a few years, well, late last year, Lisa Su, CEO of AMD (AMD), pitched a target addressable market for data center AI chips of $400 billion in 2027. So now we just need to think about the future along two axes. Either it materializes in a healthy and sustainable way or we fall short of this prediction, with disappointments, over-investments, and lower returns."

Speaking to Nvidia's dominance in the space against its competition, Morgan says: "I think we have to think of Nvidia as first to market in AI. They have been talking on their conference calls and their product calls for over five years about the fact that they are dominant in the space. And then it came to fruition about a year ago, on this quarter, when they had the big blowout quarter." He continues by mentioning one of the reasons why Nvidia is ahead of the competition: "The average AI chip that Nvidia sells, we don't know the exact price, some of them are like $20,000 or $30,000 apiece. So you kind of wonder if like an AMD, who's kind of known as being a lower-priced chipmaker, especially in the CPU area against Intel (INTC), or let's say the data server space, would they come in to try to introduce a cheaper chip that maybe people would be willing to go with."

For more expert insight and the latest market action, click here to watch this full episode of Yahoo Finance Live.

Editor's note: This article was written by Nicholas Jacobino

Video Transcript

JULIE HYMAN: Let's talk to one analyst who's watching these numbers closely. That is Antoine Chkaiban, who is New Street Research Technology Infrastructure Analyst. Antoine, what do you make of these numbers as we just get them? It looks like a beat here but not the astonishing beats perhaps that we have had in some recent quarters.

ANTOINE CHKAIBAN: Yes. Thanks, Julie, for having me. So Nvidia just printed 4Q topline up 22% sequentially. That's 8% above sell side consensus expectations on topline, with data center revenues 27% above, and guided 1Q topline up 9% sequentially, 8% above consensus expectations. And the stock has been moving a lot after market.

This clearly illustrates that buy side expectations were above sell side. And also, remember that this has to be put in context of a very strong acceleration in the past couple of quarters. Nvidia's topline had already grown 2 and 1/2x in 3Q compared to 1Q, as companies like Microsoft and Meta accelerate the deployment of their AI infrastructure.

JOSH LIPTON: And, Antoine, I'm curious on the call here coming up with Jensen Huang as well as, of course, the CFO, what's your number one question, Antoine, for those executives?

ANTOINE CHKAIBAN: So I think just looking at the numbers in the press release, indeed, doesn't give us any indication on where we're headed. We're still very early in the broader deployment of AI. Demand is very strong at the moment. Supply is tight. And hopefully, Nvidia will give qualitative color on the call.

The company made comments last quarter suggesting that visibility now extends into 2025 with inference picking up. That's the kind of color that we'll be on the lookout for.

JULIE HYMAN: And I'm also curious what kind of trajectory you're looking for for the generative AI cycle, I guess, even though it feels still like we're very early. There was another analyst who talked about a replacement cycle of these GPU-- these Nvidia-provided GPU chips that could be as high as 50% to 60% of chips, from, I guess, around 10% right now. Does that-- I mean, you know, what are the estimates out there for how big this could eventually be?

ANTOINE CHKAIBAN: But if we take a step back on all this, global spending on data center AI chips tripled in 2023 from $15 billion to $45 billion and is now set to double again to $90 billion in 2024. And if we look out a few years, well, late last year, Lisa Su, CEO of AMD, pitched a target addressable market for data center AI chips of $400 billion in 2027.

And so now we just need to think about the future along two axes: either it materializes in a healthy and sustainable way, or we fall short of this prediction, with disappointments, over-investments, and lower returns.

And we really see three possible scenarios from here. First of all, if AMD's $400 billion in 2027 case materializes, well, Nvidia is still amongst the highest quality names to invest in today. $400 billion spent on AI chips likely means at least $300 billion on GPUs. And Nvidia, the undisputed leader in this market, could generate $65 of EPS in 2027. That's more than two times above consensus. And that makes plenty of room for the stock to keep trending up.

And then another scenario could be for AI to cool off and say only half of AMD's projection materializes. $200 billion spent on chips in 2027 means maybe $150 billion on GPUs. And Nvidia would generate closer to $30 of EPS, which is what consensus is currently modeling. Such a scenario would therefore make less room for upside as it's already priced into the stock.

And lastly, if AI adoption plays out much more slowly than what people currently expect, well, we could simply reach a peak in 2025 and realize that use cases are not mature enough to drive adoption. Spending would pull back in 2026, 2027, as the industry digests the chips deployed. And that is what we call an AI winter. And it's in such a scenario that Nvidia would miss expectations and the stock would pull back.
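For readers who want to see the arithmetic behind these scenarios laid out explicitly, here is a minimal back-of-the-envelope sketch. It only restates the figures Chkaiban cites above (the $400 billion and $200 billion TAM cases, a roughly 75% GPU share, and the ~$65 and ~$30 EPS outcomes); the linear scaling of EPS with GPU spend is our illustrative assumption, not his actual model.

```python
# Back-of-the-envelope sketch of the 2027 scenarios discussed above.
# Figures come from the transcript; the linear EPS-per-dollar-of-GPU-spend
# scaling is an illustrative assumption, not New Street's actual model.

GPU_SHARE_OF_TAM = 0.75               # "$400 billion ... likely means at least $300 billion on GPUs"
EPS_PER_BILLION_GPU_SPEND = 65 / 300  # implied by ~$65 EPS on ~$300B of GPU spend

def implied_nvidia_eps_2027(ai_chip_tam_billions: float) -> float:
    """Rough implied Nvidia EPS for a given 2027 data-center AI chip TAM."""
    gpu_spend = ai_chip_tam_billions * GPU_SHARE_OF_TAM
    return gpu_spend * EPS_PER_BILLION_GPU_SPEND

for label, tam in [("AMD's $400B case", 400), ("Half of AMD's case", 200)]:
    print(f"{label}: ~${implied_nvidia_eps_2027(tam):.0f} implied EPS in 2027")
# Output is close to the ~$65 and ~$30 figures cited above. The "AI winter"
# scenario is a path question (spending peaks in 2025 and then pulls back),
# so it is not captured by this static snapshot.
```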

JOSH LIPTON: Antoine, stay right there. I want to bring in another-- another very smart analyst to give his commentary. Dan Morgan, Synovus Portfolio Manager, joins us now. Dan, it is great to have you in the conversation. We should mention, Dan, initially this stock in the after hours had dropped about 4%. It is now popping, up above 6.5%. So investors are digesting this and now, initially at least, seem to be responding very positively. Dan, I want to get first your reaction and your response to this report.

DAN MORGAN: Well, Josh, to me it looks like a beat across the board. I mean, the big numbers I was focusing on were data center revenues. They beat that handily. And then, of course, that upcoming first quarter of '25 revenue number-- I think the Street was a little over $21 billion, and that came in over $24 billion. So again, it's really very hard to find any flaws in this report.

You know, Julie, Josh, coming into this I was worried that maybe even if they beat on the fourth quarter and guided ahead on the first quarter of '25, it still wouldn't be enough. So I'm encouraged that they are reacting positively to the stock. We had a big sell-off, as you know, coming into this report. A lot of it is related to the Palo Alto Networks warning that we got last night that kind of rolled into the Nvidia number and Nvidia stock price. But it appears that we're getting some clear thought here on going forward with Nvidia. And it looks all positive.

JULIE HYMAN: Just one thing I wanted to draw our viewers' attention to that I've noticed in the CFO commentary from Colette Kress over at Nvidia: she mentioned that data center sales to China declined significantly in the fourth quarter due to US government licensing requirements. Now, that's not a surprise, right? We know that Nvidia is trying to figure out some workarounds for China.

But, Antoine, I'll take this question back to you. How much of an obstacle still is China? And do you expect that to continue? Will it get worse?

ANTOINE CHKAIBAN: So I think in the guidance, actually, for the quarter that Nvidia just reported, there were no data center revenues from China baked in. Nvidia guided for very minimal data center revenues from China this quarter. And the way we see things playing out going forward is that demand is so strong everywhere in the world that chips that would have been allocated to China can definitely be reallocated to other countries, to demand elsewhere. So we don't really see that weighing much on data center growth going forward.

JOSH LIPTON: And, Dan, I want to bring you back here as well. I should mention, Dan, we think of Nvidia now as kind of an AI proxy, so maybe it's no surprise here, Dan. I'm looking at a few different names here moving higher, maybe in sympathy: Supermicro, AMD, ARM, Palantir moving higher here in the after hours as well.

One question I had for you, Dan, was on the question of competition, you know, in gen AI silicon. When you think of competition-- I guess I'll double barrel for you, Dan. One, what are, in your opinion, Nvidia's kind of competitive advantages? And two, who do you think poses the greatest competitive risk? Is it AMD, Intel, or another player?

DAN MORGAN: Yeah, Josh. So I think we have to think of Nvidia as kind of first to market in AI. I mean, they have been talking on their conference calls and their product calls for over five years about the fact that they are dominant in this space. And then all of a sudden, it came to fruition about a year ago on this quarter. And they had the big blowout quarter. So they've got that lead, in terms of being ahead of a lot of the competition.

But you're right, Josh. I mean, I've done a lot of work in putting together tables and so forth of what are some of these competing chips. I mean, you have AMD, which you mentioned. They have the MI300X and MI300A, which they are rolling out right now in that space. You also have Intel with Falcon Shores and Gaudi 3, which is another chip that's going to be coming out in the next year or two.

You look at Marvell Technology. Their PAM4 DSP processor is also in that space. But there's no doubt that NVIDIA has that lead ahead of a lot of these other companies who are just now starting to roll out competing AI chips. So because of that, they'll probably stay in the lead.

But think about this, Josh. The average AI chip that NVIDIA sells-- we don't know the exact price. But some of them are like $20,000 or $30,000 apiece. So you kind of wonder if like an AMD, who's kind of known as being a lower-priced chipmaker, especially in the CPU area against Intel, or let's say in the data server space, would they come in to try to introduce a cheaper chip that maybe people would be willing to go with? So that's where I kind of see the door maybe opening up for maybe an AMD or some other competing company, like a Marvell or something like that.

JULIE HYMAN: Yeah. I mean, that's an interesting question. I mean, Qualcomm has also come out with some chips that it says are very speedy in the AI realm for smartphones. And I wonder-- part of what NVIDIA has done was, to some extent, de-commoditize what has been historically, at least, a commodity business, right, when you're talking about semiconductors, and it still is that way in many sectors of the business. Antoine, do we get to a point where even these AI chips are commoditized, where you see a price war that gets kicked off by an NVIDIA competitor, for example?

ANTOINE CHKAIBAN: I think creating, designing an AI chip is easier said than done. It requires so much R&D, so much experience to create a chip that's as flexible as an NVIDIA GPU today, where you have compute units speaking to a hierarchy of memory that is so complicated, so you can move data around while keeping all the parts of a chip that is very expensive-- that costs $20,000, if not more-- busy at every point in time. It takes a lot of experience. It takes a lot of R&D.

The ecosystem that NVIDIA has built, as well, all the software stack, all the ecosystem resources, the developers that can help each other-- that is not something that can be created overnight by a competing ecosystem, by another company. So I really don't think that we're headed towards a commodity market here. On the contrary, if anything, ecosystem dynamics will drive concentration.

If we go back to this $400 billion TAM that AMD alluded to, we see GPUs accounting for at least 75% of that market because of the flexibility that they offer. And then we see NVIDIA remaining dominant within that $300 billion market if it materializes.

JULIE HYMAN: And I think Antoine brings up a good point, Dan, as well, about the sort of-- it's not just the chip that NVIDIA makes. It has the software. It has that-- what he called the ecosystem. So then do you play it not just by trying to find out who the competitors are going to be, but who are the service providers for NVIDIA, who is working with them to help them produce that ecosystem?

DAN MORGAN: Yeah. I think what Antoine's alluding to, Julie, is that NVIDIA has their own proprietary software that they sell along with their chips. Kind of going back to possible threats that we could look at for an NVIDIA, especially in the data center-- it's projected that about 65% of their revenue comes from data center AI chips.

We know that-- companies just recently announced this in the last couple of days-- we know that Amazon, Microsoft, Alphabet, and Meta are all working on their own AI chips to use in their own systems, which would take away that reliance that they would have on, let's say, an NVIDIA, and maybe their willingness to spend $20,000 on a chip. So that's another interesting thing, Julie, to think about.

We think about the competition from AMD and Intel and Broadcom and Marvell. But it could be from the actual buyers that use the chip. They start developing their own internal proprietary chip, kind of like what Apple's done with the Mac and has moved over to with their phones. They have their own chips now as opposed to relying on outside vendors. So that's kind of something interesting to think about.

JOSH LIPTON: Antoine, I'll get you out of here on this. I see the stock-- it is surging here in the after hours, up over 6%. One question I had for you, Antoine, is, as you think about it as a financial analyst, when you think about NVIDIA in the quarters, years ahead, is it going to be a kind of boom and bust company, Antoine, or do you think the kind of growth we're seeing here, which is obviously really impressing investors and the Street-- it can actually continue?

ANTOINE CHKAIBAN: Well, we have been very, very busy over the last few months thinking about that question and putting together a framework to help us predict towards which scenario we're headed. You remember I just mentioned we could be headed towards a $400 billion scenario. We could be headed towards an AI winter. And I think we'll be monitoring three parameters to determine where we're headed.

First of all, we'll be monitoring how quickly model complexity is increasing. GPT-4 is likely five times larger than GPT-3. And we have been on this trend for many years now that's driving demand for more compute.

Then you have chip efficiency that's increasing as well, of course. The H100's throughput is six times higher than the A100's. And Moore's law and design improvements will keep driving higher throughput. That is offsetting model complexity increasing.

And then you have usage, of course, users and the time that each user spends on ChatGPT, Gemini, et cetera. That is driving demand for more chips. And so this is what we'll be monitoring in coming years. This is how we can add value for investors, by carefully tracking how these three parameters-- model complexity, chip efficiency, and usage-- are growing. We can determine how much the AI install base needs to grow and how much the world will have to spend to build it. Those are really the leading indicators to determine if we're headed towards an AI winter or not.
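As a rough way to see how these monitored parameters interact, here is a small illustrative sketch. The multiplicative relationship (compute demand scales with model complexity and usage, offset by chip efficiency) is our simplified reading of the framework described above, and the growth figures are placeholders loosely echoing the transcript, not New Street estimates.

```python
# Illustrative sketch of the monitoring framework described above:
# chip demand rises with model complexity and usage, and is offset by
# chip efficiency gains. The inputs below are placeholders, not forecasts.

def implied_install_base_growth(model_complexity_growth: float,
                                usage_growth: float,
                                chip_efficiency_growth: float) -> float:
    """Multiplicative reading: compute demand divided by per-chip throughput."""
    compute_demand_growth = model_complexity_growth * usage_growth
    return compute_demand_growth / chip_efficiency_growth

# Placeholder figures: GPT-4 ~5x larger than GPT-3, H100 ~6x the throughput
# of the A100, and an assumed 3x growth in users and time spent.
growth = implied_install_base_growth(model_complexity_growth=5.0,
                                     usage_growth=3.0,
                                     chip_efficiency_growth=6.0)
print(f"Implied AI install base growth: ~{growth:.1f}x")  # ~2.5x with these inputs
```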

JULIE HYMAN: Dan and Antoine, I have one final question that I want to ask both of you. Dan, I'll start with you. Right now, the stock is up. Obviously, it could do whatever in the next 24 hours. If it continues to rise, is it too late for investors, Dan, to get in here?

DAN MORGAN: Julie, we've had NVIDIA on our buy list for a pretty long period of time, way before it became the hottest stock ever. And for investors that would possibly come in with new monies, it's on their buy list if the risk tolerance and the objective match.

The stock trades at about 83 times trailing earnings and about 35 times fiscal year 2025 earnings. So it's not excessive, Julie, when we look at the forward multiple. And I still think there's room to grow. I mean, I'm going to write a piece in our upcoming newsletter this month that talks about whether NVIDIA is the next Microsoft or Apple, meaning those two stocks have eclipsed $3 trillion. Microsoft just got over the fence.

Could NVIDIA be the next $3 trillion stock-- so still very optimistic and very bullish. And I still think there's plenty of opportunity for investors. Obviously, try to catch it on a weaker day. But it's still-- fundamentals are still intact going forward. There's really nothing, I believe, to derail the story at this point.

JULIE HYMAN: Antoine, do you agree?

ANTOINE CHKAIBAN: So I think I agree that there is still upside. I definitely think that there is still room for the stock to keep going up. And so I join Dan on this one.

But I also think that there is probably more upside in other names, like AMD, for example. AMD probably has the most attractive profile, actually, with the most upside if this $400 billion TAM plays out, because the market is going to grow so much. And AMD's revenues today in that TAM are very small. And so there's plenty of room for AMD to beat expectations even more than NVIDIA.

I also think other names, like TSMC, Intel, and Micron, probably offer less upside than AMD. But they're also a better way to play this with a safer approach. Even if the high case doesn't materialize, there is still upside in an AI winter scenario.

So we look really at this space more holistically. And we think NVIDIA is one of the ways. But there are many other ways to play this trend.

JULIE HYMAN: Gentlemen, thank you so much-- really great to have you on and get your insight instantly after these numbers. Dan Morgan and Antoine Chkaiban, thank you so much.
