Two of our biggest takeaways from Nvidia CEO Jensen Huang's GTC keynote speech

Nvidia CEO Jensen Huang's keynote speech at the chipmaker's annual developers event Monday included new product announcements and insight into where its revenue is headed. Here are our takeaways on two of Jensen's biggest updates.

New inference chip

Jensen unveiled Nvidia's new inference-focused chip, built on the technology it licensed late last year from AI chip startup Groq for a reported $20 billion. On Friday, we published an in-depth look at Groq's origins and the growing competition Nvidia faces in inference computing, the term for the everyday use of AI models after they've been trained. While Nvidia's graphics processing units (GPUs) are dominant in training, AI computing is maturing and evolving in a way that creates demand for more specialized inference chips. That's where Groq's technology comes into play: the startup calls its chips language processing units, and their design is optimized for certain inference tasks where speed is of the utmost importance, a quality usually called low latency.

Nvidia is naming its Groq-infused processor the LPX, and notably, it will be available alongside the Vera Rubin generation of chips launching later this year (Vera is the CPU, Rubin is the GPU) to succeed the Blackwell family. The LPX is in volume production now at third-party manufacturer Samsung, Jensen said, and will be available sometime in the "Q3 timeframe." Nvidia is offering the inference chip in a server rack that contains 256 LPX processors. When we say rack, we're talking about a cabinet-sized computer that houses both the processor engines and the networking that stitches the chips together; a data center holds rows upon rows of server racks.

The idea isn't that LPX racks entirely replace Nvidia's GPU-plus-CPU servers for inference, Jensen said, but rather that the two coexist within a data center, working together to improve performance. Jensen said the LPX won't be necessary for every type of task.
"If most of your workload is high-throughput, I would stick with just 100% Vera Rubin," Jensen said. "If a lot of your workload wants to be coding and very high-valued engineering token generation, I would add Groq to it. I would add Groq to maybe 25% of my total data center," the CEO said, with the remainder being Vera Rubin servers. He added, "That gives you a sense of how you would add Groq to Vera Rubin and extend its performance and extend its value even more."

Jensen also indicated that new-and-improved versions of the LPX are coming in future years, cementing the chip's place in Nvidia's broader roadmap of new CPUs (central processing units), GPUs, and networking technology. As part of the licensing deal, Nvidia hired key employees from Groq, including co-founder and now-former CEO Jonathan Ross.

We think this inference chip is meaningful and helps Nvidia better fend off competition in inference from in-house chip initiatives such as Google's tensor processing units (TPUs), which are co-designed by Broadcom, and from other chip designers like Advanced Micro Devices.

Visibility through 2027

Jensen said Nvidia expects orders for its Blackwell and Vera Rubin generation chips to total $1 trillion through 2027, an updated look at demand in the coming years. To put that figure into context, we need to rewind a bit. At an Nvidia conference last fall, Jensen said the company had $500 billion worth of orders for its Blackwell and Rubin chips and related networking equipment through calendar 2026. Then, on Nvidia's February earnings call, CFO Colette Kress indicated Nvidia was seeing upside to that "$500 billion Blackwell and Rubin revenue opportunity we shared last year. We believe we have inventory and supply commitments in place to address future demand, including shipments, extending into calendar 2027."

Now back to Monday afternoon. After Jensen offered the $1 trillion figure, Nvidia shares picked up steam and traded as high as $188.88, up about 4.8%.
However, the pop faded, and the stock ultimately closed at $183.22, up 1.65% for the session. Of course, it's difficult to pinpoint exactly why the market does what it does. But in this instance, it seems possible that as traders and investors took a closer look at how that $1 trillion disclosure stacked up against the Wall Street consensus and their own models, they determined it wasn't as far above consensus as it initially appeared.

Our read on the situation: It seems likely that the 2027 consensus for Nvidia's data center revenue will be revised higher in light of Jensen's disclosure and, by extension, earnings estimates, too. But at this point, the exact magnitude is not clear. Nvidia has a question-and-answer session with financial analysts scheduled for Tuesday, which could add clarity. Jim Cramer will also interview Jensen on CNBC on Tuesday, and we may learn more during that conversation.

Either way, Jensen's comments should add conviction to the 2027 estimates. In other words, investors worried that the AI spending party may stop soon should feel better now that Jensen is discussing the company's visibility through next year. The duration of the AI capital expenditure boom has been a debate around Nvidia's stock for multiple years at this point, and there's been no letting up yet.

In a note to clients Monday morning, ahead of Jensen's keynote, Morgan Stanley analysts addressed the uncertainty around the 2027 estimates, underscoring why Jensen's fresh comments are helpful. Here's what Morgan Stanley wrote: "As for the duration argument, the company has generally said all of the right things, and we have seen validation through the ecosystem that investment will be persistent, including hyperscale commentary from just last week; our view is that it is a matter of time before investors start to build comfort in the 2027 outlook.
That does require a vibrant capital market, which is the primary risk, but we expect AI enthusiasm to remain high, and expect semiconductor constraints to keep investing from reaching excessive levels."

What Jensen said Monday is exactly the kind of update that should help build that comfort, even if it doesn't happen overnight.

(Jim Cramer's Charitable Trust is long NVDA and AVGO. See here for a full list of the stocks.)

As a subscriber to the CNBC Investing Club with Jim Cramer, you will receive a trade alert before Jim makes a trade. Jim waits 45 minutes after sending a trade alert before buying or selling a stock in his charitable trust's portfolio. If Jim has talked about a stock on CNBC TV, he waits 72 hours after issuing the trade alert before executing the trade.

THE ABOVE INVESTING CLUB INFORMATION IS SUBJECT TO OUR TERMS AND CONDITIONS AND PRIVACY POLICY, TOGETHER WITH OUR DISCLAIMER. NO FIDUCIARY OBLIGATION OR DUTY EXISTS, OR IS CREATED, BY VIRTUE OF YOUR RECEIPT OF ANY INFORMATION PROVIDED IN CONNECTION WITH THE INVESTING CLUB. NO SPECIFIC OUTCOME OR PROFIT IS GUARANTEED.
