Our fair value for Nvidia stock has just been raised from US$730 to US$910 after the company's latest tech showcase. Our analyst explains the reasons behind the increase.
Nvidia’s (NVDA) GTC conference keynote address introduced the company’s latest artificial intelligence graphics processor, or AI GPU, Blackwell, along with its associated platform and GPU clusters to enable AI workloads. Just as important, GTC showcased the efforts of the company’s wide range of partners in areas such as robotics and automotive, among many others. We don’t foresee these partners slowing their investments anytime soon, which supports our healthy growth estimates for Nvidia in the years ahead.
We raise our fair value estimate for wide-moat Nvidia to US$910 per share from US$730, as we’ve taken a fresh look at our model assumptions and are more optimistic about future industrywide capital expenditure, or capex, on AI GPUs in the decade ahead, both from large cloud computing vendors (that is, hyperscalers) and from enterprises.
Morningstar Metrics for Nvidia Stock
• Fair Value Estimate: US$910 (from US$730)
• Morningstar Rating: ★★★
• Morningstar Economic Moat Rating: Wide
• Morningstar Uncertainty Rating: Very High
Hyperscaler capex should be meaningfully higher in 2024, with Nvidia receiving a windfall from such spending, and we now anticipate even higher industrywide data centre capex as more and more businesses invest in AI. We reiterate our Very High Morningstar Uncertainty Rating as we concede that these capex plans, AI workloads, and competitive dynamics are changing rapidly. We now view Nvidia’s shares as fairly valued.
Nvidia launched a barrage of news and partnerships at GTC – we count 42 press releases and/or blog posts on its website – but we think the larger trend is that a wide variety of businesses are reliant on Nvidia’s AI GPU computing power, with Blackwell extending the company’s lead. In turn, we think hyperscalers will continue to increase their total capex levels and disproportionately tilt their capex spending toward Nvidia’s AI GPUs.
These hyperscalers have the fortress balance sheets to support such investments in the years ahead and should reap the rewards of higher cloud computing revenue and earnings as these GPUs are deployed. We anticipate that enterprises will step up their capex as well.
Why Has Nvidia’s Stock Risen So Much?
Looking at cloud capex, we believe that much of the rise in Nvidia’s share price in 2024 – up 77% year to date versus 7% for the Morningstar US Market Index – and its future earnings growth stem from hyperscaler and enterprise capex. These firms expect to allocate incrementally more capex to AI GPUs in 2024, while also shifting the mix of “maintenance” capex away from traditional networking and server gear toward AI GPUs. Nvidia has captured a much larger piece of the data centre capex pie in recent quarters, and we anticipate that this mix shift will be the new normal for IT departments. Along these lines, we remain impressed with Nvidia’s ability to elbow its way into additional hardware, software, and networking products and platforms.
That said, AMD recently lifted its total addressable market, or TAM, estimates for the AI accelerator market (including memory chips) in 2027 to US$400 billion from US$150 billion. While we’re more optimistic about spending on AI accelerators in the years ahead, we still don’t foresee Nvidia and its peers reaching this US$400 billion threshold as soon as 2027. If AMD’s forecast is accurate and Nvidia can retain its AI GPU market leadership, then our revenue assumptions for the company might turn out to be conservative.
Stiff Competition in Artificial Intelligence
Despite our fair value estimate increase, we are keeping competitive pressures in mind. Nvidia’s hefty operating margins are everyone else’s opportunity, and every company in the AI ecosystem has an incentive to spend its AI budget more efficiently and to find alternatives to Nvidia, whether developed in-house or sourced from other vendors. Still, we think Nvidia will retain the bulk of the AI workload business – certainly in AI training and perhaps a hefty piece of AI inference workloads, too.
Among the many announcements at GTC, we’re particularly intrigued by the launch of Nvidia Inference Microservices, or NIM. These tools are built on top of Nvidia’s CUDA platform and will enable businesses to bring custom applications and pretrained AI models into production environments, which should help these firms bring new AI products to market.
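To make the concept concrete, below is a minimal sketch of our own – not code Nvidia showed at GTC – of how an application might query a deployed NIM. It assumes the microservice is already running locally and exposes an OpenAI-compatible chat endpoint on port 8000; the endpoint URL, model name, and prompt are placeholders chosen for illustration.

```python
# Minimal sketch: assumes a NIM container is running locally and exposes an
# OpenAI-compatible chat-completions endpoint. URL and model name are placeholders.
import requests

NIM_ENDPOINT = "http://localhost:8000/v1/chat/completions"  # hypothetical local deployment

payload = {
    "model": "example-org/example-8b-instruct",  # placeholder for a pretrained model served by the microservice
    "messages": [
        {"role": "user", "content": "Summarise the key data centre trends from this earnings report."}
    ],
    "max_tokens": 200,
}

response = requests.post(NIM_ENDPOINT, json=payload, timeout=60)
response.raise_for_status()

# The OpenAI-compatible response format nests the generated text under choices[0].
print(response.json()["choices"][0]["message"]["content"])
```

The appeal of this design, as we understand it, is that the heavy lifting – model weights, CUDA libraries, serving infrastructure – ships inside the container, so an enterprise developer only needs a standard REST call to put a pretrained model into production.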
We view Nvidia as dominant in AI training workloads, but we think that one of the highlights of the firm’s blowout quarter in late February was its disclosure that 40% of its deployed GPUs are being used for AI inference. To the extent that Nvidia can emerge as the preferred industrywide solution for AI inference too, the company might be able to retain its dominant market share position (within an exponentially rising market) after all.
Nvidia’s New Hardware: Grace Blackwell Superchip
Overall, Nvidia’s hardware products announced at GTC appear impressive and will likely remain best-of-breed in the industry. Nvidia’s latest GPU, Blackwell, is built by Taiwan Semiconductor Manufacturing on its 4NP process, packs a whopping 208 billion transistors, and is double the size of the prior-generation Hopper H100 GPU. The Grace Blackwell Superchip, GB200, connects two B200 Blackwell GPUs with an Arm-based Grace CPU. We’re still keeping an eye on Nvidia’s attach rate with Grace, as it may allow the company to capture an even greater piece of the GPU server pie at the expense of x86 CPUs from Intel and AMD.
Our new US$910 fair value estimate implies a fiscal 2025 (ending January 2025, thus covering roughly 11 months of calendar 2024) price/adjusted earnings multiple of 35 times and a fiscal 2026 multiple of 26 times. Our fiscal 2025 revenue estimates of US$116 billion in total and US$101 billion in data centre revenue for Nvidia are unchanged.
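As a back-of-envelope check of our own – implied by the figures above rather than disclosed separately – dividing the fair value estimate by the stated multiples backs out the underlying adjusted earnings-per-share assumptions:

$$
\text{implied adjusted EPS} = \frac{\text{fair value estimate}}{\text{price/adjusted earnings multiple}}:
\qquad \frac{\text{US}\$910}{35} \approx \text{US}\$26 \ \text{(fiscal 2025)},
\qquad \frac{\text{US}\$910}{26} \approx \text{US}\$35 \ \text{(fiscal 2026)}.
$$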
However, we lifted our longer-term data centre revenue estimates, to US$135 billion from US$125 billion for fiscal 2026 and to US$184 billion from US$141 billion for fiscal 2028, again driven by our assumption of a higher TAM on the back of greater industrywide data centre capex. Our revised annual assumptions for fiscal 2025, 2026, and 2027 revenue and adjusted EPS sit above the mean FactSet consensus estimates that prevailed before the event.