Are Nvidia Investors Ignoring This Big Risk?
Nvidia (NASDAQ:NVDA) has been the undisputed leader of the AI boom. Its GPUs are the gold standard for training large AI models, and this demand has driven sales from $27 billion in FY’23 to a projected $200 billion this fiscal year. Beyond selling the highest-performance chips, the company’s CUDA software ecosystem has helped lock in customers. While Nvidia is likely to remain the king of the AI hardware hill for years to come, its stock valuation of close to 40x forward earnings reflects not just leadership but expectations of sustained, multi-year growth.
That makes it vulnerable: even a modest slowdown in demand or a structural shift in the AI lifecycle – from training-focused workloads toward inference – could hurt investor confidence and trigger sharp declines. History underscores this risk – after the Covid-era GPU surge in gaming and crypto gave way to inflation and softer demand, Nvidia shares fell nearly 66% peak-to-trough, versus just 25% for the S&P 500. The stock’s volatility means a similar drawdown is possible if the current phase of AI growth cools. Read NVDA Dip Buyer Analyses to see how the stock has recovered from sharp dips in the past.
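The asymmetry behind that drawdown risk is simple arithmetic: the deeper the decline, the disproportionately larger the gain needed just to return to the prior peak. A minimal sketch using the article's approximate figures (the function name is illustrative, not from any source):

```python
def recovery_gain(drawdown: float) -> float:
    """Fractional gain needed to recover a given peak-to-trough drawdown.

    A stock that falls by `drawdown` (e.g. 0.66 for -66%) must rise by
    1 / (1 - drawdown) - 1 to reach its old peak again.
    """
    return 1 / (1 - drawdown) - 1

nvda_drawdown = 0.66   # Nvidia's ~66% post-Covid peak-to-trough decline
sp500_drawdown = 0.25  # S&P 500's ~25% decline over the same period

print(f"NVDA needs a {recovery_gain(nvda_drawdown):.0%} gain to recover")
print(f"S&P 500 needs a {recovery_gain(sp500_drawdown):.0%} gain to recover")
```

A 66% decline requires roughly a 194% rebound to break even, versus about 33% for a 25% decline, which is why deep drawdowns in a richly valued stock take so long to repair.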

Training vs. Inference Shift
Companies have devoted immense resources to building AI models over the last two years. Training these massive models is typically a concentrated effort that demands enormous computing power, and Nvidia has been the biggest beneficiary, with its GPUs widely regarded as the fastest and most efficient for these tasks.
However, the AI landscape may be shifting. Incremental performance gains are diminishing as models grow ever larger, while the availability of high-quality training data is becoming a bottleneck – much of what’s easily accessible online has already been absorbed into today’s models. Taken together, these dynamics suggest that the most intensive phase of AI training could begin to level off. Adding to the uncertainty, the economics of the GPU market remain challenging, as many of Nvidia’s customers are still struggling to generate meaningful returns on their heavy AI investments.
Inference, by contrast, is about applying trained models to new data in real time, at scale. It is lighter per task but happens continuously across millions of users and applications. As AI matures, more of the value creation could shift from training toward inference. The challenge for Nvidia is that its growth has been tied primarily to training, where its high-end GPUs dominate. Inference opens the door for more mid-performance and value chip alternatives, as well as specialized offerings. Here’s a look at some of the key challengers in the inference space.
What The Inferencing Landscape Looks Like
AMD (NASDAQ:AMD) has considerably lagged Nvidia in the early stages of the AI build-out, but it could emerge as a key challenger in inference. Its chips are increasingly competitive on performance while offering cost and memory advantages. Not all organizations need or can afford Nvidia’s top-tier GPUs; many are likely to opt for older Nvidia chips or cost-effective alternatives like AMD’s MI series, which offer solid performance for inference and fine-tuning of models. See How AMD stock surges to $330.
ASICs, or application-specific integrated circuits, are also gaining traction. Unlike GPUs, which are versatile and programmable, ASICs are optimized for a single task, making them more cost- and power-efficient for inference workloads. The crypto industry offers a precedent: Bitcoin mining started on GPUs but quickly migrated to ASICs once scale and efficiency became critical. A similar pattern could play out in AI inference. Two companies that could benefit from this trend are Marvell and Broadcom, both of which have experience building custom silicon for hyperscalers.
U.S. Big Tech players such as Amazon (NASDAQ:AMZN), Alphabet, and Meta are all designing their own AI chips. Amazon has leaned into training-focused chips, Meta started with inference and is expanding into training, while Google supports both through its TPU (tensor processing unit) lineup. For these hyperscalers, the motivation is not necessarily to beat Nvidia in the market, but to lower costs, improve bargaining power, and control supply for their massive cloud ecosystems. Over time, that means less incremental demand for Nvidia’s GPUs. In Q2, Nvidia said that just two of its customers made up roughly 39% of total revenue, and it’s very likely that these were big U.S. tech companies. This concentration makes Nvidia far more vulnerable: if hyperscalers increasingly use in-house silicon, even modest shifts in buying behavior could have an outsized revenue impact.
Chinese Players
In China, companies like Alibaba (NYSE:BABA), Baidu, and Huawei are stepping up their AI chip efforts. Reports last week suggested that Alibaba plans to launch a new inference chip for its cloud division. The strategy is twofold: to power inference at scale using its own tech stack, and to secure a steady supply of semiconductors amid U.S. export restrictions. For now, Nvidia’s GPUs are still expected to form the backbone of Alibaba’s AI training workloads, but inference is likely to become the longer-term growth driver for the company.
Nvidia Still Leads, but Risks Are Mounting
Nvidia’s position remains strong, thanks to its entrenched ecosystem, deep R&D, and dominance in training. But inference is likely to be the next growth engine for AI hardware, and the competitive field is far more crowded. Even a slight slowdown in growth could weigh heavily on the stock, given how much future performance is already priced in. For investors, the key question is whether Nvidia’s growth trajectory can keep up with the lofty expectations the market has set. If inference economics proves less favorable than training, the stock could still see a “valuation reset” even without any loss of technological leadership.
The Trefis High Quality (HQ) Portfolio, a collection of 30 stocks, has a track record of comfortably outperforming a benchmark that includes all three indices – the S&P 500, Russell, and S&P midcap. Why is that? As a group, HQ Portfolio stocks have provided better returns with less risk than the benchmark index – less of a roller-coaster ride, as is evident in HQ Portfolio performance metrics.
Invest with Trefis Market-Beating Portfolios
See all Trefis Price Estimates