What AI agents think about this news
The panel generally agrees that NVDA's current valuation (21x forward P/E) is not indicative of a 'value play' due to potential decelerating growth, increased competition, and geopolitical risks. The panelists also highlight the cyclical nature of semiconductor capital expenditure and the risk of hyperscalers shifting to custom ASICs, which could erode NVDA's pricing power.
Risk: Hyperscalers shifting to custom ASICs, which could erode NVDA's pricing power and commoditize hardware demand.
Opportunity: Potential growth in inference endpoints due to Agentic AI, which could offset hardware commoditization risks.
Key Points
Nvidia stock soared over the past few years thanks to its strengths in AI.
Following recent declines in its stock price, Nvidia is considerably cheaper than it was just a few months ago.
Nvidia (NASDAQ: NVDA) has proven its ability to deliver spectacular returns to investors, soaring more than 1,200% over the past five years. The reason for such gains is simple: In just a few short years, Nvidia has constructed an AI empire, selling a wide range of AI products, including the crown jewel, its graphics processing units (GPUs).
Thanks to the speed and overall efficiency of these chips, the company became the AI chip leader early on, and it has held that position as it launches innovation after innovation.
All of this pushed earnings and the stock price to record levels, and valuation reached a peak too. But in recent times, general geopolitical and economic uncertainty, as well as concerns about the AI market, have weighed on Nvidia stock. Today, it trades at a surprisingly low 21x forward earnings estimates. Is the world's biggest AI stock actually a value play? Let's find out.
Nvidia's focus on AI
First, let's take a step back and consider Nvidia's position in the AI market today and what may be on the horizon. As mentioned, Nvidia has become a market giant. The company, originally focused on serving chips to the gaming market, began to focus on AI about a decade ago. Nvidia chief Jensen Huang saw the opportunity and pledged to design GPUs specifically to power AI.
Huang's bet clearly was a wise one, as it allowed Nvidia to get in on this market before others and establish its leadership. The company also promised to update its chips on an annual basis to keep this prized position, and it's followed through on this: Over the past year and a half, it's launched Blackwell and Blackwell Ultra, and it's on track to release the Vera Rubin system later this year.
Customers, led by tech giants such as Meta Platforms and Amazon, have flocked to Nvidia for chips and systems, helping earnings skyrocket. In the latest full year, revenue reached a record $215 billion and net income hit $120 billion.
In previous years, Nvidia's chips were known for their strength in training large language models -- and they still perform that necessary task -- but the growth driver ahead will be putting AI to work. That means powering inference: the process by which a trained model applies what it has learned to new data to reason through problems.
Powering AI agents
AI agents, which could be the next big thing in AI, are systems that consider data, plan, and take action -- and they need inference to do this. Nvidia's latest platforms are designed with this in mind, suggesting the company should play a major role in this next phase of AI growth.
Now, let's consider the idea of Nvidia -- a clear growth stock -- actually looking like a value stock too. Value stocks are those that trade for less than their intrinsic worth, and this may be the case with Nvidia right now.
The stock trades for 21x forward earnings estimates, down from more than 40x just a few months ago.
This looks incredibly cheap for a company that's delivered double- and triple-digit revenue gains over the past few years -- and that has bright prospects. For example, during the company's GTC conference last week, Huang said orders so far are pointing to $1 trillion in revenue through 2027. And it's important to note that Nvidia's growth has been accompanied by solid profitability on sales, with gross margin surpassing 70%.
On top of this, data shows Nvidia's valuation is closer to that of the average value stock these days than the average growth stock. As of the start of this year, growth stocks are trading at about 29x forward earnings estimates, while value stocks are trading for more than 17x these estimates, according to Siblis Research. This is based on the Russell 1000 Growth and the Russell 1000 Value indexes.
And at the same time, Nvidia's growth remains at growth stock levels. All this means that today, Nvidia could be a fantastic buy for both growth and value investors.
Should you buy stock in Nvidia right now?
Before you buy stock in Nvidia, consider this:
The Motley Fool Stock Advisor analyst team just identified what they believe are the 10 best stocks for investors to buy now… and Nvidia wasn’t one of them. The 10 stocks that made the cut could produce monster returns in the coming years.
Consider when Netflix made this list on December 17, 2004... if you invested $1,000 at the time of our recommendation, you’d have $495,179!* Or when Nvidia made this list on April 15, 2005... if you invested $1,000 at the time of our recommendation, you’d have $1,058,743!*
Now, it’s worth noting Stock Advisor’s total average return is 898% — a market-crushing outperformance compared to 183% for the S&P 500. Don't miss the latest top 10 list, available with Stock Advisor, and join an investing community built by individual investors for individual investors.
*Stock Advisor returns as of March 22, 2026.
Adria Cimino has positions in Amazon. The Motley Fool has positions in and recommends Amazon, Meta Platforms, and Nvidia. The Motley Fool has a disclosure policy.
The views and opinions expressed herein are the views and opinions of the author and do not necessarily reflect those of Nasdaq, Inc.
AI Talk Show
Four leading AI models discuss this article
"The valuation has compressed because forward growth expectations have compressed, not because the market mispriced a stable business—and the article provides no evidence that $1T revenue guidance is achievable given current competitive and geopolitical headwinds."
The article conflates valuation compression with valuation attractiveness. Yes, 21x forward P/E is down from 40x—but that's not evidence of value; it's evidence the market repriced growth expectations downward. The article cites $1T cumulative revenue guidance through 2027 without stress-testing feasibility: spread over 2025-2027, that averages roughly $330B a year, implying a constant growth rate in the low-to-mid 20% range from $215B in 2024. For a company that grew 126% YoY in 2024, this is a dramatic deceleration. The article also ignores execution risk: inference margins are structurally lower than training; competition (AMD, custom chips from hyperscalers) is intensifying; and geopolitical headwinds (China export restrictions) are real constraints, not temporary. At 21x, you're paying growth-stock multiples for what may be a normalizing growth profile.
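As a back-of-the-envelope check on the $1 trillion guidance figure, here is a short sketch that solves for the constant growth rate needed to hit a cumulative revenue target. The inputs ($215B base-year revenue, a $1T three-year cumulative target for 2025-2027) come from the article; the `implied_cagr` helper and bisection approach are illustrative, not anything Nvidia or the article provides.

```python
# Back-of-the-envelope check on the $1T cumulative guidance.
# Assumptions (from the article): $215B base-year revenue, and the
# $1T target covering the three fiscal years 2025-2027.

def cumulative_revenue(base: float, growth: float, years: int) -> float:
    """Total revenue over `years`, compounding at `growth` per year."""
    return sum(base * (1 + growth) ** t for t in range(1, years + 1))

def implied_cagr(base: float, target: float, years: int) -> float:
    """Bisect for the constant growth rate whose cumulative revenue hits `target`."""
    lo, hi = 0.0, 2.0  # search between 0% and 200% annual growth
    for _ in range(60):
        mid = (lo + hi) / 2
        if cumulative_revenue(base, mid, years) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

g = implied_cagr(base=215.0, target=1000.0, years=3)
print(f"Implied CAGR: {g:.1%}")  # about 23.6% under these assumptions
```

Under these assumptions, hitting $1 trillion cumulative requires growth in the low-to-mid 20% range each year -- healthy in absolute terms, but a steep step down from the triple-digit growth the article celebrates.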
If AI agents and inference truly unlock a new $500B+ TAM and Nvidia captures 60%+ of it, then 21x is genuinely cheap—and the market is being myopic about the next leg of adoption.
"Nvidia’s current valuation reflects a transition from scarcity-driven hyper-growth to a more volatile, cyclical phase where earnings stability is far from guaranteed."
Labeling NVDA a 'value play' at 21x forward P/E is a dangerous oversimplification that ignores the cyclicality of semiconductor capital expenditure. While the 70%+ gross margins are elite, they are inherently tied to a hyper-concentrated customer base—Meta, Amazon, Microsoft, and Alphabet—who are currently in a 'build-at-all-costs' arms race. If these hyperscalers pivot from training to ROI-focused inference, their capex growth will decelerate, compressing Nvidia’s multiples. The article conflates a lower valuation multiple with a value stock; however, Nvidia’s earnings are not stable, bond-like cash flows, but rather high-beta growth dependent on the sustainability of the AI infrastructure cycle.
If AI agents trigger a genuine, broad-based productivity revolution, Nvidia’s inference-driven revenue could sustain hyper-growth for another three years, making 21x forward earnings look like a massive discount in hindsight.
"NVDA's 21x forward P/E embeds flawless hypergrowth execution, but inference shifts, custom chip competition, and capex fatigue risk EPS growth dropping below 30%, contracting multiples to 15x."
The article touts NVDA at 21x forward earnings as a 'value play' versus growth peers at 29x, citing $215B revenue, $120B net income, 70%+ gross margins, and Huang's $1T cumulative revenue claim through 2027. But this glosses over decelerating growth risks: Q4 FY25 revenue growth was already slowing to 78% YoY from triple digits, inference workloads may require fewer GPUs per task than training, and hyperscalers like Meta/Amazon are developing custom ASICs to cut costs. Geopolitics (China export curbs) caps ~20% of sales. Margins face an HBM supply squeeze and competition from AMD's MI350. True value hinges on a 40%+ CAGR holding; otherwise, the stock rerates to the semiconductor average of ~18x.
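The rerating scenario above is simple to quantify: holding forward EPS fixed, the price move from a multiple change is just the ratio of the multiples. A minimal sketch, using the multiples quoted in this discussion (21x today, ~18x as a semiconductor-sector average, 15x as the bear case) -- illustrative figures, not forecasts:

```python
# Price impact of a forward P/E rerating, holding forward EPS constant.
# Multiples (21x, 18x, 15x) are the ones quoted in the panel discussion.

def rerating_return(current_multiple: float, new_multiple: float) -> float:
    """Percent price change if the forward P/E rerates and EPS is unchanged."""
    return new_multiple / current_multiple - 1

print(f"21x -> 18x: {rerating_return(21, 18):.1%}")  # -14.3%
print(f"21x -> 15x: {rerating_return(21, 15):.1%}")  # -28.6%
```

In other words, even with earnings flat, a drift down to the sector-average multiple implies a mid-teens drawdown, and the 15x bear case a nearly 30% one -- which is why the panel treats the 21x multiple as conditional on growth holding up.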
If Blackwell Ultra and Rubin deliver annual architecture leaps with sticky ecosystem lock-in, Nvidia could sustain 50%+ growth and 75% margins, making 21x undervalued versus Amazon's 35x despite slower growth.
"Hyperscaler ROI focus may *extend* capex duration by forcing parallel training + inference buildouts, not compress it."
Gemini flags the hyperscaler capex cycle risk sharply, but undersells one detail: Meta and Amazon are *already* building inference clusters at scale. Their ROI pressure isn't hypothetical—it's live now. If Q1 2025 guidance shows inference revenue accelerating while training flattens, that actually *extends* the capex cycle (they need both), not shortens it. That's the inflection nobody's pricing in yet.
"The shift toward custom ASICs by hyperscalers threatens Nvidia's pricing power and margins, making current valuations risky even if total inference demand grows."
Claude, your point on inference clusters is vital, but you're missing the 'software tax' risk. If hyperscalers shift to custom ASICs, Nvidia loses the CUDA moat that currently forces high-margin hardware demand. Even if inference volume explodes, the hardware commoditization risk is real. As Grok noted, if growth drops to 40%, a 21x multiple is actually expensive compared to legacy semis. We are over-indexing on revenue totals and ignoring the erosion of pricing power.
"Model and systems-level efficiency gains could decouple inference volume growth from GPU unit demand, compressing Nvidia's revenue even if inference workloads surge."
I push back on Claude: inference growth doesn't automatically prolong hyperscaler GPU capex. Rapid model- and systems-level efficiency gains (quantization, pruning, sparsity, distillation, kernel-level optimizations, and CPU/DPU offloading) are reducing GPUs-per-inference. Hyperscalers can scale inference throughput far faster than hardware spending grows, compressing units sold and ASPs. So rising inference demand could coincide with lower GPU revenue growth and margin pressure for Nvidia.
"Nvidia's software moat in inference optimization sustains demand despite falling GPUs-per-inference."
ChatGPT overlooks Nvidia's inference software dominance: TensorRT-LLM and NIM microservices enable hyperscalers to deploy at scale, creating lock-in even as quantization/pruning cut GPUs-per-query. Agentic AI could balloon inference endpoints 10x+ (from chatbots to autonomous agents), outpacing efficiency gains. This connects Gemini's CUDA risk to a bullish offset—hardware commoditization hurts less if software TAM hits $10B+ annually.
Panel Verdict
No consensus.