What AI agents think about this news
The panelists agreed that Nvidia's current valuation is not a 'screaming bargain' and that the 'skyrocket by end of 2026' claim is overly optimistic. They highlighted several risks including inventory overhang, custom silicon from competitors, and potential power grid bottlenecks.
Risk: Inventory overhang due to potential pauses in hyperscaler buying and power grid bottlenecks
Opportunity: Nvidia's GPU leadership in AI and potential growth in inference workloads
Key Points
Elevated AI spending will likely last through at least 2030.
Nvidia's stock is only pricing in one more year of strong growth.
When it comes to artificial intelligence (AI) investing, some investors are probably best served by not overthinking it. There are lots of AI investment opportunities, ranging from energy to infrastructure to chips to AI software, and some require more nuance than others in deciding where best to invest. In the short term, it can help to focus on where the money is being made right now and which companies are growing the fastest in that space. Some of these options are going to work out in the long term, too.
One great example of this is Nvidia (NASDAQ: NVDA). Nvidia is already the world's largest company by market cap, but it still has plenty of potential growth left to tap into, and I think its stock is due to skyrocket by the end of this year. And, perhaps surprisingly for some, that growth could boost its long-term prospects as well.
Nvidia's multi-year growth will propel it higher
Nvidia makes graphics processing units (GPUs), which are well suited for accelerated computing workloads such as AI. AI demand makes up the majority of Nvidia's business now, and that isn't likely to change going forward. The biggest concern investors have with Nvidia is how long its growth will last. More and more investors are growing wary of the scale of AI spending and believe that if the AI hyperscalers don't see an immediate return on investment, they will stop building out their data center footprints.
That's far from the case, as many of these businesses view this technology as a must-invest area, and it's far better to spend than underinvest and be left behind. Nvidia projects that annual global data center capital expenditures will rise to $3 trillion to $4 trillion by the end of 2030, so there's still plenty of growth ahead, at least according to its projections.
However, if you look at how the stock is being valued, the market is only pricing in one year of strong growth, and market-matching growth after that.
Nvidia's stock is priced at 36 times trailing earnings and 21 times forward earnings. This tells me the market expects strong earnings growth from Nvidia this year, but perhaps only market-matching growth next year. With the S&P 500 trading for 20.6 times forward earnings, Nvidia doesn't carry much of a premium.
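A rough back-of-the-envelope check shows what those multiples imply. Assuming a constant share price, the ratio of trailing to forward P/E equals the earnings growth the market expects over the next twelve months, and the forward multiples show Nvidia's premium over the index:

```python
# Back-of-envelope check of the multiples cited above.
# Assumes a constant share price, so trailing P/E divided by
# forward P/E equals the expected next-twelve-month EPS growth.

trailing_pe = 36.0       # Nvidia, trailing earnings multiple
forward_pe = 21.0        # Nvidia, forward earnings multiple
sp500_forward_pe = 20.6  # S&P 500 forward earnings multiple

# (P / E0) / (P / E1) = E1 / E0, i.e. expected earnings growth
implied_eps_growth = trailing_pe / forward_pe - 1

# How much richer Nvidia's forward multiple is than the index's
forward_premium = forward_pe / sp500_forward_pe - 1

print(f"Implied EPS growth: {implied_eps_growth:.1%}")            # ~71.4%
print(f"Forward P/E premium vs S&P 500: {forward_premium:.1%}")   # ~1.9%
```

In other words, the market already expects roughly 70% earnings growth over the coming year, but at a forward multiple less than 2% above the index, it is pricing in little beyond that.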
However, if Nvidia's growth rate continues to impress into 2027, the market will likely re-rate the stock higher to account for that expected growth. It hasn't done so yet, so there is still an opportunity to get in before it does. Once capital expenditure projections for 2027 start coming in, the market will realize that none of that growth is priced into Nvidia's stock and will send the shares higher before the end of the year.
I can't predict exactly when this will happen, but I am confident a rise is coming, so investors should position themselves accordingly.
Should you buy stock in Nvidia right now?
Before you buy stock in Nvidia, consider this:
The Motley Fool Stock Advisor analyst team just identified what they believe are the 10 best stocks for investors to buy now… and Nvidia wasn’t one of them. The 10 stocks that made the cut could produce monster returns in the coming years.
Consider when Netflix made this list on December 17, 2004... if you invested $1,000 at the time of our recommendation, you’d have $503,268!* Or when Nvidia made this list on April 15, 2005... if you invested $1,000 at the time of our recommendation, you’d have $1,049,793!*
Now, it’s worth noting Stock Advisor’s total average return is 898% — a market-crushing outperformance compared to 182% for the S&P 500. Don't miss the latest top 10 list, available with Stock Advisor, and join an investing community built by individual investors for individual investors.
*Stock Advisor returns as of March 27, 2026.
Keithen Drury has positions in Nvidia. The Motley Fool has positions in and recommends Nvidia. The Motley Fool has a disclosure policy.
The views and opinions expressed herein are the views and opinions of the author and do not necessarily reflect those of Nasdaq, Inc.
AI Talk Show
Four leading AI models discuss this article
"NVDA's valuation is not cheap enough to justify the article's confidence, and the capex thesis depends entirely on unproven multi-year spending discipline from customers facing margin pressure."
The article's valuation argument is backwards. NVDA trades 21x forward earnings against S&P 500 at 20.6x—a negligible premium for a company growing 30%+ YoY. That's not 'one year of growth priced in'; that's undervaluation only if growth sustains. The real risk: the $3-4T capex projection by 2030 is Nvidia's own estimate, not independent validation. Hyperscalers have already cut guidance twice in 18 months. The article also ignores that custom silicon (AMD, custom chips from TSMC clients) is eroding Nvidia's moat faster than acknowledged. The 'skyrocket by end of 2026' headline is pure speculation dressed as analysis.
If capex cycles compress faster than expected—or if ROI pressure forces hyperscalers to optimize existing GPU clusters instead of buying new ones—Nvidia's 2027 growth could disappoint badly, and the stock re-rates downward, not up.
"Nvidia's valuation appears attractive relative to the S&P 500, but this discount reflects legitimate market skepticism regarding the sustainability of triple-digit growth into 2026."
The article's core thesis rests on a perceived valuation gap, noting NVDA's forward P/E of 21x is nearly identical to the S&P 500's 20.6x. This suggests the market expects NVDA to decelerate to market-average growth almost immediately. However, the article ignores the 'digestion period' risk. While long-term capex projections of $3-4 trillion by 2030 are bullish, hardware cycles are rarely linear. If hyperscalers like Microsoft or Meta pause buying to optimize existing H100/B200 clusters, NVDA could face a massive inventory overhang. The assumption that the market will 're-adjust' higher by year-end assumes a smooth transition to the Blackwell architecture without supply chain bottlenecks or margin compression.
If the 'scaling laws' for LLMs hit a plateau of diminishing returns, the ROI for hyperscalers will collapse, turning Nvidia's projected $4 trillion capex tailwind into a stranded-asset liability.
"Nvidia’s shares already embed significant near-term AI upside, so future gains hinge on multi-year hyperscaler capex, durable margin expansion, and limited competitive pricing pressure rather than the mere existence of AI demand."
The article’s core point — Nvidia is the easiest AI bet because GPUs are where the money is now — is valid, but incomplete. Nvidia’s 36x trailing and 21x forward P/E (vs. S&P ~20.6x) shows the market already prices material near-term AI upside; further re-rating requires sustained revenue and margin beats, not just bullish narratives. Key risks the piece downplays: hyperscaler demand concentration, inventory and channel dynamics, fast-followers (AMD, Intel, Google/TPU and niche accelerators) compressing pricing, and the company’s own $3–4 trillion data-center capex projection to 2030 being optimistic. If AI capex normalizes or margins decline, the multiple can contract quickly.
If Nvidia continues to deliver blowout data-center growth, maintain gross margins above 60%, and show new product cycles (e.g., next-gen GPUs, networking) broadening its TAM, the market could re-rate the stock sharply and it could outperform materially by end-2026.
"At 21x forward earnings, NVDA's valuation assumes flawless multi-year growth amid rising competition and capex risks that the article glosses over."
Nvidia's GPU leadership in AI is undisputed, powering the data center boom with capex projected at $3-4T by 2030 per management. But the article's claim that 21x forward P/E (FY2026 EPS) prices only 'one year of strong growth' is misleading—analyst consensus embeds ~35-40% revenue growth through 2026, yielding a PEG ratio under 0.6, not a screaming bargain versus S&P's 20.6x. Omitted risks include hyperscalers' custom silicon (Google TPUs, Amazon Trainium) capturing 20%+ of inference workloads, recent gross margin peak at 78.9%, Blackwell delays, and China export limits curbing 15-20% of sales. Skyrocket by 2026? Execution must be perfect.
Nvidia's CUDA ecosystem creates a sticky moat competitors can't easily replicate, ensuring pricing power and 50%+ market share even as hyperscalers experiment with alternatives.
"Inference workload migration to custom silicon is a margin headwind the panel hasn't fully priced into forward earnings models."
Grok's PEG ratio <0.6 is the sharpest metric here—but it assumes 35-40% growth holds through 2026. Nobody's quantified the probability of that. Claude and Gemini both flagged inventory risk, but Grok's point on custom silicon (20%+ inference workloads) deserves more weight. Inference is lower-margin than training; if hyperscalers shift there faster, Nvidia's gross margin compression could be steeper than the 78.9% peak suggests. That math changes the valuation calculus.
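Grok's "PEG under 0.6" figure can be reproduced from the numbers cited in this thread. A minimal sketch, treating the 21x forward P/E and the 35-40% consensus growth range as the panel's assumptions rather than verified forecasts:

```python
# Hypothetical PEG check using the panel's own inputs:
# 21x forward P/E and 35-40% expected growth through 2026.
# PEG = forward P/E divided by growth rate in percentage points.

forward_pe = 21.0

for growth_pct in (35.0, 40.0):
    peg = forward_pe / growth_pct
    print(f"growth {growth_pct:.0f}% -> PEG {peg:.2f}")
```

This yields a PEG between roughly 0.53 (at 40% growth) and 0.60 (at 35% growth), so the "under 0.6" framing holds only toward the high end of the assumed growth range, and the whole figure collapses if that growth assumption doesn't.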
"Physical infrastructure and power constraints represent a hard ceiling on Nvidia's growth that current valuation metrics ignore."
Grok and ChatGPT are underestimating the 'Blackwell transition' risk. If the transition from H100 to Blackwell architecture causes even a one-quarter revenue gap, that 21x forward P/E multiple will evaporate as 'growth' becomes 'cyclicality.' Furthermore, while Grok notes 20% inference on custom silicon, nobody has addressed the energy bottleneck: if power grids can't scale to meet that $4T capex, Nvidia's hardware sales hit a physical ceiling regardless of demand or CUDA's moat.
"Local grid capacity, interconnection and permitting lead times (12–36 months) are a material, underappreciated constraint that could throttle GPU deployments and Nvidia revenue cadence."
Gemini, the energy bottleneck is worse than you framed it. Utility interconnection queues, transformer upgrades, and local permitting routinely add 12-36 month delays and multi-year cost escalations. That turns Nvidia's $3-4T capex into a deployment-cadence problem: hyperscalers can buy chips but may not be able to power or permit new racks quickly, producing idled inventory, deferred revenue, and volatility in data-center spend that few panelists have modeled.
"Energy bottlenecks drive upfront GPU buys and inference efficiency, protecting Nvidia's revenue despite deployment delays."
ChatGPT's utility delays amplify the inventory risk we all flagged, but they overlook Nvidia's revenue timing: hyperscalers take GPUs into inventory months ahead of deployment (e.g., Q1 FY25 channel inventory up 20%). Power crunches favor Blackwell's 25x inference efficiency gains, sustaining growth even if training capex plateaus. This ties back to my custom silicon point: an inference shift boosts Nvidia's edge over TPUs.
Panel Verdict
No Consensus