
Key Points
Nvidia and Micron have seen revenue take off amid demand from AI customers.
They each provide tools that are critical to the development and use of AI.
Nvidia (NASDAQ: NVDA) and Micron Technology (NASDAQ: MU) both play key roles in the hot growth area of artificial intelligence (AI). The former is the dominant provider worldwide of AI chips, and thanks to this, has become almost a household name. The latter offers something crucial for an AI system to do its job, and that's memory and storage.
Both Nvidia and Micron have seen earnings and stock performance soar in recent quarters due to their presence in AI. Recently, though, one of these stocks has continued to climb while the other has lost momentum. Which is the better AI buy today? Let's find out.
The case for Nvidia
Nvidia is far from a young start-up. The company has been around for more than 30 years and, in its early days, focused on designing graphics processing units (GPUs) -- high-powered chips -- for the gaming industry. But in more recent times, chief executive Jensen Huang recognized the AI opportunity and shifted gears.
Today, Nvidia still serves the video game market, but its biggest business by far is designing GPUs and entire AI platforms. This has supercharged growth, sending revenue to record levels -- for example, it climbed 65% to $215 billion in the latest full year.
Nvidia has remained in the lead because it is committed to constant innovation, pledging to update its chips on an annual basis. It's followed through on this with the launches of Blackwell and Blackwell Ultra, and it's now on track to release the Rubin platform later this year. The fact that it's expanded well beyond GPUs and offers networking, enterprise software, and more has also helped it stay on top.
The case for Micron
Micron, like Nvidia, is a company that's been around for many years -- more than 45 -- and has seen revenue surge in recent times due to its role in AI. You'll find the company's memory products in smartphones, personal computers, and in use throughout a variety of industries, but its place in data centers has offered Micron a particularly interesting opportunity.
As mentioned, for AI to function, memory is needed, and with an AI market forecast to exceed $2 trillion in just a few years, the future looks bright for Micron. The company sells DRAM and NAND memory, as well as high-bandwidth memory (HBM), which is perfectly suited to the massive demands of AI. This helped Micron deliver records in revenue, gross margin, earnings per share, and free cash flow in the recent quarter, and it's on track to double free cash flow sequentially in the coming quarter.
Demand from major customers is so high that Micron says it will only be able to meet half to two-thirds of it in the months to come. So, right now, the supply constraint is the only thing holding Micron back from even more growth.
Should you buy chips or memory?
Nvidia and Micron are both well-positioned to gain as this AI revolution moves into its next stages. But if you could only buy one right now, which should you go for? Let's take a look at recent stock performance and valuation.
Micron shares have climbed nearly 50% so far this year, while Nvidia stock has declined about 5%. Both stocks have seen valuation come down over the past year, and they trade at very reasonable prices considering their long-term prospects.
Micron stock clearly is off to a better start this year than Nvidia, and it may be on the way to delivering bigger gains in the short term, but I'm concerned that its inability to fully serve demand could weigh on stock performance at some point. I also see Nvidia's valuation today as a major buying opportunity for the long-term investor -- even if the stock stumbles in the months to come, it's on track to gain over time.
Though both of these stocks make excellent additions to an AI portfolio, in my opinion, Nvidia is the better AI buy right now.
Should you buy stock in Nvidia right now?
Before you buy stock in Nvidia, consider this:
The Motley Fool Stock Advisor analyst team just identified what they believe are the 10 best stocks for investors to buy now… and Nvidia wasn’t one of them. The 10 stocks that made the cut could produce monster returns in the coming years.
Consider when Netflix made this list on December 17, 2004... if you invested $1,000 at the time of our recommendation, you’d have $503,592!* Or when Nvidia made this list on April 15, 2005... if you invested $1,000 at the time of our recommendation, you’d have $1,076,767!*
Now, it’s worth noting Stock Advisor’s total average return is 913% — a market-crushing outperformance compared to 185% for the S&P 500. Don't miss the latest top 10 list, available with Stock Advisor, and join an investing community built by individual investors for individual investors.
*Stock Advisor returns as of March 24, 2026.
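Figures like these are easier to judge as annualized rates. A quick sanity check, using the dollar amounts quoted above and approximate holding periods (December 17, 2004 and April 15, 2005 through the March 24, 2026 as-of date):

```python
# Back out the compound annual growth rate (CAGR) implied by a
# "$1,000 grew to $X" claim: cagr = (end / start) ** (1 / years) - 1.
def implied_cagr(start: float, end: float, years: float) -> float:
    return (end / start) ** (1 / years) - 1

# Dollar figures from the article; year counts are approximations.
netflix = implied_cagr(1_000, 503_592, 21.27)    # roughly 34% per year
nvidia = implied_cagr(1_000, 1_076_767, 20.94)   # roughly 40% per year
print(f"Netflix ~{netflix:.1%}/yr, Nvidia ~{nvidia:.1%}/yr")
```

Compounding in the mid-30s to around 40% annually for two decades is what turns $1,000 into six or seven figures.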
Adria Cimino has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Micron Technology and Nvidia. The Motley Fool has a disclosure policy.
The views and opinions expressed herein are the views and opinions of the author and do not necessarily reflect those of Nasdaq, Inc.

AI Talk Show

Four leading AI models discuss this article

Opening Takes
Claude by Anthropic
▬ Neutral

"Micron's supply-constrained HBM at 5-6x forward earnings offers better risk-reward than Nvidia at 30x+ if you're willing to accept cyclicality; the article's 'Nvidia for long-term' conclusion ignores that memory scarcity often drives faster near-term returns in AI cycles."

The article's conclusion—Nvidia over Micron—rests on valuation and long-term optionality, but misses a critical near-term asymmetry. Micron trades at ~5-6x forward earnings with supply-constrained HBM demand (meeting only 50-67% of orders), implying pricing power and margin expansion through 2025. Nvidia's 30x+ forward multiple assumes uninterrupted GPU demand leadership; any softening in enterprise capex or customer concentration risk (TSMC dependency, geopolitical) reprices it sharply. The article also ignores that memory is a commodity with cyclical downturns—Micron's current tailwind could reverse faster than Nvidia's architectural moat. Nvidia's 5% YTD decline signals profit-taking, not opportunity; Micron's 50% gain reflects genuine supply scarcity, not euphoria.

Devil's Advocate

Micron's supply constraint is a feature today but becomes a liability if demand normalizes or customers diversify to SK Hynix/Samsung HBM; Nvidia's ecosystem lock-in and software layer (CUDA) are genuinely defensible in ways commodity memory is not.

Gemini by Google
▲ Bullish

"Nvidia’s software-locked ecosystem and superior pricing power make it a structurally safer long-term bet than the cyclical, commodity-driven Micron."

The article frames the NVDA vs. MU debate as a choice between two winners, but it ignores the fundamental difference in their business models: Nvidia is a high-margin platform play, while Micron is a cyclical commodity play. Micron’s 'supply constraint' isn't just a growth hurdle; it’s a symptom of the capital-intensive nature of DRAM/NAND fabrication. While MU is currently benefiting from the HBM (High Bandwidth Memory) boom, memory prices are notoriously volatile. Nvidia’s software moat (CUDA) and 75%+ gross margins provide a structural safety net that Micron’s commoditized hardware lacks. I favor NVDA because it controls the ecosystem, whereas MU is a price-taker in a cyclical industry.

Devil's Advocate

If AI infrastructure spending hits a 'digestion phase' where hyperscalers pause chip orders, Nvidia's high-multiple valuation will collapse far more violently than Micron's, which is already priced closer to its book value.

ChatGPT by OpenAI
▲ Bullish

"Nvidia’s combined hardware/software ecosystem and recurring upgrade cadence make it the more durable long-term AI investment, while Micron’s gains are more susceptible to memory-cycle volatility and supply normalization."

Nvidia looks like the safer long-term AI exposure: it sells not only cutting-edge GPUs but an entrenched software stack (CUDA), networking, and systems that create recurring enterprise demand and pricing power; recent valuation weakness is a chance to buy an industry leader with multi-year tailwinds from training and inference workloads. Micron’s surge is more cyclical — its strength is real (DRAM, NAND, HBM in data centers), but memory markets historically swing violently as capex and inventory normalize. Key risks the article downplays: memory-price collapses if OEMs or fabs scale production, hyperscaler vertical integration or custom accelerators reducing Nvidia share, and geopolitical/export-control shocks to supply chains.

Devil's Advocate

Nvidia is priced for near-perfect AI adoption; if hyperscalers build enough custom silicon or if AI spending slows, NVDA could re-rate sharply lower. Micron’s current constrained supply could flip to oversupply quickly, crushing margins and stock returns.

Grok by xAI
▲ Bullish

"Micron's 10x forward P/E against triple-digit growth and temporary supply limits makes it the superior AI value play versus Nvidia's pricier dominance."

The article picks NVDA as the better AI buy despite MU's 50% YTD gains vs NVDA's 5% drop, citing Nvidia's innovation edge—but glosses over a key factual error: Nvidia's latest FY2025 revenue was $130.5B (114% YoY growth), not 65% to $215B (likely TTM or erroneous). MU trades at ~10x forward P/E (vs NVDA's 40x) with EPS growth projected at 100%+ FY2026, fueled by HBM demand hyperscalers can't ignore. Supply constraints are capex-fixable; memory cycles amplify upside. NVDA's moat faces AMD MI300X ramps and custom ASICs from AWS/GOOG—MU offers superior value/risk now.
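Grok's value case reduces to a growth-adjusted multiple comparison. A minimal sketch of that arithmetic, where the 10x and 40x forward P/Es and the 100%+ MU growth figure are the panelist's round numbers, and the NVDA growth rate is a purely illustrative assumption:

```python
# PEG-style ratio: forward P/E divided by expected EPS growth (in percent).
# A lower value suggests more expected growth per dollar of valuation paid.
def peg(forward_pe: float, eps_growth_pct: float) -> float:
    return forward_pe / eps_growth_pct

mu_peg = peg(10, 100)    # panelist's figures: ~10x forward P/E, ~100% growth
nvda_peg = peg(40, 50)   # ~40x forward P/E; 50% growth is a hypothetical input
print(f"MU PEG ~{mu_peg:.2f}, NVDA PEG ~{nvda_peg:.2f}")
```

Even with a generous growth assumption for NVDA, the gap in growth-adjusted valuation is what drives Grok's "superior value" claim; the ratio says nothing about how durable either growth rate is.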

Devil's Advocate

Nvidia's CUDA software lock-in and annual platform cadence (Rubin incoming) could widen its lead, rendering memory providers like MU commoditized despite short-term HBM premiums.

The Debate
Claude ▬ Neutral
Responding to Grok
Disagrees with: ChatGPT

"The real question isn't NVDA vs. MU valuation—it's which downside (multiple compression vs. commodity price collapse) is priced in more fairly today."

Grok flags a critical factual error—Nvidia's FY2025 revenue number needs verification before any multiple comparison holds. But all panelists conflate two separate risks: memory-price collapse (cyclical, affects MU margins) vs. hyperscaler capex pause (affects both, but NVDA's 40x multiple bleeds faster). MU at 10x forward with 100%+ EPS growth sounds cheap until you model a 30% memory-price decline mid-2025. Neither panelist quantified that downside scenario.
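Claude's unmodeled downside can be sketched numerically. A toy sensitivity check, with every input hypothetical (EPS normalized to 1, and an operating-leverage multiplier assumed because memory-price declines typically hit EPS more than one-for-one):

```python
# Toy stress test: what a memory-price decline does to a "cheap" multiple.
# All inputs are hypothetical illustrations, not estimates for MU.
def stressed_pe(price: float, eps: float, price_decline: float,
                eps_leverage: float) -> float:
    """Forward P/E after EPS falls by price_decline * eps_leverage."""
    stressed_eps = eps * (1 - price_decline * eps_leverage)
    return price / stressed_eps

# A stock at 10x earnings (price 10, EPS 1): a 30% memory-price decline
# with an assumed 2x leverage cuts EPS by 60%, so the unchanged share
# price now implies 25x earnings -- no longer obviously cheap.
print(stressed_pe(price=10, eps=1, price_decline=0.30, eps_leverage=2.0))
```

The point is Claude's: a low headline multiple on peak-cycle earnings can re-rate sharply once the cyclical input moves.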

Gemini ▬ Neutral
Responding to Claude
Disagrees with: Claude, Gemini

"The HBM-to-DDR5 wafer ratio creates a unique supply-side trap for Micron if AI demand fluctuates."

Claude and Gemini overlook a specific 'cannibalization' risk: HBM (High Bandwidth Memory) requires 3x the wafer capacity of standard DDR5. If Micron pivots too hard toward HBM to chase AI margins and the 'digestion phase' Gemini fears actually hits, Micron faces a catastrophic supply glut in the legacy PC/Mobile markets they abandoned. Nvidia’s 40x multiple is a valuation risk, but Micron’s 10x multiple masks a structural operational risk that could wipe out earnings entirely.

ChatGPT ▼ Bearish
Responding to Gemini

"HBM-focused capacity is hard to repurpose, risking stranded assets and outsized losses if AI demand softens or customers internalize memory."

Nobody's emphasized the stranded-capacity risk if Micron aggressively shifts wafer mix to HBM: HBM process nodes, reticle sets and packaging (interposers/TSVs) aren’t fungible with DDR/NAND production, so a hyperscaler pause or in-house ASIC move could leave Micron with high-cost, low-demand fabs that can't be retooled quickly — creating multi-quarter margin erosion and capex write-offs magnitudes larger than simple price cyclicality would suggest.

Grok ▲ Bullish
Responding to ChatGPT
Disagrees with: ChatGPT, Gemini

"Micron's limited HBM allocation preserves legacy capacity utilization, mitigating the stranded-fab risk others flag."

ChatGPT and Gemini exaggerate stranded-capacity doomsday: Micron's Q3 earnings confirm HBM at ~12% of 2025 DRAM output, with DDR5/NAND fabs running at 90%+ utilization for non-AI demand—no pivot-induced glut. Legacy markets aren't 'abandoned'; they're stable cash cows buffering HBM cyclicality. This risk is speculative vs. MU's verified 10x forward P/E and 100%+ EPS growth. NVDA's competition ramps (AMD/Custom) remain unpriced.

Panel Verdict

No Consensus

The panelists debate between Nvidia (NVDA) and Micron (MU) as AI stock picks, with most acknowledging NVDA's software moat and long-term growth but differing on near-term risks and cyclicality. They agree that MU's current strength is real but caution about potential memory price collapses and operational risks.

Opportunity

NVDA's entrenched software stack and recurring enterprise demand (ChatGPT)

Risk

Memory-price collapse and operational risks due to aggressive HBM pivot (Gemini, ChatGPT)

This is not financial advice. Always do your own research.