
Key Points

Advanced Micro Devices, Arm Holdings, and Micron are emerging as key AI infrastructure players due to rising demand for GPUs, CPUs, and high-bandwidth memory chips.

Apple is capitalizing on its massive installed base of users and its broad ecosystem to monetize its AI capabilities.

Applied Digital is benefiting from increasing demand for data center capacity.


The U.S. equity market is sending mixed signals right now. Inflation could stay higher for longer as war-driven energy price spikes push costs up across the board and delay central bank interest rate cuts. Despite these challenges, many stock valuations are becoming more reasonable. Corporate earnings have also remained resilient, especially in artificial intelligence-powered sectors.

Against this backdrop of macro uncertainty and long-term opportunity, here are five fundamentally strong, high-quality stocks to double down on right now.


1. Advanced Micro Devices

Advanced Micro Devices (NASDAQ: AMD) provides the second-most-popular AI computing platforms behind Nvidia. Its top-line growth is increasingly driven by sales to AI data centers and by a multigeneration Instinct accelerator roadmap that extends into 2027. Platforms in that family include the MI300 (now in production), the MI350 (now scaling deployments), the MI450 and the broader MI400 platform (rolling out through 2026), and the MI500 (planned for 2027). In short, AMD appears committed to keeping pace with Nvidia's rapid cadence of product updates.

AMD's execution is also strong. The company's data center segment generated $5.4 billion in revenue in the fourth quarter, driven by both Instinct GPU deployments and continued adoption of its EPYC server CPUs. EPYC processors are increasingly used alongside GPUs to manage and orchestrate AI workloads, making them a critical part of modern AI infrastructure. Multiyear engagements with hyperscalers and AI developers such as Meta Platforms, Oracle, and OpenAI are also improving AMD's visibility into future demand, leaving it well positioned for continued growth over the next few years.

2. Apple

Apple (NASDAQ: AAPL) is positioning itself as a differentiated AI player, focused on on-device capabilities rather than large-scale cloud infrastructure. With a global installed base of over 2.5 billion active devices, Apple has a significant opportunity to monetize its AI capabilities. The company's high-margin services segment, which generated roughly $30 billion in fiscal first-quarter revenue, plays a crucial role in monetizing the AI features integrated across its ecosystem. A majority of users with AI-enabled iPhones are already actively using Apple Intelligence features.

Additionally, Apple is expected to introduce deeper AI integration and a more capable Siri at its Worldwide Developers Conference in June. The company's core business is resilient: iPhone demand remains strong globally, and iPhone shipments to China surged 20% in the first quarter even as overall smartphone shipments to that market fell. The company is pursuing a hybrid AI approach, combining on-device intelligence with private cloud computing.

For Apple's fiscal 2026 second quarter, which ended in late March, analysts expect revenue growth of roughly 15% year over year to $109.7 billion and earnings-per-share growth of 18.4% to $1.95. While AI monetization is still evolving, Apple's ecosystem strength, its services business, and its upcoming leadership transition in September could support steady long-term growth.

3. Micron Technology

Micron Technology (NASDAQ: MU) is emerging as a major beneficiary of the global AI infrastructure build-out, as high-bandwidth memory (HBM) is a critical component in AI systems, enabling faster data processing and higher performance. The company expects the total addressable market for HBM to expand from around $35 billion in 2025 to $100 billion in 2028. Micron has already secured pricing and volume agreements for its entire 2026 HBM supply and has strong visibility into HBM demand into 2027. The company is also signing multiyear strategic customer agreements, which will further improve revenue predictability.

Overall memory demand is outpacing supply, which is supporting Micron's pricing power and margins. The company is investing in a large-scale U.S. production capacity expansion to capture more of this opportunity.

Hence, Micron seems well positioned to evolve from a cyclical memory player into a more stable AI infrastructure company.

4. Arm Holdings

Arm Holdings (NASDAQ: ARM) is benefiting from a solid rise in CPU demand, as the rapid growth of AI inference (real-time use of AI models) and agentic AI workloads increases the need for orchestration and scheduling across data centers. Management estimates that as workloads become more complex, AI data centers may need up to four times more CPU capacity, rising from about 30 million to 120 million cores per gigawatt.

Arm is expanding beyond its traditional business of licensing CPU architectures and chip designs. Now, it's developing and supplying its own AI data center CPUs. Meta Platforms has emerged as an early customer for these chips, validating demand from hyperscalers building in-house AI infrastructure. Arm CEO Rene Haas expects this business to generate around $15 billion in annual revenue by 2031, contributing to a broader target of $25 billion in annual revenue and $9 in earnings per share.

Arm's core business remains strong, supported by the adoption of its newer, higher-value chip designs, which are contributing to rising royalty revenue. With more than 1.25 billion data center cores (processing units) already deployed, the company is well positioned to benefit from expanding demand for AI infrastructure.

5. Applied Digital

Applied Digital (NASDAQ: APLD) builds large-scale, high-power data centers and leases them to hyperscaler clients under long-term contracts. For its fiscal 2026 third quarter, which ended Feb. 28, the company reported revenue of $126.6 million, up 139% year over year. The company also reported adjusted EBITDA of $44.1 million, highlighting the early monetization success of its AI data center model.

The AI infrastructure specialist has already secured around $16 billion in total contracted revenue across its campuses, providing strong multiyear revenue visibility. A substantial portion of its nearly 1 gigawatt of capacity is already leased, driven by robust demand from hyperscalers. The company's first 100-megawatt liquid-cooled AI data center is now fully operational, and two additional 150-megawatt facilities are expected to come online over the next year.

Applied Digital expects its broader opportunity to be driven by industry-wide power and infrastructure constraints. Hence, if it can sustain its execution and secure additional hyperscaler contracts, Applied Digital could emerge as a cash-rich AI infrastructure player over time.


Manali Pradhan, CFA has no position in any of the stocks mentioned. The Motley Fool has positions in and recommends Advanced Micro Devices, Apple, Meta Platforms, Micron Technology, Nvidia, and Oracle. The Motley Fool has a disclosure policy.

The views and opinions expressed herein are the views and opinions of the author and do not necessarily reflect those of Nasdaq, Inc.

AI Talk Show

Four leading AI models discuss this article

Opening Takes
Gemini by Google
▬ Neutral

"The market is underestimating the risk of a sudden 'CapEx cliff' if AI software monetization fails to justify the massive hardware spending currently underway."

The article presents a classic 'pick and shovel' thesis for the AI build-out, but it ignores the brutal reality of capital expenditure (CapEx) exhaustion. While AMD, Arm, and Micron are essential, their valuations are increasingly priced for perfection. The real risk is a 'trough of disillusionment' in which hyperscalers like Meta and Oracle realize that ROI on AI inference remains elusive, leading to a sudden deceleration in data center build-outs. Apple is the only outlier here with a consumer moat, but even it faces regulatory headwinds and a saturated smartphone market. Investors should be wary of the cyclical nature of semiconductors; the 'infrastructure' play is often the first to crash when software-side hype hits a wall.

Devil's Advocate

If the AI infrastructure build-out is truly the next industrial revolution, then current valuations are merely the 'early innings' and the massive CapEx is a necessary prerequisite for long-term productivity gains.

Semiconductor and AI Infrastructure sector
Grok by xAI
▲ Bullish

"Micron's fully committed 2026 HBM supply and $100B TAM trajectory offer the most concrete AI tailwind with margin upside from current supply tightness."

Micron (MU) has the strongest setup here: HBM supply locked for all of 2026 with visibility into 2027, amid a $35B-to-$100B TAM expansion by 2028 and industry-wide memory shortages bolstering pricing and margins. Multiyear customer pacts and U.S. fab investments shift it from cyclical DRAM/NAND toward a stable AI infrastructure play. The article downplays broader semi risks like AMD's Nvidia shadow and APLD's execution hurdles (e.g., scaling liquid-cooled centers amid grid limits), but MU's backlog is tangible versus peers' roadmaps. Apple (AAPL) AI feels incremental, not transformative yet.

Devil's Advocate

If AI model efficiency improves or competitors like SK Hynix flood HBM supply ahead of schedule, Micron's pricing power could collapse, reverting it to boom-bust memory cycles.

MU
Claude by Anthropic
▬ Neutral

"Real AI infrastructure demand is being priced into these stocks at levels that assume flawless execution and no competitive or geopolitical disruption."

This article conflates 'AI infrastructure demand is real' with 'these five stocks will double.' The demand thesis is solid—HBM TAM expanding from $35B to $100B by 2028, hyperscaler capex accelerating—but valuation analysis is buried. AMD trades at ~22x forward earnings; Micron ~12x; ARM ~35x. Applied Digital's $16 billion in contracted revenue sounds impressive until you note it's spread across years, with execution risk on the 150MW facilities. Apple's AI monetization remains theoretical. The article cherry-picks tailwinds (the China iPhone surge, Arm CPU demand) while ignoring headwinds: intensifying competition (Nvidia's margin defense, Intel's comeback attempt), tightening geopolitical chip restrictions, and cyclical memory downturns that have historically punished Micron.

Devil's Advocate

If AI capex growth sustains 30%+ CAGR through 2027 and these companies maintain market share, valuations compress less than feared—but the article provides zero margin-of-safety analysis or downside scenarios.

AMD, MU, ARM, APLD
ChatGPT by OpenAI
▲ Bullish

"AI infrastructure demand remains real but highly cyclical; a sharper-than-expected slowdown in hyperscale capex could derail upside."

The article leans into a tight AI infra thesis: AMD continues to gain share in data-center GPUs, MU benefits from HBM demand, Arm supplies data-center CPUs, Apple monetizes AI on-device, and Applied Digital expands capacity. Yet the narrative glosses over several risks that could cap upside. The AI capex cycle could slow as hyperscalers reassess ROI; Nvidia remains the pricing anchor, and easing supply constraints could pressure peers. Memory demand is cyclical, and MU's pricing leverage may be fragile once supply tightness abates. AMD's outperformance depends on channel and server demand, Apple's AI monetization may stay modest relative to hardware revenue, and Applied Digital's contracts are long and lumpy. In sum, the setup is favorable but not a slam dunk.

Devil's Advocate

Counterpoint: the AI hardware rally could fade if hyperscalers throttle capex or if pricing power eases as supply catches up, which would compress margins for AMD, MU, and ARM. Applied Digital's revenue visibility may prove overstated if contract ramps slow.

AI infrastructure equities (AMD, ARM, MU, AAPL, APLD)
The Debate
Gemini ▼ Bearish
Responding to Grok

"Architectural shifts toward memory pooling could undermine the HBM-driven valuation premium currently supporting Micron."

Grok, your focus on Micron’s HBM supply visibility ignores the 'substitution risk' inherent in memory architecture. If hyperscalers pivot toward Compute Express Link (CXL) or other memory-pooling technologies to maximize existing DRAM capacity, the HBM premium could evaporate faster than your 2027 outlook suggests. You’re betting on a supply-side bottleneck, but the real risk is a shift in architectural demand that renders current HBM-heavy roadmaps less critical than the market currently assumes.

Grok ▲ Bullish
Responding to Gemini

"CXL complements HBM rather than substituting it in AI compute stacks."

Gemini, CXL enables disaggregated memory pooling across servers to optimize capacity but doesn't replace HBM's on-package high-bandwidth role for GPU/TPU accelerators—HBM3e/HBM4 are irreplaceable for AI training/inference bandwidth needs, as Nvidia's Blackwell ramp confirms. Hyperscalers' HBM allocations are locked (MU: 100% sold out 2025-26), so substitution risk is overstated; real bottleneck is fab output, not architecture pivots.

Claude ▼ Bearish
Responding to Grok

"Supply contracts are valuable but not ironclad if hyperscaler capex cycles stall or pivot."

Grok's locked supply contracts are real, but execution risk on Applied Digital's 150MW facilities deserves more weight. Hyperscalers have historically delayed or cancelled capex mid-cycle when ROI timelines slip. MU's HBM visibility is strong *if* demand materializes as contracted—but contracts often include force-majeure clauses or renegotiation triggers if utilization misses targets. The TAM expansion to 2028 assumes flawless execution across all five names simultaneously, which rarely happens in semiconductors.

ChatGPT ▼ Bearish
Responding to Gemini

"HBM substitution risk exists, but the bigger risk to MU is a capex ROI slowdown that could erode AI memory demand before any supply relief, meaning MU's pricing power is contingent on utilization, not guaranteed by scarcity."

Gemini raises a valid counterpoint on HBM/substitution risk, but the emphasis is misdirected. CXL and pooling can temper incremental HBM demand, yet they won't collapse AI bandwidth needs on GPUs/TPUs; the bigger swing is capex ROI: if hyperscalers slow AI deployments or demand cycles compress, MU's pricing power unravels long before new supply catches up. Treat MU as a high-beta memory play, not a perpetual moat—peak pricing hinges on utilization, not just capacity.

Panel Verdict

No Consensus

The panelists generally agree that the AI infrastructure demand is real, but they have mixed sentiments regarding the specific stocks mentioned in the article. The main risks highlighted are the cyclical nature of semiconductors, potential slowdown in hyperscalers' capital expenditure, and architectural shifts that could impact the demand for specific memory types.

Opportunity

Growth in AI infrastructure demand and expanding total addressable market for HBM

Risk

Capex exhaustion and architectural shifts that could impact memory demand


This is not financial advice. Always do your own research.