Marvell Technology: A Multi-Year AI and Networking Growth Story Comes Into Sharper Focus

Marvell Technology’s long-term positioning in AI compute and high-speed networking appears increasingly solid following a recent management meeting with investors hosted by JPMorgan. Senior leadership reaffirmed that the company’s most important hyperscale engagements remain firmly on track, pushing back against recent market “noise” and reinforcing confidence in Marvell’s multi-year growth trajectory.

Hyperscaler XPU Programs Remain Intact

Management emphasized that Marvell’s custom AI accelerator (XPU ASIC) relationships with Amazon Web Services and Microsoft are not only intact but expanding across multiple technology generations. For AWS, Marvell has already secured purchase orders covering all of calendar year 2026 for the next-generation Trainium 3 XPU ASIC, with the volume ramp expected in the second half of that year. At Microsoft, the 3nm Maia AI XPU program is progressing as planned, with production ramping in the back half of 2026 and extending into 2027.

Importantly, Marvell is already deep into design work on next-generation 2nm XPU programs for both customers. This underscores the depth and durability of these hyperscale partnerships and suggests a long runway of recurring, high-value silicon content.

XPU Attach Opportunities Add a New Growth Lever

Beyond the core XPU silicon, Marvell highlighted significant incremental opportunities from “attach” products such as SmartNICs and CXL controllers. These components are expected to begin contributing meaningfully to revenue next year, with management targeting as much as $2 billion in XPU attach revenue by calendar year 2028. This expansion of content per system materially increases Marvell’s dollar exposure to AI infrastructure growth.

Optical Networking Positioned for Above-Capex Growth

Marvell’s leadership in electro-optical networking continues to strengthen. The company expects its optical business to grow faster than overall data center capital expenditures, driven by strong demand for next-generation PAM4 DSPs. In particular, 1.6T DSP deployments for customers such as NVIDIA and Google are expected to ramp aggressively, while current-generation 800G solutions continue to see broad adoption across multiple GPU and XPU platforms.

Marvell also noted growing traction in active electrical cable (AEC) solutions, where it has begun shipping 100G and 200G per-lane DSPs. This business, already approaching $100 million in annual revenue, is expected to at least double next year.

Scale-Up Networking: A Massive Emerging Opportunity

Looking further ahead, management highlighted the scale-up networking market—encompassing fabrics, switching, and optical connectivity—as a potential $16 billion-plus opportunity by 2030. Marvell believes it is well positioned to capture a meaningful share through a combination of in-house switching products and partnerships such as its work with Celestial on optical fabrics.

These solutions are expected to become increasingly important as hyperscalers move toward rack-scale and system-level AI architectures. Initial revenue contributions are anticipated to begin around calendar year 2027.

A Strong Multi-Year Outlook Despite Market Volatility

Overall, Marvell management expressed frustration with short-term market skepticism, noting that customer commitments, purchase orders, and product roadmaps all point to a robust multi-year growth outlook. With deep hyperscaler relationships, expanding silicon content, and leadership across both compute and networking, Marvell appears well positioned to benefit from the next phase of AI infrastructure build-outs.

For long-term investors, the message from management is clear: Marvell’s AI and data center growth story remains intact, increasingly diversified, and firmly grounded in multi-generational customer commitments.

