AMD strategy for future compute and AI leadership

November 18, 2025
AMD recently held its 2025 Financial Analyst Day in New York, where it unveiled its long-term strategy for future compute and AI. The company provided insights into new product roadmaps and multi-year financial targets, emphasizing its strong momentum across CPUs, GPUs, and adaptive computing platforms as it gears up for the next phase of growth.

For readers of eeNews Europe, these updates point to significant shifts in the architectures and components that will shape the design of future data-center and AI systems.

Expanding data center and AI leadership

Dr. Lisa Su, chair and CEO of AMD, set the stage by stating, “AMD is entering a new era of growth driven by our leading technology roadmaps and accelerating AI momentum. With the widest range of products and deepening strategic partnerships, AMD is uniquely positioned to lead the next generation of high-performance and AI computing. We see a tremendous opportunity ahead to deliver sustainable, industry-leading growth. We have never been in a better position.”

On the GPU front, AMD reported record adoption of its Instinct MI350 Series accelerators, which are already in use at scale by major hyperscalers like Oracle Cloud Infrastructure. The upcoming “Helios” systems, powered by MI450 GPUs, are set to hit the market in Q3 2026, boasting top-tier rack-scale performance, memory capacity, and bandwidth. A subsequent MI500 family is slated for 2027.

The momentum in CPUs continues as AMD capitalizes on the performance and efficiency enhancements of EPYC processors to gain traction in cloud and enterprise settings. The next-generation “Venice” server CPUs are designed to meet the rising demand for AI-driven infrastructure with increased density and enhanced energy efficiency.

Networking remains a key focus for AMD, with a spotlight on Pensando Pollara and its upcoming “Vulcano” AI NICs, engineered to provide high bandwidth and standards-based flexibility for large-scale AI clusters.

Client, gaming, and adaptive computing momentum

AMD has expanded its AI PC lineup by 2.5 times since 2024, with Ryzen chips now driving more than 250 notebook and desktop platforms and seeing adoption in over half of the Fortune 100 companies. The next-generation “Gorgon” and “Medusa” processors are expected to deliver up to 10 times the AI performance compared to hardware from 2024.

In the embedded and adaptive computing segment, which encompasses FPGAs, embedded x86, and semi-custom silicon, AMD continues to build on over $50 billion in design wins since 2022. The company believes it is well positioned to seize AI-driven growth from the cloud to the edge, while also expanding opportunities in semi-custom and physical AI.

A transformative financial model

AMD is aiming for sustained high growth across all segments, with projections of more than 60% revenue compound annual growth rate (CAGR) in its data center business and over 10% CAGR in Embedded, Client, and Gaming. The company is on course to surpass 50% server CPU revenue share, achieve more than 80% CAGR in data center AI, and secure over 40% client CPU revenue share.

The technology roadmap includes ongoing chiplet and packaging innovations, extended CPU/GPU/NPU roadmaps, and the introduction of 5th-gen Infinity Fabric for large-scale AI systems.

With a growing portfolio and ambitious market-share targets, AMD is positioning itself as a major player influencing the development of next-generation high-performance and AI-centric compute architectures.
