
Ampere and Qualcomm Collaborate on 3nm 256-Core AI Chip

May 17, 2024


Ampere, the data centre CPU designer, has unveiled a 3nm variant of its data centre AI chip featuring 256 cores, a significant step up in AI processing capability. In collaboration with Qualcomm, Ampere is also working to advance AI inference, pairing Qualcomm Cloud AI 100 inference solutions with Ampere CPUs.

The latest Ampere 3nm 256-core variant is designed to use the same air-cooled thermal solutions as the existing 192-core AmpereOne CPU. Even so, Ampere says it delivers over 40% more performance than any other CPU currently on the market, without the need for complex platform designs. The company's 192-core, 12-channel memory platform is still on track for release later this year.

"We embarked on this journey six years ago because we firmly believed it was the right direction to take," stated Renee James, the CEO of Ampere. "Historically, low power consumption was associated with low performance. However, Ampere has shattered that misconception. We have pushed the boundaries of computing efficiency and delivered performance that surpasses traditional CPUs within an energy-efficient framework."

James further emphasized the importance of sustainable data centre infrastructure, highlighting the need to retrofit existing air-cooled environments with upgraded compute solutions and to build new data centres that meet environmental sustainability goals. Ampere aims to enable both through its technologies.

Ampere advocates for a versatile CPU approach to accommodate various workloads, diverging from the reliance on high-power GPUs. This strategy mirrors the direction taken by Tachyum with its Prodigy universal processor. Jeff Wittich, Chief Product Officer at Ampere, explained, "Our Ampere CPUs are capable of running a wide range of workloads, from popular cloud-native applications to AI tasks. This includes seamlessly integrating AI functionalities with traditional cloud applications like data processing, web serving, and media delivery."
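Wittich's point about running cloud-native and AI workloads side by side on the same processor can be illustrated with a minimal sketch: a single CPU-only service that exposes both a conventional web endpoint and an inference endpoint backed by ONNX Runtime's CPU execution provider. The model file ("model.onnx"), the input shape, and the endpoint names are placeholders for illustration; this is not part of any Ampere or Qualcomm software stack.

```python
# Minimal sketch (illustrative only): one CPU-only process serving both a
# conventional web endpoint and an AI inference endpoint.
# "model.onnx", the input shape, and the endpoint names are placeholders.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

import numpy as np
import onnxruntime as ort

# Load the model onto general-purpose CPU cores only (no GPU or accelerator).
session = ort.InferenceSession(
    "model.onnx",  # placeholder model file
    providers=["CPUExecutionProvider"],
)
INPUT_NAME = session.get_inputs()[0].name


class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            # Traditional cloud-native path: plain web serving, no AI involved.
            self._reply(200, {"status": "ok"})
        elif self.path == "/infer":
            # AI path: a forward pass runs on the same CPU cores.
            x = np.random.rand(1, 3, 224, 224).astype(np.float32)  # dummy input
            outputs = session.run(None, {INPUT_NAME: x})
            self._reply(200, {"top_class": int(outputs[0].argmax())})
        else:
            self._reply(404, {"error": "not found"})

    def _reply(self, code, payload):
        body = json.dumps(payload).encode()
        self.send_response(code)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), Handler).serve_forever()
```

On a many-core part such as AmpereOne, this pattern would typically be scaled out with a thread or process pool so that request handling and inference share the available cores rather than running in a single-threaded loop.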
