Qualcomm Readies Rack-Scale AI With New Chips And Roadmap

Karl Freund, Contributor · 2025-10-29


Qualcomm logo in Munich, Bavaria. (Photo by Matthias Balk/picture alliance via Getty Images)

Qualcomm's announcement omits many details, but Humain will be a key customer.

Qualcomm has quietly been in the AI data center business since the company introduced the Qualcomm AI100 in 2019. Now, Qualcomm has taken a big step forward, launching a data center AI chip, PCIe cards and integrated racks for energy-efficient inference processing at scale. While the company omitted many details, such as expected performance and memory technology, and did not (yet) include NVLink Fusion for scale-up, it did announce a yearly cadence. Sound familiar? It should, as the industry has evolved to demand rack-scale computing to handle the ever-increasing complexity of inference processing and agentic AI. (Like many AI semiconductor firms, Qualcomm is a client of Cambrian-AI Research.) Let's dive in, see what they announced, and speculate on what they didn't.

The AI200 And AI250

The real story here is memory capacity for large-scale AI inference processing. Most AI chips today enjoy the performance that high-bandwidth memory (HBM) provides, which is an absolute requirement for training large language models. However, the relatively high cost and lower capacity of HBM is creating an opportunity for alternative memory such as DRAM (super large) and SRAM (super fast) for inference processing. And inference is now the faster-growing segment.

The two accelerators arrive on an annual cadence: the first chip, the AI200, will be available some time in 2026, and the AI250 will follow in 2027. The AI200 rack, available in 2026, supports an impressive 768 GB of memory per PCIe card.

It's a bit difficult to handicap the rack's performance, as Qualcomm did not say how many chips will be in the rack, nor what kind of performance is expected per chip. The company gave a total memory footprint of 768 gigabytes per card, and said the 160 kW rack will be liquid-cooled and use PCIe to scale up. The AI100 consumes 150 watts per four-chip card, with 576 MB of on-chip SRAM, 128 GB of LPDDR4x DRAM and PCIe connectivity, so it is not too much of a stretch to guess that the AI200 rack could contain as many as 1,000 chips (a rough back-of-envelope sketch follows below).

The AI250 rack, expected in 2027, will offer a new memory architecture using near-memory computing. The mystery here is how Qualcomm will increase memory bandwidth ten-fold with the AI250 in 2027 while offering the same capacity as the AI200. My guess is that it will use more SRAM to achieve this feat. As mentioned above, Nvidia NVLink will not yet be supported, but that could come in 2028 under the new annual release cadence.

Software Stack

The AI stack for data centers is critical to support developers and early adopters of the AI200 architecture, and it includes most applications, runtime support and system software. One area Qualcomm needs to address is the run-time orchestration needed to optimize a large-scale deployment. Since Nvidia's Dynamo is an open-source platform, it is highly likely that the development team at Qualcomm will be able to address this need rather quickly. The Qualcomm AI Stack offers a fairly complete solution set for AI developers and deployments.

Annual Product Cadence Points To A Significant Investment

Cranking out a new chip every year requires hundreds of millions of dollars annually, so Qualcomm is massively stepping up its investment in data center AI. And its ambitions are quite high with this aggressive inference design.
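To put those figures in context, here is a rough, purely illustrative back-of-envelope in Python. The rack power (160 kW) and per-card memory (768 GB) come from Qualcomm's announcement; the per-card power figure is my own assumption, borrowed from the four-chip AI100 card, and the 70-billion-parameter model is just an example.

```python
# Rough back-of-envelope sketch, not from Qualcomm's announcement.
# Assumption: an AI200 card draws roughly what the four-chip AI100 card does (~150 W).

RACK_POWER_W = 160_000      # Qualcomm's stated rack power envelope
CARD_POWER_W = 150          # AI100 four-chip card power, used here as a stand-in
CARD_MEMORY_GB = 768        # Qualcomm's stated memory per AI200 PCIe card

cards_per_rack = RACK_POWER_W // CARD_POWER_W            # ~1,066 cards, before host and cooling overhead
rack_memory_tb = cards_per_rack * CARD_MEMORY_GB / 1024  # aggregate card memory per rack

# Why per-card capacity matters for inference: weights for a 70B-parameter model
# at FP16 alone need ~140 GB, before any KV cache is counted.
params = 70e9
bytes_per_param = 2  # FP16
weights_gb = params * bytes_per_param / 1e9

print(f"~{cards_per_rack:,} cards fit a 160 kW budget, ~{rack_memory_tb:,.0f} TB of memory per rack")
print(f"70B-parameter FP16 weights: ~{weights_gb:.0f} GB vs. {CARD_MEMORY_GB} GB on one card")
```

Even if the real per-card power lands considerably higher, the takeaway is the same: the design is built around memory capacity for inference rather than raw HBM bandwidth.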
We will update this story as more details are released next year.
