
Qualcomm’s AI250 is slated to roll out in 2027 and will include “an innovative memory architecture based on near-memory computing,” Qualcomm said, which will boost AI inference efficiency and performance “by delivering greater than 10x higher effective memory bandwidth” while lowering power consumption.

Read: The surprising stocks leading the tech sector this year thanks to an AI renaissance

The stock is up 12.4% in midday trading and tracking toward its highest close since July 23, 2024, when it finished at $193.35, according to Dow Jones Market Data. The company’s market cap was up $22.8 billion on Monday, putting it on pace for its largest one-day market-cap gain since December 1999, the data showed.

The rack solutions for both the AI200 and AI250 include direct liquid cooling; the Peripheral Component Interconnect Express, or PCIe, standard for high-speed interface connections for scaling up; Ethernet for scaling out; and “confidential computing” to support running AI models securely, Qualcomm said.

Additionally, Qualcomm said Saudi Arabian AI firm Humain will be the first customer to deploy its AI200 and AI250 rack solutions starting next year, with a goal of 200 megawatts’ worth of systems.

Bernstein’s Stacy Rasgon said in a Monday note that Qualcomm is not new to selling AI accelerators, having announced its first generation in 2019. “These new parts appear to be next-gen, and are the first to bring a rack-scale architecture to the company’s offering,” Rasgon said about the AI accelerator cards and racks.

“For whatever reason the company’s earlier efforts drew little to no attention but today’s announcement appears to be garnering more,” he added, pointing to the stock’s move, which he said is “significantly outperforming even on a strong market day.”

Still, it’s currently difficult to gauge the value proposition and opportunity for Qualcomm’s new AI inference offerings, according to the Bernstein team.

Don’t miss: How quantum computing could become the next frontier in national security

With rack-level power consumption of 160 kilowatts, Humain’s plan for 200 megawatts of Qualcomm’s systems translates to about 1,250 racks, Rasgon said, noting that it’s not clear how many AI200 cards will be in each rack or how much each rack costs.

Rasgon said that Qualcomm’s 200-megawatt deal with Humain “is dwarfed by many of the other deals,” such as those between AI startup OpenAI and Nvidia Corp.

Qualcomm’s new data-center offerings put it in closer competition with Nvidia and AMD, which are facing increasing threats from custom chips developed by Broadcom Inc.

More from MarketWatch: 5 bubble-resistant tech stocks to guard your portfolio from an AI crash