With its AI200 and AI250 racks, Qualcomm challenges Nvidia in the AI market

Le Monde Informatique, October 28, 2025 (translated from French)

The next generation, the AI250, expected in 2027, promises even more significant advances thanks to a near-memory computing architecture. This innovation is expected to increase effective memory bandwidth tenfold while significantly reducing power consumption. The AI250 will also introduce disaggregated compute, delivered via CXL, allowing compute and memory resources to be shared dynamically between cards.
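To put the tenfold-bandwidth claim in perspective, here is a back-of-envelope sketch of why memory bandwidth caps token-generation speed for large language models: in memory-bound decoding, each generated token requires streaming the model weights once. The model size, quantization, and the 500 GB/s baseline are illustrative assumptions, not Qualcomm specifications; only the 10x multiplier comes from the article.

```python
# Sketch: effect of a 10x effective-bandwidth gain on memory-bound
# LLM decoding. All hardware and model numbers are hypothetical.

def decode_tokens_per_sec(params_billion: float,
                          bytes_per_param: float,
                          bandwidth_gb_s: float) -> float:
    """Upper bound on single-stream decode speed for a memory-bound model:
    every token streams all weights once, so rate = bandwidth / model size."""
    bytes_per_token = params_billion * 1e9 * bytes_per_param
    return bandwidth_gb_s * 1e9 / bytes_per_token

# Assume a 70B-parameter model quantized to 1 byte per weight (INT8).
baseline = decode_tokens_per_sec(70, 1.0, 500)       # 500 GB/s baseline card
near_mem = decode_tokens_per_sec(70, 1.0, 500 * 10)  # 10x effective bandwidth

print(f"baseline:    {baseline:5.1f} tokens/s")  # ~7 tokens/s
print(f"near-memory: {near_mem:5.1f} tokens/s")  # ~71 tokens/s
```

Under these assumptions the token rate scales linearly with bandwidth, which is why near-memory designs target exactly this bottleneck.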
Scott Young, senior director of consulting services at Info-Tech Research Group, pointed out that training is compute-bound and tolerant of time, while inference is memory- and/or latency-bound and "presumably continuous," a distinction made concrete in the sketch at the end of this article. The two processes call for different types of servers, he noted: training hardware is expensive, and while it can be used for inference, it is not always optimal for it. Demand for inference will keep growing as companies deploy more agentic AI, he said. "Hardware specifically designed for inference can be more cost-effective and can scale appropriately with the rapid growth in the use of these AI models," Young explained.

This seems like a natural step, Kimball explained, as Qualcomm has already found success with its AI100 accelerator, a capable inference chip. Today, Nvidia and AMD dominate the training market, with CUDA and ROCm enjoying strong customer loyalty. "If I'm a semiconductor giant like Qualcomm, who understands the balance between performance and power so well, this inference market makes perfect sense," Kimball said. He also highlighted the company's plans to return to the data-center processor market with its Oryon CPU, which powers Snapdragon chips and builds on technology from its $1.4 billion acquisition of Nuvia. Ultimately, Qualcomm's decision shows how open the inference market still is, the analyst said. The company, he stressed, has always chosen its target markets wisely and has succeeded in entering them. "It makes sense for the company to decide to get more involved in the inference market," said Kimball. From an ROI perspective, he added, inference will "eclipse" training in both volume and dollars.
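Young's compute-bound versus memory-bound distinction can be sketched with arithmetic intensity, the ratio of arithmetic operations to bytes moved: large training batches reuse each weight many times, while batch-1 decoding touches every weight for little work. The accelerator figures (400 TFLOP/s, 2 TB/s) and matrix shapes below are generic assumptions, not numbers from the article.

```python
# Sketch: why large-batch training is compute-bound while batch-1
# inference decoding is memory-bound. Illustrative numbers only.

def matmul_intensity(m: int, n: int, k: int, bytes_per_el: int = 2) -> float:
    """FLOPs per byte for an (m x k) @ (k x n) matrix multiply,
    counting one read of each operand and one write of the result."""
    flops = 2 * m * n * k
    bytes_moved = bytes_per_el * (m * k + k * n + m * n)
    return flops / bytes_moved

# Hypothetical accelerator balance point: 400 TFLOP/s / 2 TB/s = 200 FLOPs/byte.
balance = 400e12 / 2e12

train = matmul_intensity(m=4096, n=4096, k=4096)  # large training batch
infer = matmul_intensity(m=1,    n=4096, k=4096)  # batch-1 decode step

print(f"training matmul: {train:7.1f} FLOPs/byte -> "
      f"{'compute' if train > balance else 'memory'}-bound")   # ~1365, compute-bound
print(f"decode   matmul: {infer:7.1f} FLOPs/byte -> "
      f"{'compute' if infer > balance else 'memory'}-bound")   # ~1, memory-bound
```

The roughly thousandfold gap in arithmetic intensity is why, as Young notes, hardware built for training is not always the most economical choice for serving inference.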