Microsoft Shifts to Custom Chips for AI, Reducing Nvidia and AMD Reliance
Microsoft is shifting to predominantly use its own custom chips, such as the Azure Maia accelerators, for AI workloads in its data centers, reducing reliance on Nvidia and AMD in order to control costs and optimize performance. This vertical integration strategy challenges industry rivals and could reshape AI infrastructure dynamics.
Written by Ava Callegari
Saturday, October 4, 2025
In a bold pivot that could reshape the semiconductor industry, Microsoft is charting a course toward self-reliance in artificial intelligence hardware. The tech giant’s chief technology officer, Kevin Scott, revealed during a recent appearance at Italian Tech Week that the company aims to predominantly use its own custom-designed chips for powering AI workloads in its data centers. This move signals a strategic departure from heavy dependence on external suppliers like Nvidia and AMD, driven by the need to control costs, optimize performance, and streamline system integration.
Scott emphasized that Microsoft’s ambition isn’t rooted in ideology but in practical necessities. With the explosive growth of AI demanding unprecedented computational power, the company has been quietly developing its Azure Maia accelerators and Cobalt processors. These in-house solutions are tailored specifically for the demands of training and running large language models, potentially offering Microsoft greater flexibility in scaling its Azure cloud services.
Continues at webpronews.com