Technology Stocks : Semi Equipment Analysis

Recommended by:
Julius Wong
Kirk ©
Return to Sender
From: Sam — 10/5/2025 7:22:26 PM
3 Recommendations — post 95617
 
Microsoft Shifts to Custom Chips for AI, Reducing Nvidia and AMD Reliance

Microsoft is shifting toward predominantly using its own custom chips, such as its Azure Maia accelerators, for AI workloads in its data centers, reducing its reliance on Nvidia and AMD in order to control costs and optimize performance. This vertical-integration strategy challenges industry rivals and could reshape AI infrastructure dynamics.
Written by Ava Callegari
Saturday, October 4, 2025

In a bold pivot that could reshape the semiconductor industry, Microsoft is charting a course toward self-reliance in artificial intelligence hardware. The tech giant’s chief technology officer, Kevin Scott, revealed during a recent appearance at Italian Tech Week that the company aims to predominantly use its own custom-designed chips for powering AI workloads in its data centers. This move signals a strategic departure from heavy dependence on external suppliers like Nvidia and AMD, driven by the need to control costs, optimize performance, and streamline system integration.

Scott emphasized that Microsoft’s ambition isn’t rooted in ideology but in practical necessities. With the explosive growth of AI demanding unprecedented computational power, the company has been quietly developing its Azure Maia accelerators and Cobalt processors. These in-house solutions are tailored specifically for the demands of training and running large language models, potentially offering Microsoft greater flexibility in scaling its Azure cloud services.

continues at webpronews.com