Strategies & Market Trends : Technical analysis for shorts & longs


To: Johnny Canuck who wrote (67260)10/27/2025 1:58:50 PM
From: Johnny Canuck  Read Replies (1) | Respond to of 67444
 
Tech
Qualcomm announces AI chips to compete with AMD and Nvidia — stock soars 15%
Published Mon, Oct 27 2025 10:00 AM EDT | Updated 3 Hours Ago


Kif Leswing @kifleswing

Key Points

  • Qualcomm announced that it will release new AI accelerator chips.
  • Nvidia has dominated the market for AI chips, with AMD seen as the second-place player.
  • Google, Amazon, Microsoft and OpenAI are also developing their own AI accelerators for their cloud services.


[Video: Qualcomm announces new data center AI chips to target AI inference]

Qualcomm announced Monday that it will release new artificial intelligence accelerator chips, marking new competition for Nvidia, which has so far dominated the market for AI semiconductors.

The stock soared 15% following the news.

The AI chips mark a shift for Qualcomm, which has thus far focused on semiconductors for wireless connectivity and mobile devices, not massive data centers.

Qualcomm said that both the AI200, which will go on sale in 2026, and the AI250, planned for 2027, can come in a system that fills up a full, liquid-cooled server rack.

Qualcomm is matching Nvidia and AMD, which offer their graphics processing units, or GPUs, in full-rack systems that allow as many as 72 chips to act as one computer. AI labs need that computing power to run the most advanced models.

Qualcomm’s data center chips are based on the AI parts in Qualcomm’s smartphone chips called Hexagon neural processing units, or NPUs.

“We first wanted to prove ourselves in other domains, and once we built our strength over there, it was pretty easy for us to go up a notch into the data center level,” Durga Malladi, Qualcomm’s general manager for data center and edge, said on a call with reporters last week.

The entry of Qualcomm into the data center world marks new competition in the fastest-growing market in technology: equipment for new AI-focused server farms.

Nearly $6.7 trillion in capital expenditures will be spent on data centers through 2030, with the majority going to systems based around AI chips, according to a McKinsey estimate.


The industry has been dominated by Nvidia, whose GPUs have over 90% of the market so far and sales of which have driven the company to a market cap of over $4.5 trillion. Nvidia’s chips were used to train OpenAI’s GPTs, the large language models used in ChatGPT.

But companies such as OpenAI have been looking for alternatives, and earlier this month the startup announced plans to buy chips from the second-place GPU maker, AMD, and potentially take a stake in the company. Other companies, such as Google, Amazon and Microsoft, are also developing their own AI accelerators for their cloud services.

Qualcomm said its chips are focusing on inference, or running AI models, instead of training, which is how labs such as OpenAI create new AI capabilities by processing terabytes of data.

The chipmaker said that its rack-scale systems would ultimately cost less to operate for customers such as cloud service providers, and that a rack uses 160 kilowatts, which is comparable to the high power draw from some Nvidia GPU racks.

Malladi said Qualcomm would also sell its AI chips and other parts separately, especially for clients such as hyperscalers that prefer to design their own racks. He said other AI chip companies, such as Nvidia or AMD, could even become clients for some of Qualcomm’s data center parts, such as its central processing unit, or CPU.

“What we have tried to do is make sure that our customers are in a position to either take all of it or say, ‘I’m going to mix and match,’” Malladi said.

The company declined to comment on the price of the chips, cards or racks, or on how many NPUs could be installed in a single rack. In May, Qualcomm announced a partnership with Saudi Arabia's Humain to supply data centers in the region with AI inferencing chips; Humain will be Qualcomm's customer, committing to deploy systems drawing as much as 200 megawatts of power.
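The cited figures allow a rough sizing estimate: at the stated 160 kilowatts per rack, a 200-megawatt commitment works out to roughly 1,250 racks. A minimal back-of-the-envelope sketch, using only the article's two power figures (the calculation is illustrative and ignores cooling and facility overhead, which reduce the usable count in practice):

```python
# Back-of-the-envelope: how many 160 kW racks fit under a 200 MW commitment?
RACK_POWER_KW = 160    # per-rack draw cited for Qualcomm's rack-scale systems
DEPLOYMENT_MW = 200    # Humain deployment commitment cited in the article

deployment_kw = DEPLOYMENT_MW * 1_000
max_racks = deployment_kw // RACK_POWER_KW
print(max_racks)  # 1250
```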

Qualcomm said its AI chips have advantages over other accelerators in terms of power consumption, cost of ownership, and a new approach to the way memory is handled. It said its AI cards support 768 gigabytes of memory, which is higher than offerings from Nvidia and AMD.

[Image: Qualcomm’s design for an AI server called the AI200. Source: Qualcomm]

Qualcomm Inc (QCOM:NASDAQ): 189.77, +20.83 (+12.33%) as of 1:58 PM EDT

© 2025 CNBC LLC. All Rights Reserved. A Division of NBCUniversal




To: Johnny Canuck who wrote (67260)10/27/2025 2:01:47 PM
From: E_K_S  Read Replies (3) | Respond to of 67444
 
I still hold a chunk of LUMN in the ROTH from 2022 buys, average cost $11.24. I owned it for the dividend, but that was stopped in November 2022.