Technology Stocks : Qualcomm Moderated Thread - please read rules before posting

To: Bill Wolf, who wrote (196175), 10/27/2025 10:23:37 AM
From: Bill Wolf (5 Recommendations)

Recommended By
Art Bechhoefer
GR8FORM
manning18
matherandlowell
VinnieBagOfDonuts

Qualcomm Debuts Chip to Rival Nvidia in AI Accelerator Market
By Ian King
October 27, 2025 at 10:00 AM EDT
Updated on October 27, 2025 at 10:15 AM EDT

Qualcomm Inc., the top maker of mobile phone processors, is entering the lucrative AI data center market with new chips and computers, aiming to challenge Nvidia Corp. in the fastest-growing part of the industry.

The AI200 will start shipping next year and be offered as a standalone component, as cards that can be added to existing machines, or as part of a full rack of servers provided by Qualcomm. The first customer for the offerings will be Saudi Arabia’s AI startup Humain, which plans to deploy 200 megawatts of computing based on the new chips starting in 2026.

Those debut products will be followed by the AI250 in 2027, the San Diego company said Monday. If supplied only as a chip, the component could work inside gear that’s based on processors from Nvidia or other rivals. As a full server, it will compete with offerings from those chipmakers.

Shares of Qualcomm jumped as much as 15% in New York, their biggest intraday gain in more than six months.

Qualcomm is trying to break into an area that’s reshaped the semiconductor industry, with hundreds of billions of dollars being spent on data centers to power artificial intelligence software and services. The company said new memory-related capabilities and the power efficiency of designs that owe their roots to mobile device technology will attract customers, despite Qualcomm’s relatively late entry.

The new offerings are built around a neural processing unit, a type of chip that debuted in smartphones and is designed to speed up AI-related workloads without killing battery life. That capability has been developed further through Qualcomm’s move into laptop chips and has now been scaled up for use in the most powerful computers.

Under Chief Executive Officer Cristiano Amon, Qualcomm is trying to diversify away from its dependence on smartphones, which are no longer increasing sales as quickly as they once did. The company has branched out into chips for cars and PCs, but is only now offering a product in what’s become the biggest single market for processors.

Qualcomm has been “quiet in this space, taking its time and building its strength,” according to Durga Malladi, a company senior vice president. Qualcomm is in talks with all of the biggest buyers of such chips on deploying server racks based on its hardware, he said.

Winning orders from companies such as Microsoft Corp., Amazon.com Inc. and Meta Platforms Inc. would offer a significant new revenue source for Qualcomm. The company has posted solid, profitable growth over the last two years, but investors have favored other tech stocks. Qualcomm shares have gained 10% in 2025, lagging behind a 40% surge by the Philadelphia Stock Exchange Semiconductor Index.

Nvidia, which remains atop the AI computing world, is on target to generate more than $180 billion in revenue from its data center unit this year, more than any other chipmaker – including Qualcomm – will get in total, according to estimates.

Qualcomm’s new chips will offer an unprecedented amount of memory, according to Malladi. They’ll have as much as 768 gigabytes of low-power dynamic random access memory.

Memory access speed and capacity are crucial to the rate at which the chips can make sense of the mountain of data handled in AI work. The new chips and computers are aimed at inference work, the computing behind running AI services once the large language models that underpin the software have been trained.
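As a rough back-of-the-envelope illustration (not from the article), the memory needed just to hold a model's weights for inference scales with parameter count times bytes per parameter, which shows why a 768-gigabyte capacity matters. The model sizes and precisions below are hypothetical examples, not chips or workloads Qualcomm has named:

```python
def model_memory_gb(n_params_billion, bytes_per_param):
    """Approximate weight footprint in gigabytes (1 GB = 1e9 bytes).

    Ignores KV-cache and activation memory, which add further overhead
    during inference.
    """
    return n_params_billion * 1e9 * bytes_per_param / 1e9

# Hypothetical model sizes; 768 GB is the per-card capacity cited above.
for params_b, bits in [(70, 16), (70, 8), (405, 16), (405, 8)]:
    gb = model_memory_gb(params_b, bits / 8)
    verdict = "fits" if gb <= 768 else "does not fit"
    print(f"{params_b}B params @ {bits}-bit: {gb:.0f} GB -> {verdict} in 768 GB")
```

Under these assumptions, a 405-billion-parameter model does not fit in 768 GB at 16-bit precision but does at 8-bit, which is one reason quantization and large on-card memory both matter for inference economics.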

(Updates with share reaction in the fourth paragraph.)

bloomberg.com