

To: Jim Mullens who wrote (196805)12/3/2025 11:34:05 AM
From: Jim Mullens
 
Qualcomm Ignites AI Data Center Race with New Chips, Stock Soars

By: MarketMinute

October 27, 2025 at 11:11 AM EDT

Qualcomm ( NASDAQ: QCOM) has made a definitive and impactful entry into the burgeoning artificial intelligence (AI) data center market, unveiling its next-generation AI inference-optimized solutions that sent its stock soaring by nearly 20% on October 27, 2025. This strategic pivot signals a robust challenge to established AI chip giants and marks a significant diversification beyond its traditional smartphone dominance, positioning the company as a formidable player in the rapidly expanding AI infrastructure sector. The market's enthusiastic reaction underscores investor confidence in Qualcomm's ability to capture a substantial share of this critical and lucrative new frontier.

The announcement, which saw Qualcomm's shares close up 19.2% at $201.41, comes as the company officially introduced its AI200 and AI250 chip-based accelerator cards and racks. These cutting-edge solutions are engineered to accelerate large language models (LLMs) and multimodal models (LMMs) within enterprise and cloud environments, specifically targeting the computationally intensive AI inference workloads. The move is a clear statement of intent, indicating Qualcomm's ambition to be a leader in the infrastructure powering the global AI revolution.

Qualcomm's Bold Leap into AI Infrastructure

Qualcomm's foray into the AI data center market is meticulously planned, with a comprehensive product roadmap and strategic partnerships underpinning its ambitious goals. On October 27, 2025, the company pulled back the curtain on its new Qualcomm AI200 and Qualcomm AI250 chips. The AI200 is slated for commercial availability in 2026, while the more advanced AI250 is expected to follow in 2027. The AI250, in particular, boasts an innovative near-memory computing architecture, which Qualcomm claims delivers over ten times the effective memory bandwidth of competing solutions, all while consuming significantly less power—a critical factor in data center operations.

These rack-scale solutions are designed for high performance per dollar per watt, aiming to provide a low total cost of ownership (TCO) for clients. They incorporate advanced features such as direct liquid cooling for thermal efficiency, PCIe for scale-up capabilities, and Ethernet for scale-out, alongside confidential computing for secure AI workloads. Power consumption at the rack level is optimized at approximately 160 kW. Crucially, Qualcomm is also providing a robust software stack, ensuring compatibility with major AI frameworks and offering one-click deployment for Hugging Face models, facilitating frictionless adoption for developers and enterprises. The company has committed to a multi-generation roadmap, promising annual releases of new data center AI inference products.
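
For context on what "one-click deployment for Hugging Face models" typically means in practice, below is a minimal, generic sketch of serving a Hugging Face model for inference with the open-source transformers library. Qualcomm has not published the API of its own software stack, so the model name and calls here are illustrative assumptions, not Qualcomm's actual tooling.

# Generic Hugging Face inference sketch (illustrative only; not Qualcomm's stack).
# Assumes: pip install transformers torch; the model ID is a placeholder open model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-0.5B-Instruct"  # placeholder model for illustration
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Explain why memory bandwidth matters for LLM inference."
inputs = tokenizer(prompt, return_tensors="pt")

# Token-by-token generation (decode) is the inference workload the AI200/AI250
# target; it is typically bound by memory bandwidth rather than raw compute.
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))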

A pivotal strategic partnership has also been forged with HUMAIN, an AI startup based in Saudi Arabia, which Qualcomm announced as its first customer for the new AI data center systems. HUMAIN plans to deploy a staggering 200 megawatts of computing power utilizing Qualcomm's AI systems, with initial deployments anticipated to commence in 2026. This collaboration builds upon a Memorandum of Understanding (MOU) signed in May 2025 during the Saudi-US Investment Forum, aimed at developing advanced AI data centers and hybrid cloud-to-edge AI services within Saudi Arabia and internationally. As part of this alliance, Qualcomm will also work with Saudi Arabia's Ministry of Communications and Information Technology to establish a design center, fostering the local semiconductor ecosystem and nurturing local engineering talent.

The market's immediate reaction was overwhelmingly positive, with Qualcomm's stock ( NASDAQ: QCOM) surging by as much as 19% in morning trading on the day of the announcement, ultimately closing up 19.2% and pushing its market capitalization to nearly $182 billion. This marked its strongest single-day performance of 2025, reflecting significant investor optimism about the company's strategic expansion into the highly lucrative and rapidly growing AI infrastructure sector.

Shifting Sands: Winners and Losers in the AI Chip Arena

Qualcomm's ( NASDAQ: QCOM) assertive entry into the AI data center market undoubtedly positions it as a significant winner, at least in terms of market perception and immediate stock performance. The company's strategic diversification from its core smartphone business into high-growth areas like AI infrastructure is a critical move for its long-term health and growth trajectory. By offering solutions that promise high performance, energy efficiency, and a competitive TCO, Qualcomm aims to carve out a substantial niche and capture new revenue streams. The partnership with HUMAIN provides a strong initial deployment and validation for its new systems, establishing a foundational customer for its data center ambitions.

Conversely, established leaders in the AI chip space, particularly Nvidia ( NASDAQ: NVDA) and, to a lesser extent, AMD ( NASDAQ: AMD), face increased competitive pressure. Nvidia, which currently dominates the AI data center market with its GPUs, saw its shares reportedly experience a slight dip of 2% on the day of Qualcomm's announcement. This immediate market reaction highlights the perceived threat that Qualcomm's new offerings pose. While Nvidia's ecosystem and performance remain robust, Qualcomm's focus on AI inference with specialized, power-efficient chips could appeal to customers looking for alternatives or more optimized solutions for specific workloads, potentially eroding some of Nvidia's market share over time. AMD, while also a player in the data center CPU and GPU market, will similarly need to continue innovating to maintain its competitive edge against this new entrant.

Beyond the direct chip manufacturers, other companies within the data center ecosystem could experience ripple effects. Cloud service providers and enterprises deploying AI workloads might benefit from increased competition, leading to more diverse and potentially more cost-effective hardware options. This could spur innovation across the board, benefiting the broader AI industry by making advanced AI capabilities more accessible. Software developers and AI model providers could also gain from a wider array of optimized hardware, enabling them to deploy and scale their solutions more efficiently across different platforms.

The Broader Implications of Qualcomm's AI Push

Qualcomm's entry into the AI data center market is more than just a product launch; it represents a significant inflection point in the broader technology landscape. This event fits squarely into the overarching industry trend of diversification and specialization within the semiconductor sector. As AI becomes ubiquitous, companies are moving beyond general-purpose computing to develop highly optimized hardware tailored for specific AI workloads, such as inference. Qualcomm's strategy aligns with this by focusing on efficiency and performance for generative AI inference, aiming to differentiate itself from GPU-centric solutions.

The potential ripple effects on competitors are substantial. Qualcomm's emphasis on a comprehensive software stack and ease of deployment directly challenges the strong ecosystem built by Nvidia. This could force Nvidia to further accelerate its own innovation in inference-specific hardware and software, potentially benefiting customers with faster technological advancements. For partners, such as cloud providers and system integrators, Qualcomm's new offerings present opportunities to build new AI infrastructure solutions, potentially fostering new collaborations and market segments. The establishment of a design center in Saudi Arabia, driven by the HUMAIN partnership, also underscores the geopolitical and economic significance of AI infrastructure, highlighting the global race to build AI capabilities.

Historically, the semiconductor industry has seen cycles of dominance and challenge. Just as Intel once dominated the PC processor market before AMD's resurgence, and Nvidia established its lead in GPUs, new architectures and specialized solutions consistently emerge to disrupt the status quo. Qualcomm's move can be compared to Intel's ( NASDAQ: INTC) past attempts to enter mobile, or AMD's strategic focus on server CPUs to challenge Intel. This event signals a maturation of the AI hardware market, moving beyond early-stage GPU-only solutions towards a more diverse and competitive landscape where specialized silicon for inference, combined with a robust software ecosystem, will be key differentiators. Regulatory or policy implications could arise if one company gains too much dominance, but for now, increased competition is likely to be viewed positively by regulators as it fosters innovation and potentially lowers costs for consumers of AI services.

What Comes Next: The Road Ahead for Qualcomm and AI

The immediate future for Qualcomm's ( NASDAQ: QCOM) AI data center ambitions will revolve around the successful commercialization and deployment of its AI200 and AI250 chips. With the AI200 slated for 2026 and the AI250 for 2027, the company will need to demonstrate flawless execution in manufacturing, supply chain management, and customer support. The initial deployment with HUMAIN in Saudi Arabia, expected to commence in 2026, will serve as a crucial real-world test and a showcase for the capabilities of Qualcomm's new systems. Successful deployments will be vital for building credibility and attracting more customers.

In the short term, investors will be closely watching for further customer announcements, progress reports on the HUMAIN deployment, and any updates on the performance benchmarks of the AI200 and AI250 in comparison to rival solutions. Qualcomm's multi-generation roadmap, promising annual releases, indicates a sustained commitment to this market, suggesting a continuous stream of innovation and product enhancements. This will necessitate ongoing investment in R&D and strategic talent acquisition to maintain a competitive edge.

Long-term possibilities include Qualcomm establishing itself as a dominant force in AI inference, potentially leading to a more balanced market share distribution among AI chip providers. This could open up new strategic pivots, such as expanding into training workloads or developing more integrated AI-on-chip solutions that span from edge devices to the cloud. Market opportunities will emerge for ecosystem partners, including software developers, system integrators, and data center operators, who can leverage Qualcomm's new hardware. Challenges will include intense competition, the rapid pace of AI innovation requiring constant adaptation, and potential supply chain complexities. Potential scenarios range from Qualcomm becoming a major, enduring player in AI data centers to facing significant headwinds if its solutions fail to gain widespread adoption or if competitors innovate faster.

A New Era in AI: Qualcomm's Enduring Impact

Qualcomm's ( NASDAQ: QCOM) strategic entry into the AI data center market with its specialized AI200 and AI250 chips marks a pivotal moment, signaling a significant shift in the competitive landscape of AI infrastructure. The immediate and overwhelmingly positive reaction from the stock market, with shares surging nearly 20% on October 27, 2025, underscores a strong belief in Qualcomm's ability to diversify beyond its smartphone legacy and capture a substantial portion of the rapidly growing AI market. This move is not merely about new products; it's about a fundamental reorientation of Qualcomm's strategic priorities towards high-growth, high-value segments of the technology industry.

Moving forward, the market will be closely assessing Qualcomm's execution capabilities, particularly regarding the commercial availability of its new chips in 2026 and 2027, and the success of its foundational partnership with HUMAIN. The company's commitment to a robust software stack and a multi-generation product roadmap suggests a sustained and serious effort to challenge incumbents like Nvidia ( NASDAQ: NVDA) and AMD ( NASDAQ: AMD). This increased competition is ultimately beneficial for the wider AI industry, promising accelerated innovation, potentially lower costs, and a broader array of optimized hardware solutions for enterprises and cloud providers.

Investors should watch for key metrics such as initial customer adoption rates, the performance benchmarks of the AI200 and AI250 in real-world deployments, and any further strategic partnerships that Qualcomm might announce. The success of its design center initiative in Saudi Arabia will also be a telling indicator of its global expansion strategy. Qualcomm's bold pivot has set the stage for a new era of competition and innovation in AI data centers, and its lasting impact could reshape the future of artificial intelligence infrastructure for years to come.



To: Jim Mullens who wrote (196805)12/3/2025 12:37:26 PM
From: GR8FORM
 
Transcript: QUALCOMM Incorporated (QCOM) Presents at UBS Global Technology and AI Conference 2025
SA Transcripts
Okay. We're going to get started. Good afternoon. I'm Tim Arcuri. I'm the semiconductor and semi equipment analyst here at UBS. We're very pleased to have Qualcomm, we're very pleased to have Cristiano Amon from Qualcomm, who's the President and the CEO. So thank you, Cristiano.

Question-and-Answer Session

Timothy Arcuri
UBS Investment Bank, Research Division

Great. So let's start talking about the business that everyone wants to know about, which is your data center business.

Cristiano Amon
CEO, President & Director

I'm surprised.

Timothy Arcuri
UBS Investment Bank, Research Division

Yes. So you're attacking the low-power inference market. You announced the AI 200 and AI 250. You still haven't told us very much about the specs and the road map. What can you say?

Cristiano Amon
CEO, President & Director

Okay. Look, maybe let me start at the very top and I'll walk through some of the details. We look at the market, we look at what's happening with AI. And I think, and I'm assuming it's the hope of everybody in the room, that eventually you go from training to very large-scale inference, and when you start doing inference, you're putting AI to work and you have a lot of customers.

And I think what's happening is we see this as one of the entry points for us. As AI goes into inference, you're going to be able to build large inference-focused clusters, and the data center is going to go to the next phase of disaggregation. You're going to have dedicated hardware tailored to inference. And we think that creates an opportunity for us to enter. We expect there's going to be competition; everybody is playing to win.

And at some point, I think tokens per dollar will matter, tokens per watt will matter. And we have an opportunity to come up with something that is very competitive for inference. The second data point, I think, driving the entry is how we see AI evolving.

Eventually, as you put this to work and you start to see the combination of mixture of experts, distillation, and chain-of-thought reasoning, you see that AI is really becoming mature, to the point that it's not far-fetched that you're going to have small appliances that could be running models with multiple hundreds of billions of parameters.

If anything, you look at the NVIDIA DGX -- that's kind of what it does, and that trend will continue. And I think you're going to end up in a situation where the architectures available for inference in the data center are going to be competing on efficiency. With those two things in mind, we looked at some of the assets that we can leverage in the company.

And we have two assets, and that's what we'll be focused on. One is we're doing a CPU, which is the head node of an inference cluster. And the other one, which is the largest opportunity, is leveraging our NPU architecture, which we believe has very high compute density -- a completely different approach to how we think about compute and memory together -- and developing a very efficient inference solution.

The good thing about Qualcomm is we don't have to get a lot. We only need to get a small portion of this very large TAM for it to be very significant for the company. We also like the fact that a lot of the market is concentrated; you have a few customers that buy at scale. The other thing is the market welcomes competition. Qualcomm is not a small semiconductor company. We can do things at scale. I think we have a proven track record of executing in a number of different industries at scale.

And I think the market has been very intrigued about what we're doing. We're doing something different; we're thinking about the next disaggregation of the data center, and we're excited about it. We will unveil details of the road map -- both the AI 200, the AI 250, and what we're going to be doing after the AI 250. We announced it a little bit prematurely because we had a customer who had to announce it.

I think the first customer that we have is 200 megawatts of data center with the Saudi national AI company. We're very pleased with the progress they made with the license and everything that's moving to execution. We are in conversations, as you would imagine, with all the hyperscalers. And we're very pleased, I think, with the feedback we're getting so far.

Timothy Arcuri
UBS Investment Bank, Research Division

Great. Is it fair to characterize it so that the 200 seems more evolutionary from what you currently have, and the 250 seems more bottom-up, purpose-built, and that seems to be ramping more in [ 2020 ]?

Cristiano Amon
CEO, President & Director

100% correct.

Timothy Arcuri
UBS Investment Bank, Research Division

Is that fair? And then I think you're trying to attack the decode portion of inference workloads. When I hear that, it sounds a lot like what NVIDIA is doing with CPX, Rubin CPX. Is there really a window of opportunity for you? Or is the tide just rising so fast that you think you can get some of that market as well?

Cristiano Amon
CEO, President & Director

Look, I think -- like I said -- the interesting thing about inference clusters is you end up with this disaggregation, and that's what's happening with decode. In some of the discussions we have had, it has been fascinating to see how much more performance you get from how you manage cooling, or how even a few percentage points of increase in performance changes, I think, the total TCO.

So I think there's definitely an opportunity for Qualcomm. It is driven by all those things. This is moving very fast. There is going to be competition. People need more compute than they can deploy. I think the demand is real. So any improvement makes a lot of sense.

And I think we have good technology. If you just look at our track record, all of the new things that Qualcomm has done, even things that were new, at scale -- our IP is a leading IP. So why would you not bet that Qualcomm can do a competitive solution? I think that's kind of the feedback we're getting.

Timothy Arcuri
UBS Investment Bank, Research Division

Great. And relative to your financial model, this seems like it could be pretty significantly incremental actually?

Cristiano Amon
CEO, President & Director

Absolutely.

Timothy Arcuri
UBS Investment Bank, Research Division

Your $22 billion for fiscal '29, that's non-handset. I mean this is a huge market. So you don't have to get a very big share of it to be pretty incremental to your financial models?

Cristiano Amon
CEO, President & Director

I agree. It's 100% incremental -- not modeled in our $22 billion of non-handset. Those are the other markets we've been executing on right now. And we did say in the last earnings call -- we historically have been very conservative with some of those assumptions -- that we feel comfortable pulling it in by one year from where we originally set it.

Timothy Arcuri
UBS Investment Bank, Research Division

And how does the Alphawave deal fit into the strategy here? Does it intersect the road map for 250, which ramps in 2028?

Cristiano Amon
CEO, President & Director

Yes. Alphawave provides, I think, important connectivity IP that allows us to really scale the solution. And they also have a team that has been doing custom SoCs for the data center. So it provides both the scale connectivity IP as well as additional resources to execute on the opportunities.

Timothy Arcuri
UBS Investment Bank, Research Division

Let's just talk about the adjacencies in general. These have been growing very, very nicely. They're up to 30% of revenue -- obviously more if we exclude Apple. Which of these efforts are you most excited about? There are a lot of submarkets within IoT, but between auto, IoT and PC, which of these are you most...

Cristiano Amon
CEO, President & Director

Look, we -- auto has been a great success story for the company. We continue to be bullish on it. I think when we look at the opportunities we have on the horizon, we see a lot of opportunities to even expand our design win pipeline, which is very robust. I think so far, you're seeing this pipeline converted into revenue. And I think we're tracking very well, but we see opportunities to expand the design win pipeline.

And that's happening because the same thing that we see on personal AI devices, which I'll talk about in a second -- we're going to see that happening in the digital cockpit of the car. A lot of GenAI and agentic experiences coming to the digital cockpit of the car are going to increase the value and the silicon opportunity there. The other thing is we're super pleased with the stack that we launched with BMW.

This is an OEM-friendly stack, three and a half years in the making. We have a lot of inbound from OEMs that are really interested now that they'll be able to see the KPIs once the cars are on the road. And I think that could create an expansion opportunity for the ADAS stack. And we like the ongoing transition to software-defined vehicles. I think the architecture of the car is moving more towards central computing.

So auto will continue to see growth of the design win pipeline, and we're very pleased. I think the Snapdragon Digital Chassis became like an industry platform. The second one is the one I mentioned briefly: personal AI devices. I think the best way to describe this -- and this is a topic we spend a lot of time thinking about -- is the evolution of mobile.

Phones will continue to stay on the trajectory they are on right now. They're going to require more and more AI compute. Phones are not going anywhere. Phones are like laptops -- laptops continued even after we all bought smartphones -- and phones will do a lot more processing. But there's a new class of devices that are going to drive agentic experiences, and they are going to be personal AI devices.

Glasses are the one that is most promising, but there are others. I think you see all of those companies that have models designing different types of devices, and the good thing is we're designing with all of them. And we think that's going to drive new agentic experiences that could be a significant opportunity, and it can grow dramatically. We talked about $2 billion in our $22 billion projection -- at the 2024 Investor Day we said that $2 billion would be XR devices.

We're well ahead of it when you think about what's happening with personal devices and glasses. I'm very bullish on that opportunity. And I think this is going to play out like this: the phone is at the center of our digital life right now, so all of the wearable devices have been built around the phone. They extend the functionality of the phone -- a watch gets the phone's sensor data and gives you back notifications. Actually, the first project we did with Meta was for you to do Instagram stories, extending the functionality of the camera.

It doesn't matter -- now the agent is at the center. And those devices get better every month; every month there are new use cases. We see customers, for example, in completely new markets -- we have a customer in India doing glasses. They integrate it with the national payment system: you can look at a QR code and you can pay a bill. And I think those new agentic experiences are going to develop this category. It can be very, very big, and it's going to change a lot of the relationships we're going to see in the mobile industry.

The phone will continue to do phone things. The phone will do more processing for those devices. But then those devices are going to be developed around the model. Once you connect to a model, it doesn't matter how you connect to the model. The model will understand the human's intentions and take action. And I think that's going to drive a combination of connectivity and processing in those new devices, and it may change the dynamic of the industry.

And here I'm going to provide an opinion. Humans have already decided what they're going to wear. Humans will wear bracelets, watches, jewelry pendants, glasses -- and if the model understands what we see, what we hear, what we say, it's closer to our senses. That's why glasses are very natural: you turn your head and the camera is seeing what you're seeing, and it's close to your mouth, close to your ears. And this is going to be a combination of technology and fashion -- different countries, different regions, different brands -- that is more conducive to a horizontal platform than to a vertical platform.

So I think we're very optimistic about what Google is doing with Gemini, what Meta is doing, and what OpenAI is doing. And this could be an interesting new category. So that's the second one, and I'm excited about it. And then the third one: we recently closed the acquisition of Arduino. We've been saying that we've been building a platform for industrial.

I think that's the ability to bring high-performance compute and AI to industrial, replacing microcontrollers. We were building a development platform -- Arduino and Edge Impulse are big drivers for that -- and it just builds more confidence in our $22 billion revenue.

Timothy Arcuri
UBS Investment Bank, Research Division

And so in that example, the model started on the phone. In your world, is the model on your phone?

Cristiano Amon
CEO, President & Director

No. See, I know that there are a lot of questions like this, so allow me to be, I think, very precise. Before, we used to get a lot of questions about this: is it cloud or is it edge? Now we get a lot of questions: is the model running on the phone, is the model running on the glass, where is the compute? Let me take a step back.

The smartphone today is the most cloud-connected device in the world. If you put your phone in airplane mode, you're not going to use it. You're going to be very frustrated. You're just going to put it back in your bag or your pocket. But having said that, there's a lot of processing that happens on the phone. I think you need to be thinking about this a little bit differently.

First of all, the answer about models: the models that understand what we see, what we hear and what we say are located on the device, because latency is unforgiving. So anything that has to do with the user interface is located on the device. We announced the first chip in a glasses frame that runs a 1 billion parameter model. The next chip is going to do much more. Every company -- for things like voice to text, the ability to quickly annotate an image and do things -- they're asking us, "I need to have that on the device."

At the same time, those devices are going to need a lot of connectivity, because a lot of those things are going to be happening in the cloud. The other thing you're going to see is, with mixture of experts and chain-of-thought reasoning, you have smaller models doing certain things and then the bigger model doing some other things. And you can easily see how this plays out -- whether you're looking at a QR code, for example, looking at an image, looking at a person -- and how this thing is going to transact.

So at the end of the day, I will go ahead and make a statement: the work product of the foundational model companies right now is being designed in a way that has a cloud component and an edge component. If you look at the road map of Google Gemini, you see that. You saw that with the open-weight OSS model from OpenAI.

So I think that's how the models are being designed. It's the evolution of computing, and what you're going to see is a combination, like on a phone: certain functions are going to run on the glasses, on the watch, on the phone, and certain functions are going to run in the cloud. And the last data point: you're going to see from us a big change in the computing architecture for the smartphone.

We've been working on it for a couple of years. We're going to announce it, I think, as we head into 2026, maybe in our regular announcements in the second half of the year. But there's a new architecture. One thing that is happening is that context is super important for the agentic experience. And even when you think about advertisement -- if you remember, I'll cite when there was this incident with Meta, where they lost access to some information on the iPhone and they used a lot of AI to compensate for it.

So when you think about the context that happens around you, that becomes incredibly important for an agent, incredibly important for you to create a personal graph. So we're seeing a lot of demand for models that run on the phone, and they run in a very pervasive way at very high performance and very low power, just to get context -- just to basically get sensory data.

And then I think a lot of people want to do that on the phone because that's how it's going to scale and the phone in real time has real-time knowledge about what's happening -- that's also true for the devices. So that's another change of computing that we're going to see unfold. Sorry for the long answer.

Timothy Arcuri
UBS Investment Bank, Research Division

No, it's great. So in all those cases, how many R&D dollars -- how much can you repurpose versus how much do you have to spend from an incremental perspective, because all those markets seem to be peripherals of what you've already built, of your existing sunk costs?

Cristiano Amon
CEO, President & Director

Look, if you look at the company's financial results, with the number of bets that we're doing in parallel -- we are in the phone business and we are like clockwork. We execute on our Snapdragon premium tier every year, plus other Snapdragons and the rest of the road map. And we have maintained the performance leadership.

And now on top of that, we added our own CPU. We don't license anymore; we design our own CPU. On top of that, we're in the PC market, we're in the broadband market, we're in the industrial market right now, automotive including the stack, and you now have the data center and robotics. Those are all the things we're doing.

And if you look at the financial performance of the company, we have been increasing efficiency a lot, if you look at the number of bets that we're making. And that is because we leverage a lot of IP, and we create an ability to scale our IP from 5 watts to 500 watts.

We're probably one of the few companies that have the engineering capability -- if you think about how broad our portfolio is -- to scale from 5 watts to 500 watts. I think that muscle we developed in the company, as we set ourselves by necessity to diversify, is really helping us and enabling us to do more things that become efficient and accretive to the results of overall Qualcomm.

Timothy Arcuri
UBS Investment Bank, Research Division

Let's shift to the handset business. So you had a great quarter last quarter. You're doing very well in handsets, especially in Android. If you exclude Apple, your trailing 12-month handset revenue is up nearly 10% in a market that's basically flat. How are you doing this? And how long can you keep doing that?

Cristiano Amon
CEO, President & Director

Okay. This is an important thing that we have been seeing play out for several years now, quarter after quarter after quarter. Actually, most people don't realize the phone market right now is still smaller than it was before COVID. Still smaller. We have not yet recovered, in units, to the size the phone market was before the pandemic.

But how are we growing, and continuing to grow, in a relatively flat market? We have seen this trend that has been very consistent in the phone industry, especially as you get fully penetrated: the premium and the high tier expand. If you go to the United States market today, there are only two phones to buy. You have an iPhone or you have a Galaxy phone -- that's two premium phones -- or you go to Walmart and you're buying a prepaid phone. There's nothing in the middle, right?

And I think China is now becoming like that. You have an expansion of the premium and the high tier, you have a low-end market, and the middle is contracting. If you remember, back when I became CEO in 2021, I said our strategy in phones is going to be very simple.

We're going to go after share of wallet, we're going to concentrate on technology leadership in the premium and high tiers on Android, and we significantly increased the operating margin of the business, because the premium tier is more resilient, I think, to the cost of technology -- it wants more compute, it wants more performance -- and it's been expanding.

Even markets that are -- like India, which has been historically very price sensitive, we see an expansion of the premium tier that is happening across the board.

Timothy Arcuri
UBS Investment Bank, Research Division

I wanted to ask about India, since you mentioned it -- can we actually talk about it? It seems like China maybe 10 to 15 years ago. How much of a driver of your entire business can India be?

Cristiano Amon
CEO, President & Director

Look, we see continued healthy growth for that market across multiple categories. We're designing with all the Indian automotive companies. Snapdragon continues to gain share -- we're very happy about our Snapdragon brand position in India. It's probably as high as in China right now on consumer preference. We see a healthy expansion of the premium tier.

And here's how you need to think about it. I may give numbers that are not entirely precise, but they're directionally correct. The phone market today: about 1.2 billion phones get purchased every year. 200 million are iPhones, 1 billion are Android.

And a market like India -- your comparison is correct -- it's a very large market. So as that Android market starts to move towards high-tier or premium phones -- that's the most important electronics purchase, I think -- that continues to drive, quarter after quarter, year over year, a positive outcome for our handset business.

Timothy Arcuri
UBS Investment Bank, Research Division

Great. And then can we talk about -- there's kind of a strange dynamic where some of the major Android OEMs have their own internal efforts for modems, and yet these same customers seem to be more dependent on you than ever. Samsung -- we used to say 50% baseline share, now it's more like 75% baseline share. Xiaomi -- you just signed a big deal with them, yet they too have their own internal efforts. How do you look at that?

Cristiano Amon
CEO, President & Director

Look, first of all, Tim -- and by the way, I don't mind, we love doing this -- but I've been answering this question for about 25 years. I've been 30 years at Qualcomm; 25 years ago, I remember, I was managing CDMA and people said our customer just designed their own CDMA chip. And I think historically, there has been an ongoing thing in the industry.

I think the answer to that question is you need the scale. You need to have the ability to make technology transitions very fast. Every year, we have a new Snapdragon 8 that's kind of designed to set the pace of performance. It's a very competitive market. Maturity of silicon design is very important.

The mobile market is super unforgiving, because what happens is you design on a brand-new process node, with brand-new IP across your CPU and GPU, you have to bring up a whole new technology, you have to ramp from 0 to $100 million in two quarters, and you have to make the selling season. Then you kind of tail off and you prepare to ramp the next one, on an annual cadence.

Look, we're called Qualcomm -- it stands for quality communications. Some of the things we had to develop as part of being in this business -- when some of the other semiconductor companies find out about them, they think we're crazy. Sometimes we go into mass production on a design before we actually get our chip back. We haven't even gotten a chip back; we don't know if it has any bugs. We do mass production because of the speed of mobile. So I think that has maintained our ability to stay in front, to continue to have the designs that matter for our customers, and to have leadership in IP.

And there's another thing that most people don't think about with Qualcomm. We ourselves are very focused on the speed of our CPU or NPU or GPU or modem, but the Snapdragon brand is very powerful. And I'll point you to one thing. We had a Snapdragon Summit -- we do it every year when we launch -- half a billion views and impressions in the media. We had a simultaneous event in China.

Every single one of our customers decided to launch their phone at our event at the same time. If you look at some of their advertisements, you see a bigger Snapdragon than you see a picture of their phone. And, you know, through that process and brand recognition, we're 39 on the Interbrand 100. So Snapdragon also has a big community effect that continues to drive our customers to put Snapdragon in their flagships.

Timothy Arcuri
UBS Investment Bank, Research Division

Great. Let's talk for a moment about Apple. You've been very clear about how that's going to come out of your business -- the baseline being that the fall launch this year, or sorry, next fall, will be 20%, and it's fully out in the fall '27 launch. However, the internal modem is not selling as well, maybe, as they hoped, and there's some talk that they're kind of coming back to you again, hat in hand, maybe wanting to extend a little bit longer. Can you just talk to all that?

Cristiano Amon
CEO, President & Director

Look, the first and most important thing in this conversation: we have been planning our business assuming that this customer is going to go away in '27. That's our planning assumption. That's how we've been managing the company. That's how we're accelerating our diversification. We have provided metrics, which we're very pleased with, on how the non-Apple business in the company is growing. It shows that our strategy is working, our engine is working, and we're going to keep doing that.

Having said that, look -- we are a high-quality provider of modems to them. It is a very reliable modem, and they can continue to use our modem as long as they want to use our modem. We're just planning our business on the assumptions that we told you, and that's how we're marching forward.

Timothy Arcuri
UBS Investment Bank, Research Division

Got it. And maybe I'll ask you the question that is still a debate. How do you think about licensing when they no longer need your modem? How do you think about the precedent that they will continue to pay a royalty, and the precedent that they could say, well, Huawei is not actually paying a royalty -- now by then they might be, but today they're not.

Cristiano Amon
CEO, President & Director

Look, the licensing business, the oldest business of Qualcomm, has probably been battle-hardened by at least four generations of wireless. And look, we have one of the largest IP portfolios in the world. We were the #1 company across all American companies in patent applications in 2024 -- let's see, '25 hasn't closed yet.

I think we have a very strong patent portfolio, not only in cellular -- we have that in video and WiFi. I think our model has been proven; it's been tested by every government and every regulatory agency. We battle-tested that with Apple.

So look, we're confident in our position, like with every other licensee. I'll point you to this -- I know you mentioned Huawei, but every other Chinese company has renewed their agreements. I remind everyone that we're licensed with Samsung, including 6G. So we think that's a very stable business, and we expect it to continue to be that way.

Timothy Arcuri
UBS Investment Bank, Research Division

Great. Well, thank you for the time. We're out of time. Thank you.

Cristiano Amon
CEO, President & Director

Thank you so much. Great talking to you. Thank you.

Sent from my iPhone