Technology Stocks : Qualcomm Moderated Thread - please read rules before posting
To: QCOM_HYPE_TRAIN who wrote (197597) | 2/6/2026 11:13:56 AM
From: Wildbiftek | 11 Recommendations

I think the ARM interference and subsequent lawsuit were a pretty major roadblock to most of the data center ambitions. It was likely the result of Qualcomm's lobbying against nVidia's acquisition of ARM: the unjustified ARM attempt to block Qualcomm's advance both wasted engineering time and lowered customer engagement with Qualcomm's product lines.

As for getting the AI200 out in 2024: nVidia's major advantage at that time was in software rather than hardware, and Amon's point at a prior conference was that the AI100 was about getting the software stack right before scaling out with the AI200 and 250.

The major shift was the mass industry adoption of Transformer models over the more traditional CNNs after OpenAI's coup in '22; the corresponding shifts in the optimizations needed for both software and hardware made the AI100 less competitive. Arguably, nVidia missed this internally, as their DLSS models only shifted to Transformers in 2025. Their well-used CUDA stack, however, made them quite sticky externally for training, and they were able to ride that software moat up alongside the hype despite the unsuitability of their hardware for inference.
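To make the "unsuitable for inference" point concrete, here is a rough back-of-envelope sketch (my own illustrative numbers, not anything from Qualcomm or nVidia) of why batch-1 Transformer decoding tends to be memory-bandwidth-bound while CNN inference tends to be compute-bound, which is the kind of optimization-target shift being described:

```python
# Back-of-envelope arithmetic-intensity comparison (hypothetical, illustrative
# numbers): Transformer decode vs. CNN inference.

def arithmetic_intensity(flops, bytes_moved):
    """FLOPs performed per byte of memory traffic."""
    return flops / bytes_moved

# --- Transformer decode step (one new token), dense model, FP16 weights ---
params = 70e9                      # assumed 70B-parameter model
flops_per_token = 2 * params       # ~2 FLOPs per parameter per generated token
bytes_per_token = 2 * params       # every FP16 weight is re-read each step (KV cache ignored)
ai_transformer = arithmetic_intensity(flops_per_token, bytes_per_token)

# --- CNN forward pass (ResNet-50-class model), FP16 weights ---
cnn_flops = 8e9                    # ~4 GMACs ~= 8 GFLOPs per image (typical published figure)
cnn_bytes = 2 * 25e6               # ~25M weights read once, FP16
ai_cnn = arithmetic_intensity(cnn_flops, cnn_bytes)

print(f"Transformer decode: ~{ai_transformer:.1f} FLOPs/byte")   # ~1
print(f"CNN inference:      ~{ai_cnn:.0f} FLOPs/byte")           # ~160

# An accelerator whose compute-to-bandwidth ratio was tuned for CNN-era
# workloads (high FLOPs/byte) sits mostly idle on low-intensity Transformer
# decoding, so the hardware and software optimization targets change.
```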

So forward guidance is affected by the ARM interference and by rapid shifts in fundamental language-model architecture. Memory is being diverted to data centers, but arguably for projects that are unsustainable energy-, revenue-, and privacy-wise (cf. nVidia's flip-flop on OpenAI's $100B commitment).

It's easy to say this is obvious in hindsight, but I don't think it was obvious at the time, in the fog of war, given the pressures they were under when certain projects were cancelled and others were interfered with. I still think Qualcomm holds the cards in edge / IoT and (ultimately) data-center inference, and they've made the right investments and design decisions here.