To: voop who wrote (195608) 9/13/2025 12:14:24 PM From: Jim Mullens
Voop, re: AI on a laptop? It's more realistic than you think - a Register QCOM-sponsored article. It likens the rise of on-device AI to the shift from mainframes to PCs, and argues on-device AI promises the same evolution for AI computing.

Snip >>>>>>

Upendra Kulkarni, VP of product management at Qualcomm Technologies Inc., speaking about how the Snapdragon® X Series processors are a major catalyst for local on-device AI, thanks to their neural processing units (NPUs).

Snip >>>>>>

There's a shift happening in personal computing thanks to the rise of on-device AI. It's comparable to previous computing revolutions, including the cloud itself and mobile computing. Upendra Kulkarni, VP of product management at Qualcomm Technologies Inc., likens it to the shift he saw as PCs replaced mainframes. "We used to program on IBM 3090 and DEC VAX PDP-11 computers tucked away on a campus far away from you," he recalls. "You had in front of you a simple terminal, a display and a keyboard, and all the compute was somewhere else that you didn't see." Then came the PC, which localized a lot of the compute. The centralized computing resource didn't go away, but the PCs complemented it. On-device AI promises the same evolution for AI computing, he says.

CoPilot >>>>>>>>>>>>>>>>>

Q - Which company is leading on-device AI?

A - As of 2025, the company leading on-device AI, especially in terms of generative AI capabilities, developer access, and real-world deployment, is Qualcomm.

Why Qualcomm Leads On-Device AI

1. Hardware Dominance
Snapdragon chips (like the Snapdragon 8 Gen 3 and Snapdragon X Elite) deliver up to 45 TOPS of AI compute. Their integrated Neural Processing Units (NPUs) are optimized for running large language models (LLMs), vision models, and multimodal AI directly on-device.

2. GENIE Software Stack
Qualcomm's Gen AI Inference Extensions (GENIE) allow developers to run complex generative AI models locally. It supports models like Llama 2, Stable Diffusion, and multimodal assistants, without needing cloud infrastructure.

3. Developer Ecosystem
The Qualcomm AI Hub provides pre-optimized models, sample apps, and SDKs for easy deployment, plus full access to the AI Engine Direct SDK, enabling custom model execution on mobile, PC, and automotive platforms. (A minimal developer-workflow sketch follows the comparison table below.)

4. Real-World Deployment
The Samsung Galaxy S24 Ultra, Lenovo AI PCs, and upcoming automotive platforms already use Qualcomm's on-device Gen AI. These devices support features like real-time summarization, image generation, translation, and chatbot-style interactions, all running locally.

How Others Compare

Company  | On-Device AI Strengths                   | Limitations
Apple    | Efficient Neural Engine for system tasks | No support for custom Gen AI models
Google   | Tensor chips in Pixel phones             | Limited developer access to full Gen AI
MediaTek | AI enhancements in mid-range devices     | Less powerful NPU, no Gen AI support
NVIDIA   | Dominates cloud AI, emerging in edge AI  | Focused on data centers, not mobile
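For anyone wondering what that developer workflow actually looks like, here is a minimal sketch of pushing a model through Qualcomm AI Hub to be compiled and profiled for a Snapdragon device. It is a sketch under assumptions, not a tested recipe: it assumes the qai-hub Python client's submit_compile_job / submit_profile_job interface as shown in Qualcomm's public examples, and the toy model, device name, and input shape are illustrative placeholders rather than anything from the article.

```python
# Minimal sketch (untested): compiling and profiling a small PyTorch model for a
# Snapdragon NPU via Qualcomm AI Hub. Assumes `pip install qai-hub torch` and an
# API token configured beforehand (e.g. `qai-hub configure --api_token <token>`).
import torch
import qai_hub as hub

# Any traceable PyTorch model works; a tiny CNN stands in for a real network.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 8, kernel_size=3, padding=1),
    torch.nn.ReLU(),
    torch.nn.AdaptiveAvgPool2d(1),
    torch.nn.Flatten(),
    torch.nn.Linear(8, 10),
).eval()

example_input = torch.rand(1, 3, 224, 224)
traced = torch.jit.trace(model, example_input)

# Ask AI Hub to compile the traced model for a specific Snapdragon device.
# The device name is illustrative; hub.get_devices() lists what is available.
compile_job = hub.submit_compile_job(
    model=traced,
    device=hub.Device("Samsung Galaxy S24 (Family)"),
    input_specs={"image": (1, 3, 224, 224)},
)
target_model = compile_job.get_target_model()

# Optionally profile the compiled model on real hosted hardware to see latency
# and which layers landed on the NPU (per the AI Hub client docs).
profile_job = hub.submit_profile_job(
    model=target_model,
    device=hub.Device("Samsung Galaxy S24 (Family)"),
)
print(profile_job.download_profile())
```

The point of the workflow is that the heavy lifting (quantization and mapping operators onto the NPU) happens in the Hub's toolchain; the developer submits an ordinary PyTorch or ONNX model and gets back a device-ready artifact.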
Verdict

Qualcomm is the clear leader in on-device AI for 2025 - not just in hardware performance, but in enabling real generative AI experiences across mobile, PC, and embedded platforms. It's the only company offering a full-stack solution that developers can use today to deploy LLMs and multimodal models locally.
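On the inference side of that "deploy locally" claim, a common route on Snapdragon X Elite laptops is ONNX Runtime with the QNN execution provider, which dispatches supported operators to the Hexagon NPU and falls back to the CPU for the rest. Again a hedged sketch: it assumes the onnxruntime-qnn package on a Windows-on-Snapdragon machine and an already-exported, quantized "model.onnx"; the file name and input handling are placeholders, not a recipe from the article.

```python
# Minimal sketch (assumptions noted): running an already-exported ONNX model on
# the Snapdragon NPU through ONNX Runtime's QNN execution provider.
# Assumes `pip install onnxruntime-qnn` on a Windows-on-Snapdragon machine and a
# quantized "model.onnx" in the working directory (both placeholders).
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "model.onnx",
    providers=["QNNExecutionProvider", "CPUExecutionProvider"],
    provider_options=[
        {"backend_path": "QnnHtp.dll"},  # HTP backend, i.e. the Hexagon NPU
        {},                              # CPU fallback needs no options
    ],
)

# Feed a dummy float32 input matching the model's first input; a real app would
# pass tokenized text, an image tensor, etc. Dynamic dims are pinned to 1 here.
input_meta = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in input_meta.shape]
dummy = np.random.rand(*shape).astype(np.float32)

outputs = session.run(None, {input_meta.name: dummy})
print([o.shape for o in outputs])
```

If the QNN provider cannot place an operator on the NPU, ONNX Runtime silently runs it on the CPU instead, which is why quantized, NPU-friendly model exports matter for the battery-life and latency numbers these AI PC pitches lean on.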