Technology Stocks : Qualcomm Moderated Thread - please read rules before posting

Recommended by:
Dr. John
To: engineer who wrote (195769)9/26/2025 10:04:11 AM
From: Piece of beef | 1 Recommendation | of 196721
 
Could Qualcomm be relying on Microsoft to provide this?

From Perplexity...

Yes, Microsoft provides .NET libraries that let developers integrate AI capabilities into their applications and take advantage of on-device AI hardware such as the NPUs in Snapdragon chips, especially on Windows devices.

Microsoft.Extensions.AI Libraries and Semantic Kernel

Microsoft has released the Microsoft.Extensions.AI set of libraries, designed to simplify and standardize the integration of AI services (large and small language models, embeddings, and other machine-learning features) into .NET applications. These abstractions are provider-agnostic and work in any .NET application, so the same code can run against models that leverage whatever hardware acceleration is available, including what Snapdragon chips offer on Windows.
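As a rough sketch of what those abstractions look like in application code, the snippet below uses the `IChatClient` interface from the Microsoft.Extensions.AI NuGet package. The factory method `GetPlatformChatClient` is hypothetical; in practice you would plug in a concrete provider package (a hosted model, or a local model backed by on-device acceleration), and exact method names have shifted across the library's preview releases:

```csharp
using Microsoft.Extensions.AI;

// Any provider implements IChatClient, so application code stays provider-agnostic.
// GetPlatformChatClient() is a hypothetical factory standing in for whatever
// concrete provider you register (cloud-hosted or local/NPU-accelerated).
IChatClient client = GetPlatformChatClient();

// The same call works regardless of which provider is behind the interface.
var response = await client.GetResponseAsync(
    "Summarize this document in one sentence.");

Console.WriteLine(response.Text);
```

The point of the design is that swapping providers (cloud model today, on-device Snapdragon-accelerated model tomorrow) does not require changing this calling code.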

For higher-level orchestration and advanced AI workflows in .NET, the Semantic Kernel library builds on these abstractions, making it easier to compose and chain complex AI integrations.
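A minimal Semantic Kernel sketch, assuming the Microsoft.SemanticKernel NuGet package; the provider registration is deliberately left as a comment because it depends on which connector you use:

```csharp
using Microsoft.SemanticKernel;

var builder = Kernel.CreateBuilder();
// Register whichever chat completion service your provider exposes here,
// e.g. an OpenAI connector for a hosted model, or a connector wrapping a
// local model (assumption: your chosen connector package supplies this).
Kernel kernel = builder.Build();

// Prompts become reusable functions the kernel can orchestrate and chain.
var summarize = kernel.CreateFunctionFromPrompt("Summarize: {{$input}}");

var result = await kernel.InvokeAsync(summarize,
    new KernelArguments { ["input"] = "Long text to summarize..." });

Console.WriteLine(result);
```

Compared with calling a model directly, the kernel layer is what handles composition: templated prompts, chained functions, and plugins can all be invoked through the same `InvokeAsync` machinery.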

Snapdragon, Windows ML, and AI Hardware Acceleration

On Windows devices with Snapdragon processors (such as the Snapdragon X Series), Microsoft and Qualcomm's collaboration lets Windows ML and related AI frameworks automatically select the most capable silicon (CPU, GPU, or NPU) for inference, including the dedicated AI Engine on Snapdragon chips. This enables high-performance, low-latency execution of AI workloads in .NET applications without manual hardware configuration.

Qualcomm and Microsoft have ensured that, through frameworks like ONNX Runtime, .NET applications can use on-device AI acceleration seamlessly: Windows ML acts as a unifying abstraction, so .NET apps using supported model formats (e.g., ONNX) benefit from NPU acceleration on Snapdragon hardware.
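Concretely, a .NET app can load an ONNX model with ONNX Runtime and request Qualcomm's QNN execution provider, which targets the Snapdragon NPU. This is a sketch assuming the Microsoft.ML.OnnxRuntime package with QNN support; the provider option names and backend file are assumptions to check against the ONNX Runtime QNN documentation, and the input/output names depend on the model:

```csharp
using System.Collections.Generic;
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

var options = new SessionOptions();

// Request the QNN execution provider; ONNX Runtime falls back to CPU
// if the provider or NPU is unavailable. The "backend_path" value is an
// assumption (HTP backend = Hexagon NPU on Snapdragon).
options.AppendExecutionProvider("QNN", new Dictionary<string, string>
{
    ["backend_path"] = "QnnHtp.dll"
});

using var session = new InferenceSession("model.onnx", options);

// Input name and shape depend on the model; "input" and 1x3x224x224
// are illustrative placeholders for an image-classification model.
var input = new DenseTensor<float>(new[] { 1, 3, 224, 224 });
using var results = session.Run(
    new[] { NamedOnnxValue.CreateFromTensor("input", input) });

Console.WriteLine(results.First().Name);
```

The application code is identical whether inference lands on CPU, GPU, or NPU; the execution-provider request is the only Snapdragon-specific line.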

Developer Workflow and Ecosystem
  • .NET developers can use Microsoft.Extensions.AI and Semantic Kernel to build AI-powered apps, with support for AI model integration and hardware acceleration across providers.

  • The Windows ML and ONNX Runtime infrastructure ensures models are executed on the Snapdragon AI Engine where possible, delivering strong performance benefits.

  • Qualcomm’s AI software stack and tools complement these .NET features for developers targeting Snapdragon-powered Windows devices.

In summary, the combination of Microsoft's .NET AI libraries and Windows' built-in AI infrastructure provides a unified way for .NET developers to take full advantage of Snapdragon's AI acceleration capabilities.