Technology Stocks : NVIDIA Corporation (NVDA)

From: Frank Sully | 8/13/2021 1:21:14 AM

Nvidia has 80% share of AI processors, Omdia says, except…

by Matt Hamblen | Aug 12, 2021 1:36pm



Intel Xeon processors are CPUs. While they don't technically fit some definitions of an AI processor for the data center and cloud, CPUs are the most widely used chips for AI acceleration. Nvidia, however, holds the top spot for AI processors with its GPUs, which integrate distinct subsystems dedicated to AI processing. It all depends on how you count. (Intel)
Analyst firm Omdia recently ranked Nvidia at the top of the AI processor market with an 80% share in 2020, well ahead of its competitors.

The tabulation puts Nvidia's AI processor revenue at $3.2 billion in 2020, up from $1.8 billion in 2019. Omdia ranked Xilinx second with its FPGA products.

Google finished third with its Tensor Processing Unit, while Intel finished fourth with its Habana AI ASSPs and its FPGAs for AI cloud and data center servers. AMD ranked fifth with AI ASSPs for cloud and data center.

The report is notable for leaving Intel Xeon CPUs out of the Omdia tabulation even though Xeons are used extensively for AI acceleration in cloud and data center operations, as Omdia acknowledges. Xeon does not meet Omdia's definition of an AI processor, which includes "only those chips that integrate distinct subsystems dedicated to AI processing," Omdia said.

As far back as 2019, Intel began a heavy push to position Xeon for a variety of AI workloads alongside other compute tasks and has seen steady progress. In April, Intel launched its third-generation Ice Lake Xeon Scalable processor with what Intel called built-in AI.

At the time, Intel said it had already shipped 200,000 of the 10nm processors in the first quarter and claimed 1.5 times higher performance across 20 AI workloads compared with the AMD EPYC 7763 and 1.3 times higher performance versus the Nvidia A100 GPU. Cisco announced in April that it was using third-gen Xeon in three new UCS servers.

Jack Gold, an analyst at J. Gold Associates, said it is hard to know Intel's revenue from Xeon chips used for AI work accurately because of how Intel reports earnings. But he estimated that at least half of the Data Center Group's latest second-quarter revenue of $5.6 billion came from Xeon, or more than $2.8 billion. (Xeon is also used in 5G implementations, which are not part of the Data Center Group.) Based on Gold's estimate, annual Xeon revenue for data centers could be as much as three to four times Nvidia's 2020 AI processor revenue, although not all of those Xeon CPUs would be used for AI work.
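Gold's back-of-the-envelope comparison can be reproduced with simple arithmetic. The figures below come from the article; annualizing one quarter by multiplying by four is an illustrative assumption, not Gold's stated method.

```python
# Reproducing Jack Gold's rough Xeon-vs-Nvidia comparison.
# Figures from the article; the flat four-quarter annualization
# is an illustrative assumption.

dcg_q2_revenue = 5.6e9          # Intel Data Center Group, Q2 revenue
xeon_share_of_dcg = 0.5         # Gold: "at least half" is Xeon
nvidia_ai_revenue_2020 = 3.2e9  # Omdia's Nvidia AI processor figure for 2020

xeon_quarterly = dcg_q2_revenue * xeon_share_of_dcg  # >= $2.8B per quarter
xeon_annualized = xeon_quarterly * 4                 # assumed annual run rate

multiple = xeon_annualized / nvidia_ai_revenue_2020
print(f"Xeon quarterly: ${xeon_quarterly / 1e9:.1f}B")
print(f"Annualized: ${xeon_annualized / 1e9:.1f}B, "
      f"{multiple:.1f}x Nvidia's 2020 AI revenue")   # falls in Gold's 3-4x range
```

The annualized figure lands at roughly 3.5 times Nvidia's 2020 AI processor revenue, consistent with the three-to-four-times range in the article.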

Omdia’s definition of AI processors includes only GPU-derived AI application-specific standard products (ASSPs), proprietary-core AI ASSPs, AI application-specific integrated circuits (AI ASICs) and field-programmable gate arrays (FPGAs).
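Omdia's taxonomy can be sketched as a simple lookup. The category names follow the article's description; the example chip labels in the comments and the classifier function itself are purely illustrative bookkeeping, not Omdia data.

```python
# Illustrative sketch of Omdia's AI-processor taxonomy as described in
# the article. The category set and the CPU exclusion are from the text;
# everything else here is hypothetical.

OMDIA_AI_PROCESSOR_TYPES = {
    "gpu_derived_assp",       # GPU-derived AI ASSPs (e.g., Nvidia data-center GPUs)
    "proprietary_core_assp",  # proprietary-core AI ASSPs
    "ai_asic",                # AI ASICs (e.g., Google's TPU)
    "fpga",                   # FPGAs (e.g., Xilinx)
}

def counts_as_ai_processor(chip_type: str) -> bool:
    """Omdia counts only chips with distinct AI subsystems; CPUs fall outside."""
    return chip_type in OMDIA_AI_PROCESSOR_TYPES

print(counts_as_ai_processor("fpga"))  # True
print(counts_as_ai_processor("cpu"))   # False: Xeon is excluded from the count
```

The point of the sketch is the boundary itself: under this definition a general-purpose CPU never enters the tally, no matter how much AI work it actually does.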

“Despite the onslaught of new competitors and new types of chips, Nvidia’s GPU-based devices have remained the default choice for cloud hyperscalers and on-premises data centers, partly because of their familiarity to users,” said Omdia principal analyst Jonathan Cassell. GPU-based chips were the first type of AI processor widely used for AI acceleration.

Cassell told Fierce Electronics that CPUs are the most widely used chip for AI acceleration in the cloud and data center, even though Omdia doesn’t classify them as AI processors. If they were counted as AI processors, Intel would be the leading supplier in AI in 2020, beating Nvidia, he added.

CPUs are mainly used for AI inference work in cloud and data center environments as opposed to AI training. In an older Omdia report from 2020, the analyst firm calculated that CPUs accounted for the majority of market share for 2019 and were estimated to lead the market through 2025.

The market for AI processors is growing rapidly: global revenue for cloud and data center AI processors reached $4 billion in 2020 and should grow nine-fold to $37 billion by 2026, under the definition used by Omdia.
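The growth rate implied by that forecast is easy to check. The two revenue figures are from the article; the compound-annual-growth-rate calculation is a standard derivation, not something Omdia states.

```python
# Implied growth behind Omdia's forecast: $4B (2020) -> $37B (2026).
revenue_2020 = 4e9
revenue_2026 = 37e9
years = 6

growth_multiple = revenue_2026 / revenue_2020  # 9.25x, the "nine times" cited
cagr = growth_multiple ** (1 / years) - 1      # compound annual growth rate

print(f"{growth_multiple:.2f}x over {years} years -> {cagr:.0%} CAGR")  # ~45%
```

A 9.25x multiple over six years works out to a compound annual growth rate of roughly 45 percent.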

fierceelectronics.com