Evercore eyes Arista, Cisco as key beneficiaries in AI switching opportunity
May 28, 2025 11:35 AM ET By: Brandon Evans, SA News Editor
The debate surrounding the transition from traditional data center switches to artificial intelligence switches has evolved beyond Ethernet versus InfiniBand to which specific vendors will make the most of the opportunity, according to Evercore ISI.
While the deal between Cisco Systems (NASDAQ: CSCO) and Nvidia (NASDAQ: NVDA) props Cisco into a prime leadership position in enterprise AI deployments, Arista Networks (NYSE: ANET) could become a lead beneficiary on the Tier-1 hyperscale side, Evercore finds.
Switching for back-end networks is expected to create a total addressable market of $29B by 2029, up from $6.3B in 2024, according to data from the Dell'Oro Group. The top four U.S. hyperscalers will account for $14.6B, the top three China hyperscalers for $3B, enterprise for $5.6B, and the rest of the cloud market for $5.2B.
"We estimate Arista could add $7B in revenue and $2.00+ in EPS from reaching our market share targets by CY29 (~100% of CY24 revenue run-rate)," said Evercore analysts, led by Amit Daryanani, in an extensive Wednesday note. "We estimate Cisco could add $3.5B in revenue and $0.25 in EPS if they hit our targets (~7% of CY24 run-rate)."
"We think the 'AI centric' networking markets are poised to sustain sizable revenue growth over the next few years," Daryanani added. "This should unlock sizable upside for both Arista and Cisco going forward. Outside of our coverage, we see material Ethernet share for both Celestica (NYSE: CLS) and Nvidia."
Ethernet will hold the majority of the growth by 2029, at 64%, while InfiniBand will account for 36%, according to Dell'Oro. However, Evercore said Ethernet's share could be as high as 90% by that time.
"AI infrastructure building will be heavily concentrated within a few cloud service providers in the US and China," Daryanani noted. "Large Enterprises will also do some building in house, but we only expect cloud service providers to be willing to commit the capex necessary to build and train large foundational models in clusters made up of 10s to 100s of thousands of GPUs. Large enterprises instead will focus on both fine-tuning the large foundational models they will acquire via a model as a service agreement and inferencing models."
Meta Platforms (META), Microsoft (MSFT), Google (GOOG)(GOOGL) and Amazon (AMZN) are the top four U.S. AI hyperscalers.
While traditional data center switches are general-purpose networking devices, AI data center switches are designed to handle the demands of AI and machine learning workloads.