Technology Stocks : Semi Equipment Analysis
To: Sam who wrote (93002)
From: Sam | 9/16/2024 12:54:55 PM
Recommended by: Julius Wong, Return to Sender (2 recommendations)
 
A couple of weeks old but still worth reading.

NVIDIA’s Data Center Business Fuels Explosive Growth in FY2Q25 Revenue; H200 Set to Dominate AI Server Market from 2H24, Says TrendForce
Published Sep. 3, 2024, 16:44 (GMT+8)


TrendForce’s latest findings show that NVIDIA’s data center business is driving a remarkable surge in the company’s revenue, which more than doubled year over year in the second quarter of its 2025 fiscal year to reach roughly US$30 billion. This growth is largely fueled by skyrocketing demand for NVIDIA’s core Hopper GPU products. Supply chain surveys indicate that demand from CSP and OEM customers for the H200 GPU is on the rise, and this GPU is expected to become NVIDIA’s primary shipment driver from the third quarter of 2024 onward.

NVIDIA’s financial report reveals that its data center business revenue in FY2Q25 grew by 154% YoY, outpacing other segments and raising its contribution to total revenue to nearly 88%. TrendForce estimates that nearly 90% of NVIDIA’s GPU product line in 2024 will be based on the Hopper platform. This includes the H100, H200, a customized H20 version for Chinese customers, and the GH200 solution integrated with NVIDIA’s Grace CPU, which targets specific AI applications in the HPC market. Starting from Q3 this year, NVIDIA is expected to maintain a no-price-cut strategy for the H100. Once customers complete their existing orders, the H100 will naturally phase out, and the H200 will take over as the main product supplied to the market.
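The two figures above (data center at nearly 88% of total revenue, growing 154% YoY) imply a rough year-ago baseline. A minimal sanity-check sketch, using only the article's approximate numbers; the derived values are illustrative, not from NVIDIA's actual report:

```python
# All inputs are approximations taken from the article above.
total_revenue = 30.0      # US$ billions, FY2Q25 total revenue
dc_share = 0.88           # data center contribution: nearly 88% of total
dc_yoy_growth = 1.54      # data center revenue grew 154% YoY

# Implied FY2Q25 data center revenue
dc_revenue = total_revenue * dc_share

# Implied year-ago (FY2Q24) data center revenue: current / (1 + growth)
dc_revenue_prior = dc_revenue / (1 + dc_yoy_growth)

print(f"FY2Q25 data center revenue: ~${dc_revenue:.1f}B")
print(f"Implied FY2Q24 data center revenue: ~${dc_revenue_prior:.1f}B")
```

This puts the implied year-ago data center figure at roughly US$10B, consistent with the "more than doubled" characterization of total revenue.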

In China, demand for AI servers equipped with the H20 GPU has significantly increased since 2Q24, driven by cloud customers building LLMs, search engines, or chatbots locally.

Looking ahead, TrendForce predicts that the rising demand for H200-powered AI servers will help NVIDIA navigate potential supply challenges if the rollout of its new Blackwell platform experiences delays due to supply chain adjustments. This steady demand is expected to keep NVIDIA’s data center revenue strong in the latter half of 2024.

Additionally, the Blackwell platform is expected to ramp up production in 2025. With a die size twice that of the Hopper platform, Blackwell is set to boost demand for CoWoS packaging solutions once it becomes mainstream next year. TrendForce reports that the main CoWoS supplier, TSMC, has recently raised its monthly capacity planning for 2025 to nearly 70–80K units—nearly double the 2024 level—with NVIDIA poised to take up more than half of this capacity.

HBM3e suppliers seize opportunity with NVIDIA’s product rollout

NVIDIA’s product lineup this year is set to make waves, with the H200 being the first GPU to feature HBM3e 8-Hi memory stacks. The upcoming Blackwell chips are also slated to fully adopt HBM3e. Meanwhile, Micron and SK hynix completed their HBM3e qualification in 1Q24 and began mass shipments in Q2. Micron is primarily supplying HBM3e for the H200, while SK hynix is catering to both the H200 and B100 series. Although Samsung was a bit late to the game, it has recently completed its HBM3e qualification and begun shipping HBM3e 8Hi for the H200, with qualification for the Blackwell series well underway.

dramexchange.com