Strategies & Market Trends : Technical analysis for shorts & longs


To: Johnny Canuck who wrote (60360), 10/11/2024 2:19:23 AM
From: Johnny Canuck
 

archive.today capture of a www.barrons.com page, saved 10 Oct 2024 17:41:40 UTC




Feature
AMD CEO Predicts $500 Billion AI Chip Market by 2028. Will Its New GPU Beat Nvidia?


By Tae Kim
Oct. 10, 2024 1:29 pm ET



AMD CEO Lisa Su in June. (Photo: Annabelle Chih/Bloomberg)

Advanced Micro Devices is getting more optimistic about the long-term market size potential for artificial intelligence chips.
On Thursday, AMD CEO Lisa Su said AI demand had exceeded the company’s expectations over the past year. She now expects the market for AI data center graphics processing units will grow by more than 60% per year and reach $500 billion by 2028. Last December, she predicted the market would exceed $400 billion by 2027.
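For rough context, the growth figures in the paragraph above can be sanity-checked with a few lines of arithmetic. The sketch below is illustrative only and uses just the two numbers quoted in the article, a $500 billion market in 2028 and roughly 60% annual growth, discounting backward to show the implied market size in earlier years under those assumptions.

```python
# Back-of-the-envelope check of the figures quoted in the article:
# start from the forecast of a $500 billion market in 2028 and
# compound backward at the stated ~60% annual growth rate.

target_2028 = 500e9   # $500 billion by 2028
growth = 0.60         # "more than 60% per year"

size = target_2028
for year in range(2028, 2023, -1):
    print(f"{year}: ~${size / 1e9:.0f}B")
    size /= 1 + growth
```

Under those assumptions, the implied market size works out to roughly $310 billion in 2027 and on the order of $75 billion to $80 billion in 2024; the actual forecasts quoted are "more than" figures, so these are floors rather than point estimates.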

“We certainly have the best portfolio in the industry to address end-to-end AI,” Su said at the chip maker’s Advancing AI event in San Francisco.
At the event, she also announced AMD’s newest AI data center GPU, the Instinct MI325X. Graphics-processing units are well-suited for the parallel computations needed for AI.
Su said the MI325X incorporates 256 gigabytes of HBM3E memory, versus the 192 gigabytes of its predecessor. HBM, or high-bandwidth memory, is primarily used for artificial intelligence GPUs. Su said that MI325X platform servers can outperform Nvidia’s rival GPU, the H200, by up to 40% in certain inference benchmarks. Inference is the process of generating answers from popular AI models.
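A minimal sketch of the memory-capacity math in that paragraph, using only the figures cited in the article (256 GB on the MI325X versus 192 GB on its predecessor, which the article later identifies as the MI300X):

```python
# Memory-capacity uplift implied by the figures in the article.
mi325x_hbm_gb = 256   # HBM3E capacity of the new MI325X
mi300x_hbm_gb = 192   # capacity of the predecessor MI300X

uplift = mi325x_hbm_gb / mi300x_hbm_gb - 1
print(f"MI325X carries {uplift:.0%} more HBM than its predecessor")  # ~33%
```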


Su said the MI325X is on track for production shipments in the fourth quarter and is expected to be available from AMD’s partners, including Dell, Hewlett Packard Enterprise and Supermicro, in the first quarter of 2025.
The Nvidia H200 may be a stale comparison, as it debuted earlier this year. Nvidia’s newer, more powerful Blackwell GPUs have just started reaching customers this quarter, and those are more likely to outperform AMD’s MI325X, which is based on the same CDNA 3 microarchitecture as AMD’s older MI300X GPU.
In midday trading, AMD shares were down 3.4% to $165.29.
Later at its AI event, AMD previewed its next-generation MI350 GPU series, which is supported by CDNA 4 technology and slated for release in the second half of 2025. The company said the MI350 is designed to deliver 35 times more performance for inference versus the CDNA 3-powered GPUs.
Write to Tae Kim at tae.kim@barrons.com


