Heated Competition Driven by the Booming AI Market: A Quick Glance at HBM Giants’ Latest Moves, and What’s Next 2024-06-03
To capture the booming demand for AI processors, memory heavyweights have been aggressively expanding HBM (High Bandwidth Memory) capacity while striving to improve its yield and competitiveness. The latest development is Micron’s reportedly planned new plant in Hiroshima Prefecture, Japan.
The fab, targeted to begin producing chips, including HBM, as early as 2027, is reported to manufacture DRAM on the most advanced “1γ” (gamma; 11–12 nanometer) process, using extreme ultraviolet (EUV) lithography equipment.
Why is HBM such a hot topic, and why is it so important?
HBM: A Solution for High-Performance Computing, Perfectly Suited to AI Chips
By applying 3D stacking technology, which enables multiple layers of chips to be stacked on top of each other, HBM’s TSV (through-silicon via) process allows more memory chips to be packed into a smaller space, shortening the distance data needs to travel. This makes HBM well suited to high-performance computing applications, which require fast data transfer. Additionally, replacing GDDR SDRAM or DDR SDRAM with HBM helps control energy consumption.
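To see why the stacked, very wide interface matters, here is a minimal back-of-the-envelope sketch. The figures (a 1024-bit HBM stack interface at 6.4 Gbps per pin versus a 32-bit GDDR6 chip at 16 Gbps per pin) are representative assumptions for illustration, not numbers from the article; peak bandwidth is simply interface width times per-pin data rate.

```python
def peak_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: interface width (bits) x per-pin rate (Gbps) / 8."""
    return bus_width_bits * pin_rate_gbps / 8  # convert bits to bytes

# Assumed, representative figures (not from the article):
# one HBM stack exposes a very wide interface at a modest pin rate,
# while a single GDDR6 chip uses a narrow interface at a much higher pin rate.
hbm_stack = peak_bandwidth_gbs(bus_width_bits=1024, pin_rate_gbps=6.4)   # ~819 GB/s
gddr6_chip = peak_bandwidth_gbs(bus_width_bits=32, pin_rate_gbps=16.0)   # ~64 GB/s

print(f"HBM stack : {hbm_stack:.0f} GB/s")
print(f"GDDR6 chip: {gddr6_chip:.0f} GB/s")
```

The point of the comparison is that HBM gets its bandwidth from width rather than raw pin speed, which is exactly what the short, dense TSV connections of a 3D stack make practical, and why it also tends to be more energy-efficient per bit transferred.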
continues at trendforce.com
I am not so certain about the roadmaps cited here, as the companies not infrequently put out dates they are fairly certain they can beat, and then do exactly that.