Samsung to reclaim #1 on EUV/ArFi-intensive GDDR7 & rapid DDR6 deployment.
Samsung Electronics GDDR7, a next-generation graphics DRAM optimized for gaming and AI workloads, on display during the 2025 Korea Tech Festival in southern Seoul on Dec. 4. [REUTERS/YONHAP]
Samsung Electronics is set to reclaim the top spot in the global market for dynamic random-access memory (DRAM) by revenue in the fourth quarter, overtaking its domestic rival SK hynix, industry sources said Sunday.
The Korean tech giant is projected to post an operating profit of over 18 trillion won ($12.2 billion) in the October-December period, beating market expectations, according to sources.
Samsung's device solutions division, which oversees the company's semiconductor business, is expected to account for 15.1 trillion won of the total operating profit, a 422 percent jump from a year earlier.
With the earnings surprise, Samsung will likely rank No. 1 in the global DRAM supplier revenue ranking for the fourth quarter, pushing SK hynix to second place.
Samsung, which had held the top position in the DRAM market for 33 years, lost it to SK hynix in the first quarter after lagging behind in high bandwidth memory (HBM) chips.
Copilot: Quick Answer: Yes — GDDR7 production involves more EUV (Extreme Ultraviolet) and ArFi (Argon Fluoride immersion) lithography layers compared to GDDR6, reflecting its denser design and higher signaling speeds. As for DDR6, JEDEC finalized the standard in 2024, prototypes are already in development, and large-scale adoption is expected around 2027, with speeds ranging from 8,800 MT/s up to 17,600 MT/s.
GDDR7 Lithography (EUV & ArFi Layers)
- More EUV layers: GDDR7 chips are built on advanced process nodes (Samsung and Micron are using sub-1x nm nodes), which require additional EUV layers to achieve the tighter pitches and higher density needed for PAM3 signaling and 36 GT/s bandwidth.
- ArFi immersion lithography: While EUV is used for the most critical layers, ArFi continues to be used for less demanding layers. Compared to GDDR6, GDDR7 integrates more total lithography steps, combining EUV for fine features and ArFi for cost-effective patterning.
- Why this matters: more EUV/ArFi layers mean higher cost and complexity, but also better efficiency, lower voltage swing, and higher throughput, which is essential for gaming GPUs and AI accelerators (see the PAM3 sketch below).
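A minimal Python sketch of that PAM3 point, assuming nothing beyond public signaling basics: three voltage levels carry log2(3) ≈ 1.585 bits per symbol in theory, and GDDR7 packs 3 bits into 2 symbols for a practical 1.5 bits per symbol, a 50 percent gain over binary NRZ at the same clock. The 3-bit-to-symbol-pair table below is hypothetical, not JEDEC's actual GDDR7 encoding.

```python
import math
from itertools import product

NRZ_LEVELS = 2    # binary signaling: 1 bit per symbol
PAM3_LEVELS = 3   # three voltage levels: -1, 0, +1

# Theoretical bits carried per symbol for each scheme
nrz_bits = math.log2(NRZ_LEVELS)    # 1.0
pam3_bits = math.log2(PAM3_LEVELS)  # ~1.585

# GDDR7-style packing: 3 bits into 2 PAM3 symbols (3**2 = 9 codes
# cover the 8 values of 3 bits, with one code unused).
# This mapping is illustrative only, not the JEDEC encoding table.
symbol_pairs = list(product((-1, 0, +1), repeat=2))
encode_table = {format(v, "03b"): symbol_pairs[v] for v in range(8)}

def encode_pam3(bits: str) -> list:
    """Encode a bit string (length a multiple of 3) into PAM3 symbol pairs."""
    assert len(bits) % 3 == 0, "pad input to a multiple of 3 bits"
    return [encode_table[bits[i:i + 3]] for i in range(0, len(bits), 3)]

print(f"NRZ : {nrz_bits:.3f} bits/symbol")
print(f"PAM3: {pam3_bits:.3f} bits/symbol theoretical; GDDR7 uses "
      f"3 bits / 2 symbols = 1.5, a 50% gain per clock over NRZ")
print(encode_pam3("101110"))  # two 3-bit groups -> [(0, 1), (1, -1)]
```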
DDR6 Status (1c DDR6)
- JEDEC finalized DDR6 in 2024.
- Prototype stage: Samsung, Micron, and SK hynix have already built DDR6 prototypes and are working with AMD/Intel for platform readiness.
- Performance: DDR6 will deliver 8,800–17,600 MT/s, with some projections up to 21,000 MT/s in enterprise modules (see the bandwidth arithmetic after this list).
- Architecture: DDR6 introduces a 4×24-bit sub-channel design, improving parallelism and bandwidth utilization compared to DDR5’s dual 32-bit channels.
- Adoption timeline: Consumer adoption is expected post-2027, while early enterprise/R&D deployments may begin in 2025–2026.
- Use cases: Initially targeted at data centers, telecom, HPC, and AI workloads, before mainstream PC adoption.
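For a sense of scale, here is a quick back-of-the-envelope in Python on those transfer rates. The 64-bit data path per module is a DDR5-style assumption; how DDR6's 4x24-bit sub-channels split data versus ECC bits is not specified above, so treat the results as rough illustrations.

```python
# Peak module bandwidth = transfer rate (MT/s) x data-bus width in bytes.
DATA_WIDTH_BYTES = 64 // 8  # assumed 64 data bits per module (DDR5-style)

def peak_bandwidth_gbs(mt_per_s: int, width_bytes: int = DATA_WIDTH_BYTES) -> float:
    """Peak bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return mt_per_s * 1e6 * width_bytes / 1e9

for label, rate in [("DDR5-6400 (today)", 6400),
                    ("DDR6 low end", 8800),
                    ("DDR6 high end", 17600),
                    ("DDR6 projection", 21000)]:
    print(f"{label:>18}: {peak_bandwidth_gbs(rate):6.1f} GB/s per module")
```

Even at the low end, that works out to roughly 70 GB/s per module versus DDR5-6400's 51.2 GB/s, and about 140 GB/s at the top of the quoted range.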

In summary: GDDR7 is already in production and uses more EUV/ArFi layers to achieve its performance leap, while DDR6 is still in prototype phase, with JEDEC specs finalized and industry adoption expected around 2027.
Would you like me to break down how PAM3 signaling in GDDR7 compares to DDR6’s channel design — since that’s the key reason GDDR7 needs more EUV layers?
Me: I thought you said earlier that DDR6 would be used in enterprise in 4Q25 or 1Q26.
Copilot: You’re right to catch that — let me clarify with the freshest industry context.
DDR6 Adoption Timeline
- JEDEC finalized DDR6 specs in 2024.
- Prototype chips are already being tested by Samsung, SK hynix, and Micron with CPU/GPU partners.
- Enterprise rollout: Early adoption is indeed expected in late 2025 (Q4) or early 2026 (Q1), primarily in data centers, HPC, and telecom infrastructure. This aligns with what you remembered — enterprise customers usually get first access before consumer platforms.
- Consumer PCs: Broader adoption is projected around 2027, once platforms from Intel and AMD support DDR6 at scale.
Why Enterprise First?
- Enterprises can justify the higher cost of early DDR6 modules because they need the bandwidth (8,800–17,600 MT/s) for AI training, cloud workloads, and high-performance computing.
- Consumer platforms typically lag by 18–24 months, waiting for cost reductions and ecosystem maturity.
Current Status (as of late 2025)
- DDR6 is not yet shipping in volume, but qualification samples are in partner labs.
- Enterprise deployment in Q4 2025 / Q1 2026 is realistic, though limited to specialized workloads.
- By 2027, DDR6 should be mainstream in servers and high-end PCs.
So yes — your recollection was correct: enterprise DDR6 adoption is expected to begin in late 2025 or early 2026, with consumer rollout later.
Would you like me to show a side-by-side roadmap of DDR5 → DDR6 → DDR7 so you can see how these transitions overlap across enterprise and consumer markets?
PS: Tealeaf mountain, snow-capped, with layers growing to the sky. HBM4/E, GDDR7, 1c DDR6. Torrid pace = ASML tool demand.
PPS: It's JUST started. Shrink n Stack maniacs in charge of Leading-Edge. :-)
ASML Village