Technology Stocks : Rambus (RMBS) News Only

To: Thomas C. Donald, who wrote (86), 12/6/1998 6:15:00 PM
From: Thomas C. Donald, of 236
 
DRAM options forcing hard choices
Mark Ellsberry

11/23/98
Electronic Buyers' News
Page 24
Copyright 1998 CMP Publications Inc.

As far back as anyone can remember, DRAMs have been a commodity business. But if you look closely at the parts coming out today, it's pretty easy to figure out that we're on the threshold of market specialization.

PC-100 SDRAM, the main memory used in today's PCs, is giving way as the industry prepares for a rapid shift to Direct Rambus DRAM. But even as this transition begins, emerging memory types, like double-data-rate (DDR) SDRAM and SLDRAM, threaten to fragment the computer market.

Because of this, DRAM manufacturers and PC makers are being confronted by a series of tough choices.

Suppliers are bombarding OEMs with proposals for improving DRAM performance, leaving their customers two options: pick one memory type to create a single, new standard, or allow the technology path to fracture into as many as three new directions.

Indeed, if everyone had their way, vendors next year would be dealing with EDO, page mode, PC-66, and PC-100 SDRAM; Direct RDRAM; DDR; and SLDRAM parts, all at the same time.

In addition to the wider field of DRAM architectures, manufacturers must also make choices regarding new memory densities. Historically, the industry has expanded from one generation to the next by a factor of four, e.g., from 4- to 16- to 64-Mbit parts.

But the move to 256-Mbit DRAM presents suppliers with a two-fold technology challenge: To ensure that parts are cost-effective and available in the right package, vendors must migrate to a design rule no larger than 0.18 microns, while at the same time converting to larger, 12-in. wafers. Both challenges represent significant technical hurdles, and are very expensive transitions at a time when the industry is not well set up to make these kinds of investments.

In fact, the hurdles are so large that they will force the industry away from a historic trend. For the past 20 years, DRAM densities have generally followed three-year cycles. From 1994 to 1997, for example, 16-Mbit DRAM was the most cost-effective part. If history repeated itself, the 64-Mbit would be the most cost-effective part from 1998 through 2000.

However, Hyundai and other companies have elected to slot a half-generation, 128-Mbit device into this time frame. The result is that customers will get the denser memories they really need at a better cost.

To do this, however, the industry will have to borrow from the 64- and 256-Mbit generations. Instead of 64-Mbit appearing as the most cost-effective part until 2000, it will extend only to 1999. In 2000 and 2001, 128-Mbit will be the most cost-effective part, with 256-Mbit taking on that distinction in 2002.

Memory configurations present manufacturers with other issues. In the past, the DRAM industry has improved chip performance by going to a wider part. A x1 device, for instance, took about 100 ns to decode one bit of data, while a x16 part took about the same time to decode 16 bits of data. This technique worked well, but its cost-effectiveness is exhausted at x32.
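The width-scaling math above is simple to sketch: if access time stays roughly constant, throughput grows linearly with part width. The ~100 ns figure comes from the article; the rest is an illustrative back-of-the-envelope calculation, not vendor data.

```python
# Width scaling: access time stays roughly constant (~100 ns, per the
# article), so a wider part moves proportionally more bits per access.

ACCESS_TIME_NS = 100  # approximate access time cited in the article

def throughput_mbit_per_s(width_bits: int) -> float:
    """Bits per access divided by seconds per access, converted to Mbit/s."""
    return width_bits / (ACCESS_TIME_NS * 1e-9) / 1e6

for width in (1, 16, 32):
    print(f"x{width}: {throughput_mbit_per_s(width):.0f} Mbit/s")
```

A x1 part at 100 ns works out to about 10 Mbit/s, a x16 part to about 160 Mbit/s, which is why widening the bus was an attractive lever until die cost caught up at x32.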

Switching to Rambus may correct this issue. Direct RDRAMs transfer data at either 600 or 800 Mbytes/s, depending on the system design, while using an 8-bit-wide bus. But the performance improvement will come at the cost of device complexity.
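A quick sanity check of those bandwidth figures: an 8-bit channel moves one byte per transfer, so the cited 600 and 800 Mbytes/s correspond to 600 and 800 million transfers per second. The transfer rates used below are assumptions inferred from the article's numbers, not datasheet values.

```python
# Back-of-the-envelope check of the Direct RDRAM figures cited above:
# an 8-bit-wide channel is 1 byte per transfer, so peak bandwidth in
# Mbytes/s equals millions of transfers per second.

BUS_WIDTH_BYTES = 8 // 8  # 8-bit-wide Rambus channel = 1 byte per transfer

def bandwidth_mbytes_per_s(transfers_per_us: int) -> int:
    """Peak bandwidth = bus width (bytes) x transfer rate (millions/s)."""
    return BUS_WIDTH_BYTES * transfers_per_us

for rate in (600, 800):
    print(f"{rate}M transfers/s -> {bandwidth_mbytes_per_s(rate)} Mbytes/s")
```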

These factors are forcing DRAM suppliers to make some hard decisions. In the past, the premise was to build DRAM to be the smallest, most cost-effective memory.

However, OEM performance requirements are starting to splinter that low-cost, small-die model. That's fine, as long as OEMs recognize that DRAMs may not achieve the same economies of scale as before. The historic learning curve, which has enabled suppliers to lower costs by an average of 30% annually over the past 25 years, may be affected.
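The 30% annual learning curve mentioned above compounds dramatically, which is why losing it matters. A minimal sketch, normalizing the starting cost to 1.0 (an assumption for illustration):

```python
# Compounding the ~30% annual cost decline the article attributes to the
# historic DRAM learning curve. Starting cost is normalized to 1.0.

def relative_cost(years: int, annual_decline: float = 0.30) -> float:
    """Fraction of the original cost remaining after `years` of decline."""
    return (1 - annual_decline) ** years

for y in (1, 5, 10, 25):
    print(f"After {y:2d} years: {relative_cost(y):.4f} of original cost")
```

At that rate, costs fall below 3% of their starting level within a decade, so even a modest slowdown in the curve would be very visible in DRAM pricing.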

The bottom line with this new, expanded DRAM menu is that we're on the verge of losing standard-part pricing. The traditional method of comparing different costs by factoring a premium into the price of a commodity architecture may become meaningless if there is no common denominator available for comparison.

-Mark Ellsberry is vice president of marketing for the Semiconductor Division of Hyundai Electronics America, San Jose.
