Technology Stocks : Rambus (RMBS) - Eagle or Penguin

To: Bilow who wrote (39937), 4/25/2000 5:26:00 AM
From: Bilow
Hi all; Since the first DDR shipments are in graphics systems, I thought it would be nice to assemble a set of links illustrating the history of how DDR won that market from Rambus. Towards that end, I went back in time by over 5 years, (to when I was a younger, thinner man), and got the links to a series of industry articles.

For those of us who are in this business, and have followed this stuff for all of our adult lives, none of what is contained here is any sort of surprise. But for people who are buying into tech stocks on a lick and a promise, maybe there is something to be learned by taking a look at this sequence. Maybe reading a little of the engineering history will help a person or two understand the way this industry works a little better. To disprove the fantasy belief that the industry press is hopelessly anti-Rambus, I include links to early articles extolling the virtues of that technology.

It is very hard to predict what is going to happen in electronics 5 years in advance, but it is simplicity itself to make predictions 1 year out. The reason is that everybody takes about a year to design new products, and the details leak out long before the products hit the market. Reading through these links will provide an understanding of how this process works, and give some insight into why it is obvious to memory designers that DDR takes the big spotlight this summer. But for that story, you will have to read other posts.

First, a note on granularity, and why Rambus is the winner in the granularity contest for graphics, and what this has to do with bandwidth.

Graphics is important to the Rambus story because graphics has always led the industry in requiring relatively high bandwidth out of a relatively small number of chips. Originally, the reason for this was the relatively small number of bits in an image screen, relative to how frequently they must be displayed. As a (simplified) example, a 1280x1024 graphics screen, with 24 bits for each pixel, refreshed at an 80Hz (vertical) non-interlaced rate, with 15% dead time during horizontal and vertical retrace, on a simple frame buffer, will require read bandwidth of:
80Hz * 1280 * 1024 * 24 / ((1-.15)*(1-.15)) = 3.5Gbits/sec = 435MBytes/sec.
If we want sufficient write bandwidth to rewrite the screen in one frame, we have to double this, so we get 870MB/sec. The amount of memory required for two images, one presumably displayed and the other being worked on, is
1280 * 1024 * 24 * 2 = 63Mbits, call it 64Mbits = 8MBytes.
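The arithmetic above can be sketched in a few lines of Python (same numbers as the post; the only liberty is rounding in the printout):

```python
# Frame-buffer arithmetic for a 1280x1024, 24-bit, 80Hz non-interlaced display.
width, height = 1280, 1024
bits_per_pixel = 24
refresh_hz = 80
dead_time = 0.15   # fraction lost to horizontal and vertical retrace

# Every displayed bit is read once per frame, squeezed into the
# non-blanked fraction of both the horizontal and vertical periods.
read_bps = refresh_hz * width * height * bits_per_pixel / ((1 - dead_time) ** 2)
print(f"read bandwidth: {read_bps / 1e9:.1f} Gbit/s = {read_bps / 8e6:.0f} MB/s")

# Double buffering: one image displayed while the other is drawn.
# (63 Mbit, which the post rounds up to 64 Mbit = 8 MB.)
frame_store_bits = width * height * bits_per_pixel * 2
print(f"frame store: {frame_store_bits / 1e6:.0f} Mbit "
      f"= {frame_store_bits / 8e6:.0f} MB")
```

Running it reproduces the 3.5 Gbit/s = 435 MB/s read figure and the roughly 8 MB frame store.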

SDRAMs were first sold into the graphics market as a partial solution to the graphics bandwidth problem, and SDRAMs have a lot more bandwidth than the preceding technology. But if we take an industry-standard x8 SDRAM, clocked at 100MHz, we can get a theoretical bandwidth out of it of only 100MBytes/sec, and less when we take into account the time required to turn the bus around, change banks, refresh, etc. Thus we would have to use 8 of these chips (a 64-bit bus) to get the 435MBytes/sec bandwidth that the 1280x1024 frame buffer would require. But since about the smallest chip we can get nowadays is 64Mbit = 8MByte, this would leave us with 64MBytes of memory, which is 8x as much as we need. Engineers hate wasting all those bits, but with graphics, that is the problem. Going to x16 chips reduces the problem by a factor of two, but we still have 4x as much memory as we need. Memory technologies before SDRAM were even worse. I shudder to think of them.
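The granularity problem reduces to a couple of divisions. A minimal sketch, reading the post's "8 chips" as a 64-bit bus filled with x8 parts:

```python
# Granularity: chips bought to fill a 64-bit bus bring along far more
# memory than the 8 MB double-buffered frame store actually needs.
bus_width_bits = 64
chip_capacity_mb = 8     # one 64-Mbit SDRAM = 8 MB
needed_mb = 8            # frame store for two 1280x1024x24 images

for chip_width in (8, 16):                 # x8 and x16 parts
    chips = bus_width_bits // chip_width
    total_mb = chips * chip_capacity_mb
    print(f"x{chip_width}: {chips} chips -> {total_mb} MB "
          f"({total_mb // needed_mb}x what is needed)")
```

This reproduces the 8x overshoot for x8 parts and the 4x overshoot for x16 parts.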

A simple way of expressing the graphics bandwidth problem is that each bit of graphics display memory has to be read from or written to once for each vertical retrace, assuming double buffering of the display. More modern high-end graphics controllers can use more memory to describe the image than the completed image actually contains. As an example, the Nvidia DDR system uses 256Mbits = 32MBytes of memory. In order to read and write all that memory 60 times per second, you would need a bandwidth of 32MBytes * 120/sec = 4GBytes/sec, which is about how much bandwidth the Nvidia card actually provides. Thus with more modern technology, the memory bandwidth problem doesn't get better. It still boils down to something like 120 or so reads or writes per bit per second.
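The 120-accesses-per-second rule of thumb works out like this for the Nvidia card's numbers (a back-of-envelope check, not a datasheet figure):

```python
# Nvidia DDR card: 256 Mbit = 32 MB of display memory, each byte read
# and written once per 60 Hz frame -> ~120 accesses per byte per second.
memory_mb = 32
accesses_per_sec = 2 * 60      # one read + one write per refresh
bandwidth_gb = memory_mb * accesses_per_sec / 1000
print(f"{bandwidth_gb:.2f} GB/s")   # roughly the 4 GB/s the card provides
```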

Those of you who have programmed some of the old graphics adapters may have noticed that the cards tended to have a lot more memory on them than was required to contain the display image. They would provide you with extra display image maps, and you could change which one was displayed in order to generate crude animation. Those extra banks of display were there because the engineers had to buy more memory than they needed in order to get the bandwidth. They took the extra memory, and as long as they had to include it anyway, called it a feature instead of an inefficiency.

Rambus technology puts a lot more bandwidth into each chip, much more than can be got out of an SDRAM or EDO, or even a DDR. This allows the graphics bandwidth requirement to be met with a smaller number of chips. You could buy just one (direct) or two (concurrent) RDRAM chips per board, and get all the graphics bandwidth you needed. Even though the Rambus interface was more difficult, and even though the chips would carry a premium, you could still save money by not having all that extra, unneeded memory around, and therefore buying fewer memory chips. This was very attractive to graphics designers.

As time went on, the memory makers continued to increase the amount of memory in each chip faster than they increased the bandwidth out of the chip. Thus the graphics bandwidth problem got worse and worse. Graphics makers would end up with more and more unwanted memory. Six years ago, it was pretty clear that Rambus would own the graphics market, or that better techniques for interfacing to memory would have to be developed.

These comments apply to both the old ("concurrent") and the current ("direct") RDRAM, but direct RDRAM never made it into the graphics industry. Something happened, and the Rambus people aren't going to tell you what that was. Maybe it's the simple fact that Intel didn't try to force Rambus technology onto the graphics chip market. They probably would have, but they didn't have enough of a percentage of that market to make it stick. And one of the inevitable consequences of the ongoing Rambus debacle is that Intel will lose its lead in the chipset market, the very market that it used to force the memory makers into Rambus.

The History of Rambus and the Graphics Industry... With Links

Back in 1995, it seemed pretty clear that Rambus was a natural for the graphics industry. And they did get some important design wins, but there was always a little hesitation in the industry. These first two stories are about concurrent Rambus DRAM design wins in graphics:

MPU Developments Give Rise To Tough, Pivotal Choices Apr 24, 1995
Despite its compelling advantages of increased speed and greater bandwidth, both of which would make Rambus DRAMs particularly attractive to producers of high-performance 64-bit video subsystems like those to be found in high-end Windows 95 workstations, Rambus appears to be out of favor with many of the top-ranked PC suppliers, who balk at its high cost and the fact that it remains largely unproven.

The exception may be Silicon Graphics Inc., which is said to be working on a Rambus graphics controller for Windows 95 workstations based on 90-MHz or faster processors. The SGI controller is expected to be introduced later this year.

techweb.com

Cirrus unwraps Rambus graphics line Jun 26, 1995
Cirrus Logic Inc. today will unveil its architecture for high-end multimedia graphics controllers, showing an entirely new Rambus-based design that eventually will include graphics, video, 3-D and communications functions.
techweb.com

With that success, Rambus decided to modify their concurrent memory design so that it could also be used in main memories, the big Kahuna:

Rambus partners to build 64-Mbit RDRAMs Sep 18, 1995
The changes build on Rambus's experience in graphics hardware and game consoles, as well as arm the RDRAM for an assault on main memories.
...
Sources indicate that Cirrus Logic-already an announced Rambus licensee-is working toward a unified-memory Rambus core-logic and graphics chip set for use at the end of 1996. At least one systems vendor-reportedly Compaq Computer Corp.-is working in that direction independently, developing its own core logic.

techweb.com

Here's a direct reference to the graphics bandwidth problem. Written by someone who really believed that Rambus was the technology of choice for graphics cards:

Rambus is the choice for bandwidth Nov 13, 1995
With the increasing number of digitized images racing across higher-resolution display screens, the heat is on the graphics controller and frame buffer to deliver far more bits per second-and that means bandwidth.
techweb.com

But 3 years later, the high costs (particularly manufacturability) of Rambus had eliminated the company from the graphics market, and DDR was being talked about. RDRAM wasn't considered much of a contender anymore, and the reason was high costs:

PC100 compliance the key in '98 Feb 9, 1998
The graphics DRAM market will evolve from SDRAMs to 200-MHz DDR in 1999 to 300-MHz DDR in 2001. Interestingly enough, we see the main- and graphics-memory segments being divided, with distinct differentiation between the two. Graphics-memory applications will be x32 devices, for instance, 512-Kx32 or 1-Mx32 or 2-Mx32. However, main memory will remain at x4, x8, and x16.
techweb.com

Here, various companies, having talked to the graphics industry, determine that it is time to start putting out samples of x32 SGRAMs (which are almost identical to SDRAMs, other than the width). At the same time, RDRAM isn't mentioned as a contender in graphics by the memory companies, but Rambus still holds out hope that they will reconquer their natural market segment, graphics:

A Market Of Many Stripes Feb 9, 1998
Siemens expects in the third quarter to see first silicon on a 146-MHz, 16-Mbit (x32) synchronous graphics RAM (SGRAM) from a 0.24-micron process. The company has a double-data-rate (DDR) version on tap, as well as an SLDRAM.
...
Hitachi expects to sample a 256-Mbit SDRAM around mid- year. The company licenses the Direct RDRAM for mobile and lower-end PC applications and plans a 64-Mbit DDR SDRAM later this year and a 256-Mbit DDR SDRAM in 1999 for large memory systems, fault-tolerant systems and servers, workstations, minicomputers, and mainframes.
...
Next summer, NEC plans to sample a 16-Mbit (x32) SGRAM, and by the end of 1999 the company will follow with a second-generation DDR SGRAM built around a VCM core.
...
The first two generations are targeted at low-end graphics applications as well as electronic toys and games, whereas Direct RDRAM is being positioned as the consumer PC's main memory, supporting a new family of chips sets and processors that Intel is developing. Rambus is also eager to exploit the graphics, communications, consumer, and mobile-computer markets.

techweb.com

The one-pony-show hypesters at Rambus were still at it, claiming design wins that then evaporated away, but these were old wins, not new ones. Still, that didn't stop them:

Memory ICs Feb 9, 1998
Subodh Toprani, vice president of marketing at Rambus (Mountain View), said concurrent {i.e. Rambus} DRAMs have been designed into graphics subsystems of high-end systems offered by Compaq Computer, Dell, Gateway, Hewlett-Packard and NEC.
techweb.com

Now, DDR chips provide twice the bandwidth of SDRAM chips, but when a technology provides more bandwidth than is needed, it just doesn't get used. In mid-1998, x32 chips provided enough bandwidth for the graphics market; DDR just wasn't needed. Here is an example of a design loss by DDR. This is what happens to technologies that show up before they are needed:

High-end DRAM finds niche markets Jul 20, 1998
MoSys, for example, which specializes in high-end graphics frame buffers, abandoned earlier efforts to design a double-data-rate (DDR) SDRAM, citing a negligible speed improvement over its Multibanked SDRAM architecture.
techweb.com

Meanwhile, a reduction in the cost of pins was shaking the industry. All of a sudden, those x32 packages didn't seem so wide anymore. Pins were getting cheaper and closer together:

DRAM market changing? -- It's more complex Oct 12, 1998
Two years ago, some networking and printer OEMs indicated an interest in a 2-Mbit x 32 organization to allow them to take advantage of the better 64-Mbit cost structure. But DRAM suppliers responded by saying a x 32 would cost them a 30% to 40% price premium. So that idea quickly vanished.

Now, they're back revisiting the x 32. As it turns out, the graphics market shows a strong need for 2-Mbit x 32 SDRAM. When this trend takes on greater momentum, several other markets can use this particular specialty device.

techweb.com

Since DDR could double the bandwidth out of SDRAM, it was a natural extension for use when the graphics machines were going to need it. But it could be used by other markets as well, and IBM began pushing it:

Will Direct Rambus Rule the Roost? Dec 7, 1998
DDR possesses some technical and economic merits that make it a very attractive solution for next-generation, high-bandwidth systems, according to Lane Mason, director of graphics/memory product strategy at IBM Microelectronics, Burlington, Vt. "The people who are true believers in DDR on the systems side are very strong proponents, have it worked into their roadmaps, and are working with all the vendors to make sure that it is a price/performance success," Mason said.
techweb.com

As the newer high-speed graphics cards (like the Nvidia card) began their design cycles, it became obvious that DDR was going to get design wins. This began to leak into the trade press in late 1998, about a year before the first DDR cards hit the market. People with an ear to the thoughts of the industry began looking towards the next replacement for main memory, which traditionally follows the lead of the graphics memory market (when the graphics market uses a memory type that is compatible with main memory). The memory makers began producing DDR as a bonding option. This means that they can do all but the last few steps in making a memory chip before deciding whether to make it an SDR or DDR device. This also means that the memory makers can turn on the spigot for huge volume whenever they want to:

DDR picks up steam as next-gen DRAM choice Dec 7, 1998
Graphics board manufacturers are also likely to adopt the DDR memories, said Victor de Dios, a DRAM market analyst based in Newark, Calif., in part because of DDR's excellent latency and acceptable bandwidth. Many graphics manufacturers compete on price, de Dios said, and some of them will not spend on a Rambus license.

DDR DRAMs could achieve 12 to 17 percent of the total DRAM market by 2001, with as much as half of them going into graphics subsystems and the rest into servers, de Dios said. By late 1999, most higher-density SDRAMs will offer DDR capability as a bonding option, he said.

techweb.com

The industry began working on the next-generation DDR, even though the previous version wasn't yet in mass production. Industry guys can do this, not because they have time to burn, but because they have insight into where the technology is going. And they are a lot more likely to be right than the marketing flacks, who typically concentrate on moving whatever is the current product out the door:

Group proposes next-generation DDR standard Apr 12, 1999
Joe Marci, chairman of the JEDEC panel and technical manager of ArtX Inc., a Palo Alto, Calif., graphics-design company, said DDR-2 will allow a natural migration path from the DDR SDRAMs slated to enter the market later this year.
techweb.com

The debut of DDR is announced around 3 months in advance:

DDR SDRAM Will Make Debut In Graphics Arena May 31, 1999
Double-data-rate SDRAM will crack the PC market in graphics chipsets beginning this fall, a jump-start many memory-chip vendors believe will position DDR as a high-speed contender in other applications as well.
...
"Graphics-chipset vendors are always the first to jump on a new, faster DRAM,"

techweb.com

The industry starts getting ready to change the SDR SGRAM lines over into DDR SGRAM lines. Just the usual natural progression:

Hyundai Targets Graphics Jul 20, 1999
Hyundai and several other industry suppliers are projecting that DDR SDRAM will replace single-data-rate memory chips next year as the preferred choice for graphics frame buffers. ($14 each)
techweb.com

Double-data-rate SDRAMs offer 2.1-Gbit/s bandwidth Sep 20, 1999
Mark Ellsberry, vice president of marketing for Hyundai's Semiconductor Division, said DDR has become the de facto standard for high-end graphics applications where the very highest bandwidth is essential.
techweb.com

Again, the industry knows that graphics leads the way with new memory types (it did with Rambus too), and then the rest of the industry follows:

trying to predict how the segmented DRAM market will fare Oct 11, 1999
DDR technology will be used first in graphics chips and later as main memory in PC desktops and servers.
techweb.com

The first graphics production begins, and we all know about it now. And there are a lot of other graphics products with DDR coming out soon:

Hyundai Micro Ramps Up DDR SDRAM Production Nov 22, 1999
The first deliveries of the devices will be to Nvidia
techweb.com

This is what the industry is saying now, contrary to what Rambus and Intel are saying.

Analysts: DDR SDRAM to take 50% of memory mkt. Jan 17, 2000
The technology is already picking up speed as dedicated memory for graphics cards
techweb.com

-- Carl

P.S. Does anyone have the simple reason why RDRAM doesn't work well for the graphics industry?

richard surckla: "To my knowledge DDR has been targeted to the graphics processors. DDR seems to work good in grahic processors. For some reason graphic processors with RDRAM don't work well. I'm told it has something to do with the granularity." #reply-12038375