Hi Web Myst. There's a lot more going on on my computer than the RMBS thread, but this is entertainment.
Think about it. There are, what, 200 million people in the US? Doesn't it make sense that at least one of them would have posting to the Rambus thread as a hobby? (LOL!!!)
Most people are experts in something; my area is the design of memory systems for high-volume commercial manufacture. In other words, this is my territory.
I know that the long-time Rambus longs see me as an interloper, but try to look at it from a memory designer's perspective. If you listen to the BS from Rambus, it is natural to believe that the only competent memory design engineers in the world work for Rambus.
Let's recap the Rambus version of the story:
(1) A couple of academic geniuses got together and, over the course of a few days' worth of conversations, discovered a way to revolutionize the way DRAM chips are connected together.
(2) Industry, being stupid, fails to see the brilliance of their plans and largely ignores them. But they do get design wins at a few places, with most volume in the Nintendo game console.
(3) They then go to Intel, and Intel recognizes the brilliance of their ideas, and they sign an agreement. The agreement is that Intel will use only RDRAM in their next generation (i.e. faster than SDRAM) desktop memory designs, and in return, Rambus will give Intel a pile of stock warrants, and won't charge Intel any royalties on the controllers.
(4) The memory makers, being not particularly competent, effectively sabotage Intel's conversion to RDRAM by not producing enough chips. When it becomes clear that RDRAM will face severe competition from DDR for next generation memory status, Rambus presses the industry to pay royalties on DDR and SDRAM. After all, it was Rambus that invented the ideas in SDRAM and DDR.
Okay? That's the Rambus side of the story. Of course it is a completely twisted version of reality.
But more importantly, in order for the Rambus version of "reality" to be true, you have to accept that the academics who started Rambus really did have better design skills than the industry did. You have to believe that Rambus really did invent SDRAM and DDR. You have to believe that RDRAM really is superior technology, and that the reason RDRAM is superior to the JEDEC standards is, well, because those Rambus geniuses are just smarter.
Now try to look at it from the other side:
A couple of impractical, ivory tower intellectual theorists with great-looking resumes, but with no practical experience in the DRAM industry, come up with an overly expensive idea to "improve" on memory technology.
The industry tells them that it's not a good idea, but the technology sounds pretty good to a few companies (especially to management, who tend to be impressed by things like well-written resumes). The technology is in competition with other niche-type graphics memories, so the higher costs are considered "reasonable", and a few companies design the stuff into products. Primarily, Nintendo tries out the first-generation RDRAM ("Base") and ships it in their game console. (Note that Nintendo dumped RDRAM for their next game console, to be out in a year or so.)
Industry develops SDRAM by the trivial expedient of adding registers to the inputs and outputs of regular DRAM chips. Industry adds features to SDRAM similar to those used in RDRAM, but these features were ancient EE tricks, certainly nothing that Rambus could claim as an invention (except the multiplexed bus, which industry didn't use). Most of the work in designing SDRAM is in arguing over details between the memory makers.
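If "adding registers to the inputs and outputs" sounds abstract, here's a toy software model of the idea (purely my own illustration, nobody's actual silicon): a classic asynchronous DRAM responds whenever the address pins change, while a synchronous DRAM latches its inputs and outputs on a clock edge, which is what lets you pipeline it and run the bus faster.

class AsyncDram:
    # Classic asynchronous DRAM (grossly simplified): the output just follows the address pins.
    def __init__(self, size):
        self.mem = [0] * size
    def read(self, addr):
        return self.mem[addr]

class SyncDram:
    # "SDRAM-style" interface (grossly simplified): registers on the inputs and outputs,
    # everything happens on a clock edge, so read data comes out one cycle later.
    def __init__(self, size):
        self.mem = [0] * size
        self.addr_reg = 0    # input register on the address/command pins
        self.data_reg = 0    # output register on the data pins
    def clock(self, addr):
        self.data_reg = self.mem[self.addr_reg]   # drive data for last cycle's address
        self.addr_reg = addr                      # capture this cycle's address
        return self.data_reg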
Then the academics talk to Intel. Intel tells them that it sees how RDRAM will save pins (Intel management doesn't realize at the time that the price of pins is about to drop through the floor, so they still think saving pins is important), but that the only problems with the technology are these: Intel wants to be able to put at least 4 RIMMs on each RSL memory channel, and each channel has to provide 0.8 GB/sec of bandwidth.
Rambus says, "sure, no problemo", and goes off to design the thing. Rambus comes up with a design that, while theoretically possible, doesn't provide any margin for error, deterioration, etc., in the components. They quickly give up on 4 RIMMs per RSL channel and knock it down to only 3 RIMMs per channel. In order to get as much bandwidth as (what will become) PC100, they have to double the bus width to 16/18 bits, and they have to increase the frequency to 800 Mbps/pin.
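To put rough numbers on that (my own back-of-the-envelope figures; the 64-bit PC100 DIMM width and the roughly 8/9-bit, ~600 Mbps/pin Base RDRAM channel are common reference points, not numbers quoted above):

def peak_bw_gb_per_sec(bus_bits, mbits_per_pin):
    # Peak bandwidth = data bus width (in bytes) x per-pin data rate.
    return (bus_bits / 8.0) * mbits_per_pin / 1000.0

print(peak_bw_gb_per_sec(64, 100))   # PC100 SDRAM DIMM: 64 bits x 100 Mbps/pin   = 0.8 GB/s
print(peak_bw_gb_per_sec(8, 600))    # Base RDRAM:        8 bits x ~600 Mbps/pin  = 0.6 GB/s
print(peak_bw_gb_per_sec(16, 800))   # Direct RDRAM:     16 bits x 800 Mbps/pin   = 1.6 GB/s

The point: a narrow channel can only keep up with a wide, slow DIMM by running each pin very fast, and 800 Mbps/pin on 1999-era motherboard technology is exactly where the trouble starts.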
Now, even with only three RIMMs per channel, at 800 Mbps/pin each wire will carry multiple data bits at the same time. In addition, to save power, they decide to terminate the RSL channel on only one side. (LOL!!!) This forces READ data to go down the channel twice, the second time rebounding from the controller end of the bus. (Running the signals down the channel twice means that the nonlinearities and errors in the system get brought in twice.)
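To get a feel for "multiple bits on the wire at once" (the trace length and propagation speed below are my own rough assumptions, not numbers from the RSL spec):

bit_time_ns    = 1000.0 / 800      # at 800 Mbps/pin, one bit lasts 1.25 ns
prop_cm_per_ns = 15.0              # signals on FR4 travel roughly 15 cm/ns (about half of c)
channel_cm     = 30.0              # assumed electrical length once you snake through 3 RIMMs

flight_ns      = channel_cm / prop_cm_per_ns    # ~2 ns of one-way flight time
bits_in_flight = flight_ns / bit_time_ns        # ~1.6 bits on the wire at the same instant

print(bits_in_flight)
# With more than one bit in flight, the channel is a real transmission line: a
# reflection from a poorly terminated end lands on top of LATER bits instead of
# dying out before the next bit shows up. Make read data traverse the channel
# twice and you take that hit twice.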
Industry looks at this and says that it isn't a good idea. People like me laughed. After I saw the full specs, I posted that this was going to be the worst manufacturing disaster in years, and that it would be buggy as hell. But that was later. Now, in order to get this to work even theoretically, Rambus has to leave very little engineering margin in the memory designs, and the memory makers are unable to produce parts.
So the rollout gets delayed. Then, when the memory makers finally get chips made, it's found that they're buggy when installed into Camino MBs with 3 RIMMs. LOL!!! This was not a surprise to industry; why do you think industry didn't make the mistake of ramping the sh:t into production at the rate that Intel asked for it? So Camino is recalled just before it's supposed to ship in big volume, and Intel has to shred the motherboards. They decrease the bus length by 1/3 (by restricting the design to only 2 RIMMs), and actually get product to ship.
Now Intel is left with a memory design that isn't nearly as robust as the one Rambus had agreed to deliver (it doesn't even take 3 RIMMs), but what does Rambus do? They blame Intel for not meeting their spec! (Great way to treat a customer, LOL!!!) By the way, this restriction of RDRAM channels to only 2 RIMMs is still present. The 840 gets to 4 RIMMs only because it has two RSL channels, with just two RIMM slots per channel.
But "value" motherboards must have only one memory channel. (That is, you can't go around paying for double the termination and clock drivers etc., as well as requiring the customer to install RIMMs two at a time for a cheap motherboard.) The single channel RDRAM solution from Intel (820 Camino) ends up with a bad reputation for being a waste of money and Intel knows that this is a problem that is incurable.
In the meantime, the noise the memory makers had been making, to the effect that "RDRAM will only get to be cheaper than SDRAM if RDRAM ships in a lot higher volume than SDRAM," starts to resonate among memory designers. They think to themselves (and tell their buddies), "If RDRAM can't be put into a decent single-channel motherboard, there is no way RDRAM will hit the high volumes it needs in order to get cheaper than SDRAM. On top of that, all my memory design buddies are saying the same thing, and if they all say they don't think RDRAM is going to get cheap, and that they are therefore not going to use RDRAM for their new designs, well, then I'd better not use it either, because it will always be 50% more expensive than SDRAM."
So in late 1999, memory designers abandon RDRAM en masse and migrate to either DDR or SDRAM. In fact, industry support (in terms of prototyping equipment, test equipment, etc.) has always been very strong for SDRAM and moderately strong for DDR, but comparatively weak (i.e., rare and expensive) for RDRAM. Consequently, a lot of memory designers had avoided RDRAM already. (Go look up the support for RDRAM at Altera or Xilinx, the primary FPGA manufacturers, to see this in action. There is, and has been, zero support for RDRAM and full support for DDR. Memory designers who prototyped designs using FPGAs were forced to use DDR or SDRAM.)
So with RDRAM flunking in the mainstream "value" PC market, it became obvious that RDRAM was never going to get out of niche status, and since its features were all available from DDR (except the lower pin count, but pins had gotten very cheap), by late 1999 it was obvious that RDRAM was dead, dead, dead.
Of course it was also obvious to Rambus management that RDRAM was dead, dead, dead, so in early 2000 they sprang "plan B" on the industry: collect royalties on SDRAM and DDR. That resulted in a big price spike in RMBS, but we all know where that went. Note that RMBS management did manage to sell quite a few million dollars' worth of their stock before the inevitable crash.
So is the story over?
So now, here we are in mid-2001, and Intel is just beginning to transition out of RDRAM. Why am I still here? It should be clear: I'm here to watch the idiots who ran Rambus finally get theirs.
Whenever a government executes someone, it never has trouble signing up an audience. I'm here to watch the execution of Rambus. Unlike a human execution, this is a slow business. Things do not move quickly in the DRAM industry; time is measured in quarters. So my seat is going to be warm for quite some time.
I'm hoping for a good solid shareholder lawsuit, and I expect to see it before the end of the year. It amazes me that none have been filed yet. I'm looking forward to the legal bankruptcy of Rambus; that won't happen for years, but I think it's likely. I'm also looking forward to Samsung announcing (or admitting) that they're cutting back on RDRAM production; that should come sometime in the first 6 months of next year.
I've been on this thread for about 2 years, and I have about 2 more years to go.
-- Carl