Technology Stocks : Rambus (RMBS) - Eagle or Penguin


To: tinkershaw who wrote (75130), 6/29/2001 12:45:18 AM
From: Dan3
 
Re: the price disparity is small enough between an RDRAM and DDR system

But RDRAM wasn't really supposed to be faster (and it isn't); it was designed to be cheaper.

RDRAM was conceived under the premise that pins were very expensive and silicon was very cheap and getting cheaper. Silicon, as expected, is indeed very cheap and getting cheaper fast, but pin costs have fallen just as fast, which was not expected.

RDRAM isn't fast memory at all; it's actually slower than PC133 SDRAM. What it does is compress a very wide bus onto relatively few pins. Almost all current Rambus systems use two RDRAM channels which are, on the chip, 128 bits wide, resulting in a 256-bit path to slow memory. In fact, it acts more like two interleaved 256-bit buses. The problem for RDRAM is that squeezing 8 bits from each read onto a single pin takes time: the bits must be collected, parity generated, sent, decoded, and parity checked at the other end. This adds a delay to each read, and the delay becomes more and more of a problem as chips get faster, because a delay of a given length wastes more and more CPU cycles.
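
A back-of-the-envelope sketch (Python, with a purely illustrative delay figure, not a measured RDRAM number) of why a fixed delay hurts more as clock speeds rise:

```python
# Rough sketch: a fixed memory-access delay, measured in nanoseconds,
# costs more CPU cycles as clock speeds rise. The 40 ns figure below
# is purely illustrative, not a measured RDRAM serialization penalty.

SERIALIZATION_DELAY_NS = 40  # assumed extra delay per read (illustrative)

for cpu_mhz in (500, 1000, 1500, 2000):
    cycle_ns = 1000.0 / cpu_mhz          # length of one CPU cycle in ns
    wasted_cycles = SERIALIZATION_DELAY_NS / cycle_ns
    print(f"{cpu_mhz:>5} MHz CPU: {wasted_cycles:6.0f} cycles lost per read")
```

The delay in nanoseconds stays constant; only its cost in CPU cycles grows.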

DDR and SDRAM are just plain old memory, but fast memory, connected directly to the memory controller. Each time cells are read, the data goes straight to the CPU, so a 64-bit read requires 64 data lines and their associated pins. But the CPU sees the results of that 64-bit read right away, without waiting for a serialize/deserialize step. A dual-channel DDR motherboard needs 128 data paths to the memory controller, which sounds much more expensive than the Rambus approach, but pins and traces on boards have gotten much cheaper, and this has come as something of a surprise to Intel.
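
To put rough numbers on the pins-versus-bandwidth trade, here's a small Python sketch using the commonly quoted bus widths and transfer rates for each technology (treat them as nominal peak figures):

```python
# Peak bandwidth = data-bus width (bytes) x transfers per second.
# Widths and rates are the commonly quoted nominal ones for each technology.

techs = {
    # name:             (data pins per channel, transfers/sec)
    "PC133 SDRAM":      (64, 133e6),   # 64-bit bus, 133 MHz, 1 transfer/clock
    "DDR266 (PC2100)":  (64, 266e6),   # 64-bit bus, 133 MHz, 2 transfers/clock
    "PC800 RDRAM":      (16, 800e6),   # 16-bit channel, 400 MHz, 2 transfers/clock
}

for name, (pins, rate) in techs.items():
    gb_per_s = pins / 8 * rate / 1e9
    print(f"{name:18} {pins:3d} data pins -> {gb_per_s:4.2f} GB/s peak")
```

RDRAM delivers the most peak bandwidth per pin, which is exactly the trade it was designed to win; the question is what that costs in latency.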

It would really be more accurate to say that the DDR / SDRAM approach, not RDRAM, is the high-cost / high-performance memory connection. But the performance of DDR / SDRAM is good enough that cheap versions of it can be used where only high-end (dual-channel) RDRAM is adequate. Single-channel 64-bit DDR is outperforming dual-channel 256-bit RDRAM in many applications.
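
A toy model can show how that happens: once per-access latency is counted, a lower-latency part can beat one with higher peak bandwidth. The latencies below are invented round numbers for illustration, not measurements of any particular chipset:

```python
# Toy model of effective throughput for latency-bound random reads:
# each access pays a full load latency, then bursts one 64-byte cache line.
# Latency values are round illustrative numbers, not chipset measurements.

LINE_BYTES = 64

systems = {
    # name:                 (peak GB/s, assumed load latency in ns)
    "DDR266 single ch.":    (2.1, 50),
    "PC800 RDRAM dual ch.": (3.2, 70),
}

for name, (peak_gbs, latency_ns) in systems.items():
    burst_ns = LINE_BYTES / (peak_gbs * 1e9) * 1e9  # time to move one line
    total_ns = latency_ns + burst_ns
    eff_gbs = LINE_BYTES / total_ns                 # bytes per ns == GB/s
    print(f"{name:22} peak {peak_gbs:.1f} GB/s -> {eff_gbs:.2f} GB/s effective")
```

Under these assumptions the lower-latency DDR system moves more data per second on scattered reads despite the smaller peak number.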

Regards,

Dan



To: tinkershaw who wrote (75130), 6/29/2001 1:23:17 AM
From: dumbmoney
 
I know I am thoroughly over-educated. Law degree as well as an MBA degree from Duke (freshly minted) that concentrated in marketing and strategy.

If it helps, I do not think you are over-educated.



To: tinkershaw who wrote (75130), 6/29/2001 2:52:23 AM
From: Bilow
 
Hi tinkershaw; So it all gets back to the question of RDRAM performance...

Answer this one question: "Why didn't anyone besides Intel decide to use RDRAM?" In other words, why didn't VIA, AMD, SiS, Serverworks, ALi, IBM, Nvidia, ATI, PMCS, Apple, etc., use RDRAM? Why did Intel decide to use DDR?

The problem for you here is that in order to understand the performance features of memory systems, you pretty much have to have a job designing them.

That is, it is not possible for a lawyer to be sure which is the better solution, any more than it is possible for me to be sure which way of approaching a jury is more likely to result in a favorable verdict. I can listen to the arguments from the "experts", but until I am actually in the business, I really won't know, and I can't even figure out how expert (or truthful) the experts are. All I can do is poke and hope.

In other words, by making an investment based on your understanding of engineering, you've basically screwed up. There's a reason you give legal advice and I design memory systems, and it isn't because you're a better memory designer than I am. But don't give up on RMBS yet; there is a chance that Rambus will win the legal tangle, as far as I know. But then again, I'm the engineer, LOL!!!

I know that sometimes people think engineering is a trivial occupation where you simply hook tab A into slot A, but the facts couldn't be farther from the truth. Rambus did real well at selling you stock, but they didn't do too well at selling their technology to the industry, except briefly to Intel, Nintendo and Sony. Nintendo rejected RDRAM for their next design, and Intel is busy working on DDR. Sony's PS/3 is supposed to have embedded DRAM, and if this holds, and I'm sure it will, Rambus will be zero for three at keeping clients. And that's only their record at clients that actually used RDRAM. Their record among companies that had to can their RDRAM projects is worse.

The reason Rambus was able to sell you stock is that they based the sale largely on engineering arguments that seemed plausible, even obvious, but were misleading for reasons you weren't sophisticated enough to understand. Why do you think that when lawyers argue over patents they bring in "experts"? If lawyers were so smart, they'd be experts on everything. The problem with Rambus is that those same engineering arguments you swallowed hook, line, and sinker failed miserably in industry, except at the three victims listed above.

Can I show you why the Rambus solution is inferior? No. Hey, I've been on this thread for 2 years, and I damn well know that arguing with you over engineering details is useless. You have nowhere near the background to understand the issues. In fact, most engineers don't have the background. Probably zero scientists with PhDs in physics or engineering have the background, not even the "rocket scientists", because they're not familiar with the commercial environment. (Look at PTNewell's claims that DDR266 would never work.)

In order to figure this little conundrum out, you're going to have to spend a half decade or so designing memory systems for mass manufacture in a commercial environment yourself, LOL, and you're probably making too much money to contemplate a change of career. Not only that, but no one would hire you unless you already had experience doing some sort of design for manufacture. Newell, for instance, undoubtedly has experience making prototypes, but that doesn't cut it in the commercial world. In order to gain the experience needed to judge the merits of alternative commercial memory system designs, you're going to have to work your way up slowly through the industry and make the same mistakes that all of us in that industry have already made.

They don't teach these skills in college. You can't learn practical digital design, much less commercial memory system design, by going to MIT for 4 years. Guys with EE degrees have to be retrained once they reach industry, because industry requires different results than what it takes to pass classes, or even get good grades, in college. Industry requires designs that are immune to problems despite being produced in vast volumes by sloppy manufacturing techniques with almost-defective components. Industry requires designs that get past FCC tests and pass signal integrity checks. Industry requires worst-case design. Industry requires designs that meet all these requirements with every penny pinched, but not a single penny too far. College requires none of this.
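
For a flavor of what "worst case design" means, here is a minimal sketch of the kind of setup-timing margin check a memory interface has to pass with every parameter at its worst corner simultaneously; all figures are invented round numbers:

```python
# Minimal worst-case setup-time check for a synchronous data line.
# All numbers are invented round figures for illustration only.
# The design must pass with every parameter at its worst corner at once,
# not with typical values.

CLOCK_PERIOD_NS = 7.5         # e.g. a 133 MHz bus

worst = {
    "clock_to_out_max": 4.2,  # driver's slowest clock-to-output, ns
    "trace_delay_max":  1.1,  # longest board trace, ns
    "setup_time":       1.5,  # receiver's required setup, ns
    "clock_skew_max":   0.5,  # worst clock skew between chips, ns
}

margin = CLOCK_PERIOD_NS - sum(worst.values())
print(f"worst-case setup margin: {margin:.2f} ns")
# A positive margin at the worst corner is what lets a design survive
# high-volume manufacturing variation; typical-case margin is not enough.
```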

These are all subjects of which you know zero. Until you've had to show up on the manufacturing floor, responsible for fixing the disaster of the day (say, a 2% defect rate in finished-product test), you just don't know what this industry is like. Look around; you'll find plenty of people on the net claiming expertise in all of the above who will tell you that Rambus is the only or best solution. If you want to believe them, there is nothing I can say that will convince you otherwise. The only thing I can say is that they're obviously not working as design engineers using RDRAM.

At least they're not designing RDRAM memory systems at AMD,

Or at VIA,

Or at ALi,

Or at SiS,

Or at IBM,

Or at PixelFusion, (at least not anymore)

Or at Sun, (at least not anymore)

Or at Nvidia,

Or at PMC-Sierra,

Or at Serverworks,

Or at Nintendo (at least not anymore),

Or at any other major company other than possibly Intel and Sony.

I could go on. But do understand that carrying on a discussion of the technical merits of DDR and RDRAM with you is a waste of my time. This is too subtle an issue. If it were obvious, you wouldn't have already swallowed the bait; you're not stupid, you're just unfamiliar with this neighborhood. You're a tourist; you don't even speak the language. How are you possibly going to understand the "legal system" of this strange land you're visiting? Do you really want to invest based on your understanding of this place?

Rambus was able to convince the management at Intel that they had the solution, but that was because the management at Intel, just like typical management everywhere, had an imperfect understanding of the issues. Intel management wasn't designing the chips. They were managers. Maybe one or two of them used to design memory systems, but I doubt it. If they did have such experience, it was in the past, and this industry moves fast enough that old knowledge gets devalued quickly. Intel forced Rambus onto their engineers, and their engineers have been squealing about it ever since. (See #reply-15761962 for a link to a trade press article, "Management and engineers do battle over Rambus DRAM", describing what happened at Intel.) Not all their engineers, mind you, but not all their engineers are too bright. People, even engineers, obey their managers. If you threaten to fire them, they obey, especially if they're sitting on a pile of stock options too fat to abandon.

If management wants something done in a stupid way, all they have to do is go around asking if anybody thinks it's a good idea. Eventually an engineer stupid enough to agree with them will step up to the plate and say he can get it to work in 9 months. Management is most successful when it doesn't overrule the objections of its engineers, but if you get a degree at a fancy enough college, it's pretty easy to convince yourself that since you're smarter, your opinion on any given subject is likely to be the correct one. In general this is true, but not when you stick your nose into a complex field against the true experts.

[Funny, the above paragraph also applies to how Rambus ignored their own legal counsel when they were in JEDEC (counsel who said to get out immediately or lose their SDRAM patents), and then had to shop for a law firm stupid enough to go to court in Virginia for them, LOL!!! If you knock on enough doors, eventually you find a law firm incompetent enough to agree that you're in the right, and voila, a fraud verdict. And as long as I'm on this line of reasoning, Rambus' whole argument that the industry must have stolen SDRAM because only Rambus was smart enough to invent it smacks of the same incompetent elitism.]

You can read about the Intel case in the trade press article linked above, but you guys don't believe those articles, do you? Instead, you believe there is a secret cabal of Rambus haters out to get you to sell your RMBS cheap (LOL!!!), so you ignore things like that. Out in industry, we're finding this whole situation hilarious. Sony's and Nintendo's gaffes were undoubtedly similar to the Intel case, but the details haven't been leaked all over the industry.

But answer this one question: "Why didn't anyone besides Intel decide to use RDRAM?" In other words, why didn't VIA, AMD, SiS, Serverworks, ALi, IBM, Nvidia, ATI, PMCS, Apple, etc., use RDRAM? Why did Intel decide to use DDR? Why did Nintendo dump RDRAM for their new design?

I really want to know. Why isn't there an Apple computer that uses RDRAM? Why are there no Nvidia RDRAM chipsets announced? Where are they? Is it just that the engineering companies of the world (except Intel) really are that stupid?

-- Carl