To: Sun Tzu who wrote (73166) | 5/17/2001 7:08:16 PM
From: Dave B
 
ST,

The lost time that I was referring to was the time that got spent developing DDR. Even though Rambus does not make the chips itself, my guess is that if they had lowered their royalty payments, there would have been less drive to develop a competing technology.

There wasn't any lost time. Some vendors were going to support DDR regardless, because they didn't want to pay any royalties at all. You can't build much of a business if you're giving it away. <G>

Their mistake was, IMO, to stop there and not garner support from others.

I agree -- Samsung seems to be the only other vendor strongly in their corner; maybe Toshiba. It was, unfortunately, a chicken-and-egg problem. Since RDRAM is more expensive to produce (test, etc.), many of the vendors took a wait-and-see attitude re: producing RDRAM. Samsung is really the only vendor that appears to have embraced it. If, by chance, RDRAM does become the next standard, Samsung is probably the only vendor that will make extraordinary profits off its early production. (I'm not sure if the same rule applies here, but when I was consulting for one of the hard drive manufacturers, the rule of thumb was that all the profit to be made on a new capacity was made during the first six months -- if you delivered your version of that capacity more than six months after it first appeared, you made no profit on it.)

It seems to me that a great product, that is, one which meets both the producers' and the consumers' needs from both an economic and a technological POV, does not need heavy-handed support from Intel.

In theory that would be great. But there will always be competition for standards, and someone is always going to think someone else is being "heavy-handed". Intel spent a year evaluating DDR, SLDRAM, and RDRAM before deciding to support RDRAM.

Why is it, for example, that Sun Micro did not choose to standardize on RDRAM?

As has been explained on this board by some of our resident experts, servers tend to lag behind desktop systems in adopting new memory standards; they pointed to servers still using EDO memory. However, as a counterexample, I suppose, the Compaq Alpha chip is going to use RDRAM. There was an article in the last couple of days pointing out the irony of Compaq having to hope that RDRAM production goes up so that it can get chips for its Alpha systems, while at the same time hoping that RDRAM doesn't win at the desktop level (I think that was the gist of the article -- I just skimmed it).

What about all the other non-PC products out there?

RDRAM is used in the PS2, as well as in networking equipment. It was announced years ago that it would be used in HDTVs (I believe it was a Compaq/Toshiba deal), but I doubt that any of the dozen or so HDTVs that have actually been purchased had RDRAM in them. The Rambus presentations I've seen at the annual meetings indicated that they expected about two-thirds of RDRAM production to go into PC-like products (PCs, workstations, servers) and the remaining third into other devices. Until (if) the price comes down further, there's no pressure to put RDRAM in devices that don't require high-bandwidth transfers.
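To put rough numbers on "high bandwidth" (a back-of-the-envelope sketch, not from the thread; the bus widths and transfer rates below are the commonly quoted specs for PC800 RDRAM and PC2100 DDR of that era):

    # Rough peak-bandwidth comparison (illustrative figures, not from the post).
    # PC800 RDRAM: 16-bit channel at 800 MT/s.
    rdram_peak = (16 / 8) * 800e6   # bytes/transfer * transfers/sec = 1.6e9 B/s

    # PC2100 DDR SDRAM: 64-bit module at 266 MT/s.
    ddr_peak = (64 / 8) * 266e6     # = ~2.1e9 B/s

    print(f"PC800 RDRAM: {rdram_peak / 1e9:.1f} GB/s per channel")
    print(f"PC2100 DDR:  {ddr_peak / 1e9:.1f} GB/s per module")

The point being that a narrow, very fast channel (RDRAM) and a wide, slower bus (DDR) land in the same ballpark, so a device that doesn't need that kind of throughput has little reason to pay the RDRAM premium.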

RDRAM is not in a better market position, not because Intel was slow to build a CPU for it, but because the memory makers did not want to build it unless they absolutely had to.

I agree, for the most part. There definitely was a lot of resistance to building RDRAM chips: it required new investment in test equipment, provided lower yields, et cetera. Only time will tell which MMs made the right decision.



To: Sun Tzu who wrote (73166) | 5/17/2001 7:58:39 PM
From: tinkershaw
 
The lost time that I was referring to was the time that got spent developing DDR. Even though Rambus does not make the chips itself, my guess is that if they had lowered their royalty payments, there would have been less drive to develop a competing technology.

DDR vs. RDRAM has little to do with royalty rates. Rather, it has to do with the amount of investment involved in ramping up for RDRAM versus ramping up for DDR, and with the steep learning curve. It is quite simply much more expensive to produce RDRAM: you need to invest in new equipment, and you have to learn new processes to do so.

Some DRAM makers were willing to make that investment, like Samsung. Others, like Micron, were much less willing to do so. Now companies like Micron know how far behind this has placed them on the learning and IP curve in comparison with Samsung, NEC, et al. They know that if RDRAM becomes the standard, they will be at a competitive disadvantage. Hence the huge DDR war. It had little to do with the 1.5% to 2.5% royalty, which is all that is being charged on RDRAM.
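For scale, a hypothetical worked example (only the 1.5%-2.5% royalty range comes from the post; the $5 chip price is an assumed figure for illustration):

    # What a 1.5%-2.5% royalty means per chip.
    chip_price = 5.00   # assumed DRAM selling price in USD, for illustration only
    for rate in (0.015, 0.025):
        print(f"{rate:.1%} royalty on a ${chip_price:.2f} chip: ${chip_price * rate:.3f}")

Even at the high end that's about twelve and a half cents per chip, small next to the cost of new test equipment and lower early yields.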

Tinker