Technology Stocks : Rambus (RMBS) - Eagle or Penguin


To: NightOwl who wrote (48577)8/1/2000 12:37:09 AM
From: Estephen  Read Replies (3) | Respond to of 93625
 
"But I see Sony's decision to go with DRDRAM in the face of DDR's capability and cost advantage as based on two factors:"

A major reason Sony didn't use DDR is that it's UNSTABLE.
That's why all the delays. Intel doesn't believe it's even feasible for the desktop. It's also interesting that Micron still lists DDR in the experimental stage ("development," they call it) on micron.com.

The performance claims and cost figures for DDR coming from the anti-Rambus crowd are so distorted as to be fictitious. We know how well the anti-Rambus press can distort the truth; they do the same when hyping DDR.
DDR is purely vaporware. I have predicted several times that it will never reach the desktop. If it does make it, it will prove to be a short-lived, dismal failure.



To: NightOwl who wrote (48577)8/1/2000 1:04:21 AM
From: Estephen  Respond to of 93625
 
I love DDR. Don't get me wrong, it pays a higher royalty.



To: NightOwl who wrote (48577)8/1/2000 3:05:52 AM
From: jim kelley  Read Replies (1) | Respond to of 93625
 
It is too early to tell what the future of DDR as main memory will be, since there are no products available yet. Server memory accounts for only 8 to 10% of DRAM consumption, so even if DDR captures 100% of the server market it will remain a niche product. DDR must capture the desktop market to become dominant. RDRAM can clearly be used effectively in low-end servers, and no doubt will be.
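The niche-product arithmetic above can be sketched in a couple of lines (a hedged illustration; the 8-10% server share is the poster's estimate, and the 100% capture rate is a hypothetical best case for DDR):

```python
# Even full capture of the server segment caps DDR's share of total
# DRAM at the server segment's own size.
server_share = 0.10        # servers' share of total DRAM consumption (upper bound)
ddr_server_capture = 1.0   # best case: DDR wins every server socket

ddr_total_share = server_share * ddr_server_capture
print(f"Max DDR share of DRAM without the desktop: {ddr_total_share:.0%}")
# → Max DDR share of DRAM without the desktop: 10%
```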

DDR does not scale well because it is essentially a clocking scheme: its specification does not cover control, address, and data termination, which are left to the implementer. As the clock rate increases, DDR-II falls prey to the same bus-termination issues already solved by RDRAM. Moreover, the DDR bus has four times as many signal traces, which compounds the termination problem. RDRAM is the correct approach to scaling in clock frequency. Servers can afford the cost of a few extra motherboard layers, but desktop systems cannot; the 820 is a four-layer board.
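A rough sketch of where the "four times as many signal traces" figure comes from, assuming a conventional 64-bit DDR data bus against a 16-bit Direct RDRAM channel (data lines only; address and control traces are ignored, so this is an illustration, not a full pin count):

```python
# Data-trace comparison only; address/control lines are omitted.
ddr_data_lines = 64     # conventional 64-bit-wide DDR module interface
rdram_data_lines = 16   # a Direct RDRAM channel is 16 data bits wide

ratio = ddr_data_lines // rdram_data_lines
print(f"DDR routes {ratio}x the data traces of one RDRAM channel")
# → DDR routes 4x the data traces of one RDRAM channel
```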

Since both DDR and RDRAM use the same DRAM cells, it is the architecture of the chip that determines the differences in bandwidth and latency. RDRAM provides a high-speed, terminated packet bus. DDR offers a raw parallel bus that does not scale well as the clock rate increases. DDR SDRAM has been around since 1996 and has not found application in main memory despite many attempts over the years; it has found application as a graphics memory in the form of SGRAM. Now we are being told that DDR systems are around the corner (Q4), but it is unlikely that they will emerge in volume before Q1 '01 because of the risks of launching such products during the peak consumer buying months.

The P4 will effectively use the bandwidth provided by RDRAM with its new 400 MHz FSB. While there are significant concurrency benefits to using RDRAM in P3 systems with 100 and 133 MHz FSBs, it has been known for more than a year that the 400 MHz FSB is the ideal match for Rambus. Most of the criticism of Rambus has centered on cost and realized performance in P3 systems. Both of these issues will dissolve with the launch of the P4. Even the critics know this and are loath to admit it.
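The "ideal match" claim checks out arithmetically: the P4's quad-pumped 100 MHz bus transfers 400 MT/s over 64 bits, and a dual-channel PC800 RDRAM configuration supplies exactly the same rate (a sketch; the two-channel configuration and PC800 speed grade are assumptions about the shipping platform):

```python
# P4 front-side bus: 100 MHz clock, quad-pumped, 64 bits (8 bytes) wide.
fsb_bandwidth = 400e6 * 8            # 400 MT/s * 8 B = 3.2 GB/s

# PC800 Direct RDRAM: 400 MHz clock, double data rate, 16 bits (2 bytes) wide.
rdram_channel = 800e6 * 2            # 800 MT/s * 2 B = 1.6 GB/s per channel
dual_channel = 2 * rdram_channel     # assumed two-channel configuration

print(f"FSB: {fsb_bandwidth/1e9:.1f} GB/s, "
      f"dual PC800: {dual_channel/1e9:.1f} GB/s")
# → FSB: 3.2 GB/s, dual PC800: 3.2 GB/s
```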

The die area of second-generation RDRAM devices will be within 5% of SDRAM's (per Samsung). The rest of the cost equation comes down to packaging and economies of scale. Thus it is not really possible to assert that RDRAM will be more expensive than ordinary SDRAM in 2001 and 2002; that depends largely on production volume and the specific process used to make the die.

The main reason Intel announced an SDRAM version of Willamette for the second half of '01 is to cut off AMD and VIA, as well as the other chipset companies. As you recall, VIA has been thumbing its nose and saying it will build a DDR chipset for the P4 without a license. AMD has been touting DDR, and Intel is intent on denying them any distinction in the marketplace.

Finally, it is becoming increasingly clear with each new license agreement that Rambus will be collecting royalties from VIA, AMD, and all of the other companies. There may be a few lawsuits and ITC cases arising out of this, but Rambus clearly has the patents, and it has a fiduciary duty to collect on them for the benefit of its shareholders.

People are comparing DDR-II to RDRAM instead of comparing DDR-II to QRSL RDRAM. The comparisons have been picked to serve a political agenda. And still there are no DDR systems on the market, much less DDR-II systems. Sure, there are graphics cards that use DDR SGRAM, but so what?

I expect that all the Japanese companies will be licensed by December, and all the Korean companies in 2001. The Taiwanese companies will take longer, but they will come around. Infineon will also, no doubt, take a license.

Companies that do not license will be at a competitive disadvantage in winning OEM contracts, and OEMs that use unlicensed components will face having their shipments interdicted.

Rambus is here to stay, and their technology is getting more interesting with each new product announcement.

JMO


