Technology Stocks : AMD/INTC/RMBS et ALL

To: Charles R who wrote (101) · 9/22/1999 5:22:00 PM
From: grok
 
Charles,

I think that Rambus' advantages come out in the granularity/pin_count area. Why else would you take 16 bytes and funnel them over a 2-byte bus? If you knew you were designing for a 64-bit data bus, I can't see any reason to use the Rambus approach, can you?

So in attempting to provide one dram interface for the entire market, they had to adopt a narrow bus and then try to establish it everywhere. Applications without granularity/pin_count concerns thus get penalized with the serialization delay, which directly hurts latency.
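As a rough illustration of that funneling (a toy sketch in Python; the bus widths and the 16-byte core fetch are assumed figures, not any particular part's specs), count the bus beats needed to drain one core fetch:

    # Toy model of serialization: a dram core fetches 16 bytes at once,
    # but a narrow channel must clock them out over several bus beats.
    def beats(payload_bytes, bus_width_bytes):
        # bus cycles needed to move one core fetch across the bus
        return -(-payload_bytes // bus_width_bytes)  # ceiling division

    CORE_FETCH = 16  # bytes the dram core produces per access (assumed)

    for width in (2, 8, 16):
        print(f"{width:2d}-byte bus: {beats(CORE_FETCH, width)} beat(s)")
    # 2-byte bus: 8 beats; 8-byte: 2; 16-byte: 1. At the same per-pin
    # signaling rate, the narrow channel delivers the last byte 8x later.

A faster channel clock can buy back the bandwidth, but the serialization step itself is what shows up as latency on every access.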

But let's look at granularity/pin_count. Twelve years ago it looked like granularity would soon be a big problem in many places. At that time drams were mostly 1 bit wide and just moving to 4 bits wide. Dram density was on a straight line towards 1 GBit as the prime dram in 2000. Main memories were small! At that time the typical Sun workstation shipped with 8 MB, and the Sun OS group had a project underway to get minimum memory down to 4 MB! (This was two dram shortages ago.) Drams had just gone to 1 MBit, so a workstation would use them in groups of sixteen 256Kx4 chips, and memory would be variable in chunks of 2 MB.

By extrapolating the trends, it looked like in 1995 you'd use sixteen 4Mx4 chips and minimum memory would be 32 MBytes, which would be too large, so the granularity problem would already have arrived. This was the environment in which Rambus was conceived.
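To spell out the chunk arithmetic (a minimal sketch; the figures just restate the configurations above):

    # Smallest memory increment = one full bank of chips across the bus.
    def min_increment_mb(bus_bits, chip_width_bits, chip_mbits):
        chips = bus_bits // chip_width_bits  # chips needed to fill the bus
        return chips * chip_mbits / 8        # MBytes per bank

    # 1987: 64-bit bus, 1 MBit drams organized 256K x 4
    print(min_increment_mb(64, 4, 1))    # 16 chips -> 2.0 MB chunks

    # 1995 extrapolation: 16 MBit drams organized 4M x 4
    print(min_increment_mb(64, 4, 16))   # 16 chips -> 32.0 MB chunks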

However, a funny thing happened. Actually three funny things happened:

1. Drams got wider, even though twelve years ago most people said that wouldn't happen.
2. The rate of dram density growth slowed slightly, so that in 2000 the mainstream dram will be 128 MBit or 256 MBit, not 1 GBit.
3. But the biggest change of all is that main memory has exploded. I have 256 MByte in my den.

All of these together have pushed the day when granularity hits the computer business off to the dim future. When will it happen? Say in 2003 the mainstream dram is 1 GBit. With x16 packages, a 64-bit data bus would have a 512 MByte minimum memory. That sounds like a lot, but it may not seem like a lot in 2003. Also, what if drams go to x32 by then? Then you've got two 1 GBit sdrams with 256 MByte minimum memory, and I'll bet you that is just fine even for laptops.
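Same arithmetic for that projection (a sketch, assuming 1 GBit parts on a 64-bit bus):

    # 2003 projection: 1 GBit parts (1024 MBit) filling a 64-bit bus.
    GBIT = 1024  # MBits per chip
    for width_bits in (16, 32):
        chips = 64 // width_bits               # chips to fill the bus
        print(chips * GBIT / 8, "MB minimum")  # x16 -> 512.0, x32 -> 256.0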

Now for computers smaller than PCs (or those not using Microsoft software), granularity has already hit. So we see the PS2 with two Rambus channels and one Rdram on each channel. That seems like a good solution except for cost and power. It seems like the actual cost difference between Rdram and Sdram is maybe 40%. Higher power is not so nice in consumer products either, and when a game is underway I wonder if any of the Rambus power-down modes will be usable. I expect that games will continue to be the best place for Rdrams, but they'll need more volume than that or the cost difference will grow even wider.

Anyway, gotta go now.