Technology Stocks : Rambus (RMBS) - Eagle or Penguin


To: Bilow who wrote (79524), 11/2/2001 5:42:52 AM
From: NightOwl
 
You were the one.

Oh. Well, I may be that one, but I am certainly not The One that started Red Herring spinning, because I'm not a fish Owl and I don't talk to them. <vbg>

Of course, this may simply be someone spinning without rhyme or reason. If we see two more such published references, then we'll know that The One has made its presence felt, and our search for true unity with The One can safely begin.

0|0



To: Bilow who wrote (79524), 11/2/2001 7:48:43 AM
From: NightOwl
 
CB,

I need an EE expert's opinion on this "comment" taken from eetimes.com:

Advanced Micro Devices Inc., for its part, plans to take a swipe at DRAM latency by integrating a double-data-rate memory controller in its next-generation Hammer processor, the company's first 64-bit X86 processor. The controller supports 8- and 16-byte interfaces.

Most of the article appears to be discussing L2/L3 caching schemes, but it seems to switch to main-memory issues at the end, where the above comment is located. The question I have is whether this statement, if true, means that all of AMD's Hammer CPUs will need no external memory controller in the chipset?

If that's true, can we assume that the same controller in Hammer will serve cache memory as well? Will there be no cache memory? Will the CPU have two memory controllers?

If this embedded controller is serving main memory directly, why the heck would the interface top out at only 16 bytes/128 bits? Shouldn't it include 256 bits for the latest, greatest, fastest DDR SDRAM? Or is the 128-bit max dictated by the 64-bit CPU and a 64x2-bit DDR bus?
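
Just for scale, here's a quick back-of-the-envelope sketch of what those widths would mean for peak bandwidth. The 133 MHz, DDR266-class figures are only my assumption, not anything from the article:

```python
# Back-of-envelope peak bandwidth for a DDR main-memory interface.
# Assumptions (mine, not from the article): DDR266/PC2100-class parts,
# 133 MHz bus clock, 2 transfers per clock.
def peak_bandwidth_gb_s(width_bits, bus_clock_mhz, transfers_per_clock):
    """Peak bandwidth in GB/s = bytes per transfer * transfers per second."""
    return (width_bits / 8) * (bus_clock_mhz * 1e6) * transfers_per_clock / 1e9

for width in (64, 128, 256):  # 8-, 16-, and 32-byte interfaces
    gbs = peak_bandwidth_gb_s(width, 133, 2)
    print(f"{width}-bit DDR @ 133 MHz: about {gbs:.1f} GB/s peak")
```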

Could they go to a 256-bit interface with QDR-DDR to keep a match?
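
And to put rough numbers on the "keep a match" idea, here's the same arithmetic with a quad-pumped bus (again, the clock and figures are just my assumptions, meant only to show the scaling):

```python
# Rough comparison of doubling the width (DDR) versus quad-pumping the bus
# (QDR, 4 transfers per clock) at the same assumed 133 MHz clock.
def peak_gb_s(width_bits, clock_mhz, transfers_per_clock):
    return (width_bits / 8) * clock_mhz * 1e6 * transfers_per_clock / 1e9

for width in (128, 256):
    ddr = peak_gb_s(width, 133, 2)
    qdr = peak_gb_s(width, 133, 4)
    print(f"{width}-bit: DDR ~{ddr:.1f} GB/s, QDR ~{qdr:.1f} GB/s")
```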

0|0