Technology Stocks : Rambus (RMBS) - Eagle or Penguin
To: Bilow who wrote (79524), 11/2/2001 7:48:43 AM
From: NightOwl
 
CB,

I need an EE expert opinion on this "comment" taken from eetimes.com:

Advanced Micro Devices Inc., for its part, plans to take a swipe at DRAM latency by integrating a double-data-rate memory controller in its next-generation Hammer processor, the company's first 64-bit X86 processor. The controller supports 8- and 16-byte interfaces.

Most of the article appears to be discussing L2/L3 caching schemes, but it seems to switch to main-memory issues at the end, where the above comment appears. The question I have is whether this statement, if true, means that all of AMD's Hammer CPUs will require no external memory controller in the chipset.

If that's true, can we assume that the same controller in Hammer will serve cache memory as well? Will there be no cache memory? Or will the CPU have two memory controllers?

If this embedded controller is serving main memory directly, why the heck would the interface top out at only 16 bytes/128 bits? Shouldn't it include 256 bits for the latest, greatest, fastest DDR SDRAM? Or is the 128-bit max dictated by the 64-bit CPU and the 64x2-bit DDR bus?
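
Just to put rough numbers on those widths, here's a quick back-of-the-envelope Python sketch. The 133 and 166 MHz clocks are only placeholder figures from PC2100/PC2700-class DDR, not anything AMD has announced:

def peak_bandwidth_gb_s(width_bytes, clock_mhz, transfers_per_clock=2):
    # Peak bandwidth in GB/s: bus width x clock rate x transfers per clock (2 for DDR).
    return width_bytes * clock_mhz * 1e6 * transfers_per_clock / 1e9

for width_bytes in (8, 16, 32):      # 64-, 128-, and 256-bit interfaces
    for clock_mhz in (133, 166):
        print(f"{width_bytes * 8}-bit DDR @ {clock_mhz} MHz: "
              f"{peak_bandwidth_gb_s(width_bytes, clock_mhz):.1f} GB/s peak")

So a 16-byte (128-bit) DDR interface already lands around 4-5 GB/s peak, and doubling the width to 256 bits doubles that again.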

Could they go to a 256-bit interface with QDR-DDR to keep a match?
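
On the QDR question, the same arithmetic says a 128-bit bus moving four words per clock peaks at the same figure as a 256-bit bus moving two, so width and data rate trade off directly (again assuming a placeholder 133 MHz clock, and leaving aside whether the pin counts are practical):

clock_hz = 133e6                      # placeholder clock, not an AMD spec
ddr_256 = 32 * clock_hz * 2 / 1e9     # 256 bits = 32 bytes, 2 transfers/clock (DDR)
qdr_128 = 16 * clock_hz * 4 / 1e9     # 128 bits = 16 bytes, 4 transfers/clock (QDR)
qdr_256 = 32 * clock_hz * 4 / 1e9     # 256-bit QDR would double it again
print(f"256-bit DDR: {ddr_256:.1f} GB/s, 128-bit QDR: {qdr_128:.1f} GB/s, "
      f"256-bit QDR: {qdr_256:.1f} GB/s")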
