Hi all; RDRAM power problems... Some additions to the post that this one links to.
To a remarkable extent, this thread lives in "Rambus Land", that fantasy land where an ingenious new technology arrives out of the blue and proves to be better than previous technology in every way. This makes the stock purchase a slam dunk. Easy decision. Just buy, otherwise you'll miss the bus!
People here believe that Rambus has better signal integrity, better reliability, cheaper manufacturing, higher bandwidth, lower latency, longer and stiffer virility, and of course, less power consumption. Anything that goes against the Rambus Land happy talk is certain to be lies, broadcast by people with financial reasons to hate RMBS.
The above post on power consumption still hasn't been explained away. Power consumption calculations are complicated, and depending on how the system is designed, either Rambus or DDR could end up with the lower number. Roughly: RDRAM tends to have the advantage in smaller memory systems, while DDR has the advantage in larger ones. But the simple notion that RDRAM uses less power is silly. RDRAM is well known in the industry as a finger burner; that is why so much effort is put into the Intel articles linked above.
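To see why the winner flips with system size, here's a back-of-envelope sketch. Every per-device wattage below is an illustrative assumption I'm making up for the shape of the argument, not a datasheet value: one narrow RDRAM channel has a single hot active device but the idle devices on the channel still burn standby power with their clocks running, while DDR activates a whole 8-device rank at once but its idle devices draw very little.

```python
# Illustrative memory-subsystem power vs. number of DRAM devices.
# All wattages are ASSUMED round numbers, not datasheet figures.
ACTIVE_W_RDRAM  = 1.30  # assumed: the one active device on an RDRAM channel
STANDBY_W_RDRAM = 0.12  # assumed: idle RDRAM device, channel clocks running
ACTIVE_W_DDR    = 0.45  # assumed: each device in the active DDR rank
STANDBY_W_DDR   = 0.02  # assumed: idle DDR device

def rdram_power(devices):
    # One device active, the rest in clocked standby.
    return ACTIVE_W_RDRAM + (devices - 1) * STANDBY_W_RDRAM

def ddr_power(devices, rank_size=8):
    # One rank (up to 8 devices) active together, the rest idle.
    active = min(rank_size, devices)
    return active * ACTIVE_W_DDR + (devices - active) * STANDBY_W_DDR

for n in (4, 8, 16, 32, 64):
    print(f"{n:3d} devices: RDRAM {rdram_power(n):5.2f} W, DDR {ddr_power(n):5.2f} W")
```

With these made-up numbers RDRAM comes out ahead below roughly 16 devices and DDR ahead above that, which is the whole point: the per-device standby cost that is tiny for RDRAM's active term becomes the dominant term as the system grows.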
From an article on the rise in PC power consumption over recent years:
The CPU is not the only culprit, however. Power consumption is on the rise in chipsets, DRAM modules (especially Direct Rambus DRAM), and AGP graphics cards, the latter of which now ship with on-board fans of their own. techweb.com
I know, clearly techweb is horribly biased against Rambus. And those 500mA Idd specs in the Rambus data sheets on the Samsung website aren't the same units as the mA figures used in DDR. Rambus parts run on nature's own energy from the sun! They don't get hot! Whatever.
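For anyone who wants to check what a 500 mA Idd spec means in watts, the arithmetic is one multiply: current times supply voltage. The ~2.5 V supply below is my assumption for a Direct RDRAM part; plug in whatever the data sheet actually says.

```python
# Convert a datasheet Idd current into dissipated power.
idd_a = 0.500   # 500 mA active-read current, per the cited spec
vdd_v = 2.5     # assumed supply voltage for a Direct RDRAM device
power_w = idd_a * vdd_v
print(f"{power_w:.2f} W per active device")  # prints "1.25 W per active device"
```

Over a watt in a single active DRAM device is exactly the kind of number that earns a part its finger-burner reputation.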
-- Carl