To: jim kelley who wrote (41966) 5/10/2000 5:36:00 PM From: Bilow
hi jim kelley; Re power consumption of DDR vs RDRAM...

Here's a picture of a typical 333MHz GeForce2 graphics card. It has a 128-bit wide memory bus, implemented with four x32 DDR SGRAM chips. That provides 5.3GByte/sec of bandwidth, more than three Rambus chips could deliver, and at a far lower cost. Since there is only one bank, all four chips are used on every memory access, so their power consumption is the maximum a DDR system of this width can reach. (That consumption is something like 300mA per chip, max, well under what an RDRAM chip draws.)

Note that there is no heat spreader on these DDR memory chips. (They undoubtedly benefit from the extra air circulation the GPU fan provides.) This is in contrast to RDRAM solutions, which invariably have something metallic attached to the RDRAM chips to conduct heat away. Also note that the concentration of resistors near the memory chips is quite low, and extensive areas of the PCB (in the upper right corner) have been left bereft of both passives and vias. This is obviously not a PCB that required a lot of layers to route.
www7.tomshardware.com

The same reasoning applies to DIMMs and RIMMs: the DIMMs are not equipped with heat spreaders; the RIMMs are. It really doesn't take a rocket scientist to figure out the consequences:
(1) At the small memory end, individual DDR components require no special cooling; individual RDRAM components do.
(2) With mid size memories, RIMMs require heat spreaders; DDR DIMMs do not.
(3) With high end server memories, RIMMs require heat spreaders; DDR DIMMs do not.

Make as many errors in your power consumption calculations as you like; the fact remains that DDR designs are shipping without special memory cooling techniques, while no RDRAM system is shipping without heat spreaders on its RDRAM.
-- Carl
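P.S. For anyone who wants to check the numbers, here's a quick back-of-the-envelope sketch in Python. The bus width, clock rate, and 300mA figure are the ones cited above; the 16-bit, 800MHz-per-channel RDRAM geometry and the 2.5V DDR SGRAM supply are my assumptions for the arithmetic, not measurements from this card.

    # Sanity check of the bandwidth and power figures cited above.
    # Assumptions (mine): DDR moves 2 bits per pin per clock (333MHz
    # is the effective rate), each RDRAM device sits on its own
    # 16-bit 800MHz channel, and the DDR SGRAM supply is 2.5V.

    BUS_WIDTH_BITS = 128        # four x32 DDR SGRAM chips
    DDR_EFFECTIVE_MHZ = 333     # 166MHz clock, double data rate

    ddr_gbps = BUS_WIDTH_BITS / 8 * DDR_EFFECTIVE_MHZ * 1e6 / 1e9
    print(f"128-bit DDR @ 333MHz: {ddr_gbps:.1f} GB/s")   # ~5.3 GB/s

    rdram_channel_gbps = 16 / 8 * 800e6 / 1e9             # 1.6 GB/s each
    print(f"3 RDRAM channels: {3 * rdram_channel_gbps:.1f} GB/s")  # 4.8 GB/s

    # Power: 300mA max per DDR chip at an assumed 2.5V supply.
    watts = 4 * 0.300 * 2.5
    print(f"4 DDR chips, worst case: {watts:.1f} W total")  # ~3.0 W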