Hi Dan3; Actually, I wouldn't say that any use of Rambus in things like video cards, gaming machines, and network cards will be short-lived because it is always both higher in performance and lower in cost to embed the memory directly onto the game chip, video chip, or whatever. Just that, as time goes on, it has become more and more profitable to embed DRAM.
There are other alternative technologies as well.
By the way, that 3.2GB/sec memory interface by NEC (?) that is implemented with DDR must have used 200MHz DDR, 32-bit-wide chips. This has been telegraphed by the industry for some time. Half the speed per pin of RDRAM means twice as many pins are required, but the gains in reliability and the savings in royalties and power consumption probably make it worthwhile.
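To check the arithmetic, here is a quick back-of-the-envelope sketch (my own round numbers, not NEC's published spec; a single 200MHz DDR, 32-bit chip only gets you halfway to 3.2GB/sec, so I'm assuming two such chips, i.e. 64 bits total):

```python
def bandwidth_gb_s(clock_mhz: float, ddr: bool, width_bits: int) -> float:
    """Peak bandwidth in GB/s (1 GB = 1e9 bytes).

    DDR transfers on both clock edges, so the transfer rate is
    twice the clock frequency.
    """
    transfers_per_sec = clock_mhz * 1e6 * (2 if ddr else 1)
    return transfers_per_sec * (width_bits / 8) / 1e9

# One hypothetical 200MHz DDR, 32-bit chip:
print(bandwidth_gb_s(200, True, 32))   # 1.6 GB/s per chip
# Two such chips side by side (64 bits total) reach the quoted figure:
print(bandwidth_gb_s(200, True, 64))   # 3.2 GB/s
# For comparison, one 16-bit RDRAM channel at 400MHz DDR:
print(bandwidth_gb_s(400, True, 16))   # 1.6 GB/s
```

The comparison also shows the pin-count trade directly: RDRAM moves 800 Mbit/sec on each data pin versus 400 Mbit/sec for 200MHz DDR, so DDR needs twice the data pins for the same bandwidth.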
By the way, I came upon a post that looked to me like it accurately described the Rambus technical problem. I posted a copy of it over on the AMD/INTC/RMBS &c. thread, but it really hasn't gathered much comment.
Basically, the claim is that there is an output current drive compliance problem. Reflections on the Rambus data channel cause the "launch voltage" to differ depending on previous bus activity, so some chips have trouble driving the bus because of the signals coming from other chips.
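For anyone who wants the textbook mechanism behind that claim, here is an illustrative sketch. The impedances and signal swing are made-up round numbers, not Rambus channel electricals; the point is just that any impedance mismatch leaves a residual reflection on the line, and that residual offsets the voltage the next driver launches into:

```python
def reflection_coefficient(z_load: float, z_line: float) -> float:
    """Standard transmission-line reflection coefficient:
    gamma = (Z_load - Z_line) / (Z_load + Z_line)."""
    return (z_load - z_line) / (z_load + z_line)

z_line = 28.0    # assumed channel impedance, ohms
z_load = 40.0    # assumed mismatched termination/stub, ohms
gamma = reflection_coefficient(z_load, z_line)

swing = 0.8      # assumed signal swing, volts
residual = gamma * swing   # reflected voltage still bouncing on the line
print(f"gamma = {gamma:.3f}, residual = {residual:.3f} V")

# A driver launching the next bit sees its nominal launch voltage shifted by
# whatever residual is still propagating -- i.e., the launch voltage depends
# on previous bus activity, which is exactly what the leaked post describes.
# A perfectly matched load (z_load == z_line) gives gamma = 0: no reflection.
```

With three RIMMs there are more connectors and stubs on the channel, so more mismatch points for these residuals to accumulate, which is consistent with the 3-RIMM configuration being the one that breaks first.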
If this is the problem (and if it isn't, it was the best faked technical leak I've seen in a long time), then Rambus will never get 3-RIMM configurations to work without reengineering the memory chips, and will probably, in my opinion, have field failures in the 2-RIMM boards as well.
-- Carl