Technology Stocks : Rambus (RMBS) - Eagle or Penguin


To: Bilow who wrote (34401)11/13/1999 8:06:00 AM
From: John Walliker
 
Carl,

John, did I get it right?

I think so, except I'm not sure about this: if the Rambus channel is 5 ns long, then the nearest RDRAM chips will be programmed to delay their writes by 10 ns. I think that is worth double-checking.
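A quick sanity check of the arithmetic (a toy model only: the levelling rule below, in which each chip delays its write by twice its flight-time distance to the far end of the channel, is an assumption chosen to be consistent with the 10 ns figure, not something taken from the spec):

```python
T_CHANNEL_NS = 5.0  # assumed one-way electrical length of the channel

def write_delay_ns(t_chip_ns: float) -> float:
    """Delay (toy model) that a chip at flight-time position t_chip_ns,
    measured from the controller, adds to its write so the data lands in
    the same channel time slot as a write performed by the farthest chip."""
    return 2.0 * (T_CHANNEL_NS - t_chip_ns)

print(write_delay_ns(0.0))  # 10.0 -- nearest chip, matching the figure above
print(write_delay_ns(5.0))  # 0.0  -- farthest chip needs no extra delay
```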

A very nice posting though.

John



To: Bilow who wrote (34401)11/14/1999 12:34:00 AM
From: Ali Chen
 
Bilow, in that case could you please explain a signal-timing situation to me?

If you look at any diagram in the DRDRAM chip info sheet, you may notice that EVERY picture shows CTM and CFM as the same clock, and vertical dashed lines are even drawn to mark the beginnings and ends of BOTH command AND data packets. This looks very strange to me, because in reality these two clocks are positioned arbitrarily relative to each other, depending on the distance from the Rambus controller, and the real picture is much more complicated than what the "real engineers" seem to think.

When a chip receives a READ command (which it receives with the CTM clock), it must start counting CFM clocks in order to place the output Q data in the right bit-time, forming the specified T(cas) delay. Therefore this count has to begin at some point after the command packet ends. Please remember that those two events belong to two different clock domains.

Now let us consider an RDRAM chip that happens to sit where the flight-time difference between the Transmit and Receive clocks is approximately 2.5 ns. With current board design guidance and 400/800 clocking, this happens for chips located at a 1.25 ns distance, i.e. at the end of the first RIMM.
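The 2.5 ns / 1.25 ns figures are easy to verify (a sketch, assuming the folded-clock topology in which CTM and CFM are the outbound and return legs of one clock trace that turns around at the controller end, so a chip's CTM-to-CFM skew is twice its flight-time distance to the controller):

```python
CLOCK_MHZ = 400.0               # 400/800 clocking: 400 MHz clock, 800 Mb/s data
PERIOD_NS = 1000.0 / CLOCK_MHZ  # 2.5 ns clock period

def ctm_cfm_skew_ns(dist_ns: float) -> float:
    """Phase difference between CTM and CFM at a chip dist_ns of flight
    time from the turnaround point (assumed at the controller end)."""
    return 2.0 * dist_ns

# Distance at which the two clocks differ by exactly one full clock period:
dist = PERIOD_NS / 2.0
print(dist)  # 1.25 -- the "end of the first RIMM" position in the post
```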

Now the question: how will this T(cas) counter start counting in this situation? No matter how the synchronization is done, there will always be instances where the counter skips the initial tick (insufficient setup/hold time or a too-narrow effective clock; and don't forget that every signal has jitter). As a result, the corresponding Q-data packet will sometimes be placed on the wrong output clock, and at best the receiver (controller) will get some bogus data; at worst the whole sequencer will be corrupted until a total reset and re-synchronization of the whole DRDRAM subsystem.
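The failure mode can be sketched as a toy Monte-Carlo model (all numbers hypothetical): the T(cas) counter is taken to start on the first CFM edge after the command packet ends, and an edge that falls inside the setup/hold aperture is caught or missed at random, standing in for metastability and jitter:

```python
import random

PERIOD_NS = 2.5       # 400 MHz CFM clock period
SETUP_HOLD_NS = 0.2   # assumed aperture around each clock edge
JITTER_RMS_NS = 0.05  # assumed Gaussian edge jitter

def first_counted_edge(cmd_end_ns: float, rng: random.Random) -> int:
    """Index of the first CFM edge the T(cas) counter actually counts
    after the command packet ends at cmd_end_ns."""
    edge = 0
    while True:
        t = edge * PERIOD_NS + rng.gauss(0.0, JITTER_RMS_NS)
        if t >= cmd_end_ns + SETUP_HOLD_NS:
            return edge  # clearly after the aperture: reliably counted
        if t > cmd_end_ns - SETUP_HOLD_NS:
            # inside the setup/hold aperture: coin flip for caught/missed
            if rng.random() < 0.5:
                return edge
        edge += 1

rng = random.Random(0)
# A command ending right at an edge boundary (edge 2 sits at t = 5.0 ns):
starts = {first_counted_edge(5.0, rng) for _ in range(1000)}
print(starts)  # {2, 3} -- a one-cycle ambiguity in when the count starts
```

The one-cycle ambiguity is exactly a mis-timed Q-data packet as described above.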

Please also note that due to thermal drift the whole group delay will vary by up to 6% over a 20-degree interval, so the exact unfortunate condition will invariably occur at some point, and maybe even on more than one chip. The effect will be more severe for the farthest chips, since at longer distances that 6% global variation translates into a wider region where the two clocks differ by exactly an integer number of clock periods, so a smaller variation is enough to hit the problem. Isn't this the reason the third RIMM slot was eliminated?
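A rough number check on the thermal argument (the 6% per 20 degrees drift is the post's own figure; the 5 ns channel length and the twice-the-distance skew model are assumptions):

```python
DRIFT_FRACTION = 0.06  # 6% group-delay variation over 20 degrees (from the post)
BIT_TIME_NS = 1.25     # 800 Mb/s data rate

def skew_drift_ns(dist_ns: float) -> float:
    """Thermal variation of the CTM/CFM skew for a chip at flight-time
    distance dist_ns, assuming the skew itself is 2 * dist_ns."""
    return DRIFT_FRACTION * 2.0 * dist_ns

# Chip at the far end of an assumed 5 ns channel:
print(skew_drift_ns(5.0))  # ~0.6 ns of drift, nearly half of a 1.25 ns bit time
```

The drift grows linearly with distance, which is the sense in which the farthest chips are hit hardest.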

In conclusion, the whole picture does not look very encouraging from a technical standpoint. The initial Rambus proposal shared the same lines for transmitting control and receiving data. In this "first generation" there were no turnaround cycles, flight time seems to have escaped the developers' attention, and only a relaxed protocol could make it work in a single-chip application. I do not know about the "second generation", but their "third generation" (in the terms of the Paul DeMone article) seems to still suffer from the same problems: the "smart" clock going forth and back does not solve the global synchronization problem for a distributed bus architecture. And the "real engineers" continue to show pictures where all clocks appear perfectly aligned.

John, did I get it right ? :)

Disclaimer: I have never seen or touched a RIMM in my life yet :)