Technology Stocks : ATM vs. Gigabit

To: George Dawson who wrote (32)10/24/1997 5:42:00 AM
From: Network Guru   of 63
 
Without delving into the numbers, remember that the latency of a 100Mbps switch will typically be one tenth that of a 10Mbps switch. Since the packets are read into and out of buffer memory at ten times the speed, this makes sense.

By the same token, a gigabit switch will typically have one tenth the latency of a 100Mbps Fast Ethernet switch and one hundredth that of a 10Mbps Ethernet switch.
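A quick back-of-the-envelope sketch of the point above: for a store-and-forward switch, the time to clock a frame into and out of buffer memory is just frame size divided by link speed, so the 10x and 100x ratios fall straight out. The 1500-byte frame size and the function name are illustrative, not from the post.

```python
# Sketch: the store-and-forward component of switch latency is the time to
# clock a frame into (and back out of) buffer memory, which scales inversely
# with link speed. Assumes a full 1500-byte Ethernet payload (illustrative).

FRAME_BYTES = 1500  # illustrative frame size

def serialization_delay_us(bits_per_second: float,
                           frame_bytes: int = FRAME_BYTES) -> float:
    """Time in microseconds to read one frame into or out of buffer memory."""
    return frame_bytes * 8 / bits_per_second * 1e6

d10 = serialization_delay_us(10e6)    # 10 Mbps Ethernet
d100 = serialization_delay_us(100e6)  # 100 Mbps Fast Ethernet
d1000 = serialization_delay_us(1e9)   # Gigabit Ethernet

print(f"10M: {d10:.0f} us, 100M: {d100:.0f} us, 1G: {d1000:.0f} us")
# -> 10M: 1200 us, 100M: 120 us, 1G: 12 us
# The ratios match the one-tenth / one-hundredth relationship above.
```

Note this is only the buffering component; the address lookup discussed next adds a (much smaller) processing time on top.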

Obviously different vendors' switches have different processing times (the time it takes the switch, once the data is in the buffer, to do the address lookup and decide which port to forward the packet out of). However, processing times are measured in clock cycles, which are pretty damn small.

The bottom line is that switching latencies measured in nanoseconds are really not a surprise....but to be honest....they're all but irrelevant these days. The fact is that even for sensitive multimedia data, whether the delay is 40 nanoseconds or even 400 nanoseconds, your eye/ear just can't tell the difference. The bigger issue is actually jitter: the variation in latency between packets.

Think about watching a video stream through a switch. Imagine every packet of video data was delayed by, say, 5 seconds. Sounds like a long time, doesn't it? But as long as EVERY packet was delayed by exactly 5 seconds, you would see a smooth video image on your screen - albeit 5 seconds after it was transmitted. However, if every packet had a variable delay - some delayed by 5 seconds, some by 2 seconds, some by 6 seconds, and so on - you can quickly see that the video image on your screen would be pretty flickery and unwatchable. In reality these delays are very small (of the orders discussed previously), but the variation in latency (the 'jitter') can still cause multimedia problems, especially when the stream contains synchronised data such as voice and video.
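The 5-second thought experiment above can be sketched in a few lines: a constant delay leaves the spacing between packets untouched (smooth playback), while a variable delay distorts that spacing. The send times and delay values are made-up illustrative numbers, and jitter is measured here simply as the spread in inter-arrival gaps.

```python
# Sketch of the jitter idea: constant delay preserves packet spacing,
# variable delay does not. All times in seconds; values are illustrative.

send_times = [0.0, 1.0, 2.0, 3.0]  # packets transmitted 1 second apart

# Every packet delayed by exactly 5 s vs. a different delay per packet.
constant_delay = [t + 5.0 for t in send_times]
variable_delay = [t + d for t, d in zip(send_times, [5.0, 4.5, 6.0, 5.5])]

def inter_arrival(times):
    """Gaps between consecutive packet arrivals."""
    return [b - a for a, b in zip(times, times[1:])]

def jitter(times):
    """Spread in packet spacing: zero means perfectly smooth playback."""
    gaps = inter_arrival(times)
    return max(gaps) - min(gaps)

print(inter_arrival(constant_delay))  # -> [1.0, 1.0, 1.0]  smooth video
print(inter_arrival(variable_delay))  # -> [0.5, 2.5, 0.5]  uneven, flickery
print(jitter(constant_delay), jitter(variable_delay))  # -> 0.0 2.0
```

The constant-delay stream arrives with the same 1-second spacing it was sent with, despite the 5-second offset; the variable-delay stream has gaps ranging from 0.5 to 2.5 seconds, which is exactly the playback disruption described above.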

Bored of all this latency talk now.....I'll shut up.

NG