Strategies & Market Trends : Gorilla and King Portfolio Candidates

To: buck who wrote (18375) | 2/22/2000 9:03:00 PM
From: om3
 
>> I have a link to a CacheFlow white paper that explains the math behind network-induced latency that you address. In your example, you have delivered one request to the web server. There are many more to go. <<

Buck,

Thanks for an excellent analysis of content delivery! I'm a bit skeptical, though, of the huge latency calculated in the CacheFlow white paper you referenced at:

cacheflow.com

They claim that if a web page has 45 objects and is retrieved from across the country (presumably about 5,000 miles away, with signals traveling at roughly 100,000 miles per second), there is a minimum delay of ((45+1)*2)*5000/100,000 = 4.6 seconds. This assumes that each object requires its own HTTP connection to be established and that each object must be completely received before the next one is requested.
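
To make their serial model concrete, here is roughly how I read it in a few lines of Python (the framing is mine, not theirs; I'm taking the 5,000 as one-way miles and the 100,000 as miles per second, since that's the only way their numbers seem to work out):

# The CacheFlow figure, as I read it: one full round trip per object,
# each object fetched only after the previous one has completely arrived.
DISTANCE_MILES = 5000           # one-way, roughly coast to coast
SPEED_MILES_PER_SEC = 100000    # rough signal propagation speed
OBJECTS = 45                    # embedded objects on the page

round_trip = 2 * DISTANCE_MILES / SPEED_MILES_PER_SEC    # 0.1 seconds
serial_delay = (OBJECTS + 1) * round_trip                # HTML page + 45 objects
print(serial_delay)                                      # 4.6 seconds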

According to Erik Wilde's "Wilde's WWW: Technical Foundations of the World Wide Web", page 97, both of these issues were addressed in HTTP 1.1. The protocol supports "persistent connections," meaning a client opens a single connection and sends multiple requests over it rather than opening a separate connection for each request. It also supports pipelining, meaning a client can issue its next request before the response to the previous one has finished arriving.

Thus I think a browser should be able to open a single connection at the cost of one round-trip latency, then fire off all 45 download requests back to back and receive them after only one more round trip, for a total delay of roughly 2*(2*5000/100,000) = 0.2 seconds. Am I misunderstanding something in their argument?
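
Here's a rough socket-level sketch (mine, not from the book or the white paper) of what a single persistent connection with pipelined requests looks like on the wire. The host and paths are just placeholders, and real servers differ in how well they handle pipelined requests:

import socket

HOST = "www.example.com"            # placeholder host, for illustration only
PATHS = ["/", "/a.gif", "/b.gif"]   # stand-ins for the page's embedded objects

s = socket.create_connection((HOST, 80))   # one TCP handshake = one round trip

# Pipelining: write every request back to back without waiting for any
# response; mark the last one "close" so the server ends the connection
# once it has sent everything.
for i, path in enumerate(PATHS):
    conn = "close" if i == len(PATHS) - 1 else "keep-alive"
    request = ("GET " + path + " HTTP/1.1\r\n"
               "Host: " + HOST + "\r\n"
               "Connection: " + conn + "\r\n\r\n")
    s.sendall(request.encode())

# The responses come back in order over the same connection, so the
# remaining wait is roughly one more round trip plus transfer time,
# not one round trip per object.
data = b""
while True:
    chunk = s.recv(4096)
    if not chunk:
        break
    data += chunk
s.close()

The point of the sketch is just that only the connect() and the first batch of requests pay full round-trip latency; everything after that overlaps instead of being serialized.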

--Steve