... although it's apparent that not all writers read white papers with their morning coffee. From the December 4 issue of Barron's, journo Bill Alpert writes:
"Unfortunately, all that bandwidth is not good enough. Internet traffic is doubling every 100 days."
Actually, some observers argue that if "actual traffic" were measured as a function of goodput multiplied by distance (in units of terabit-miles, say), it is not really growing that fast at all. Some even go so far as to say that it will someday level off, or retreat from the trend line it has followed for the past four years. Both views are fostered by the increasing presence of local and metro-level caching and web acceleration schemes. None of this has anything to do with the actual number of packets sent, which will obviously continue their exponential ascent. Rather, these views center on the relative amount of traffic sent, as measured by packets sent times mileage traveled.
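To make the distinction concrete, here is a rough back-of-the-envelope sketch in Python. The haul distances and growth figures are entirely hypothetical, chosen only to show how a cache that shortens the average haul can shrink bit-mile "traffic" even while the raw bit count keeps doubling:

    # Hypothetical illustration: bit-miles vs. raw bits under caching.
    # All figures below are made up for the sake of the arithmetic.

    bits_today = 1.0            # normalized traffic volume today
    doubling_days = 100         # "doubling every 100 days," per the quote above

    avg_miles_no_cache = 1500   # assumed average haul without local/metro caches
    avg_miles_cached = 50       # assumed average haul once content is cached nearby

    for day in (0, 100, 200):
        bits = bits_today * 2 ** (day / doubling_days)   # exponential growth in bits
        print(day, bits, bits * avg_miles_no_cache, bits * avg_miles_cached)

    # The raw bit count doubles every 100 days either way, but the bit-mile
    # product drops by roughly 30x once the average distance collapses --
    # which is the sense in which "traffic" (bits times mileage) can level
    # off or retreat even as packet volume keeps climbing.

In other words, the two claims are not in conflict: one is counting bits, the other is counting bit-miles.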
Sound familiar, Graciella? Which intranet that we have been known to discuss in the past is based on the principle of keeping latency down through the extensive use of caching? And this is the road that other multimedia content distribution network (CDN) providers will be taking, too.
Comments, anyone?
FAC |