To: JDinBaltimore who wrote (36462) 1/20/2001 9:19:29 PM
From: smchan

"Reduced instruction sets within the XML that generate algorithms based on the request. So a query is made, data is sent, and based on the data described in the initial query it derives the original value of the algorithm. Then a second, expanded query is sent, the algorithms are compared, the software determines what additional data needs to be sent, and then only the additional information required to complete the request is transmitted."

I'm getting lost on the algorithm part, but I think I follow: if we can assume that any two back-to-back queries from a user will return data that overlaps in some way, we can optimize the transmission of those query results by sending down only the information that is new or changed. For example, if I query Yahoo! for "sports cars" and receive a series of hits, it's probable that the results for "yellow sports cars" are a subset of my original query's results. If I were to query for "red snapper" first and "salt water fish" second, it's also probable that my first query's results are a subset of, or at least intersect with, the second query's results. Both are pretty good optimizations, but I really don't see them solving a bandwidth problem by very much.
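If it helps to see it in code, here's a rough Python sketch of that "send only the new stuff" idea. To be clear, this is just my reading of it, not how their software actually works, and the names here (results_to_send, the "id" and "title" fields) are made up for illustration:

# Sketch only: assume each hit carries some stable identifier
# (the "id" field is an assumption, not anything from the product).
def results_to_send(previous_results, new_results):
    """Return only the hits the client hasn't already received."""
    already_cached = {hit["id"] for hit in previous_results}
    return [hit for hit in new_results if hit["id"] not in already_cached]

# First query ("sports cars"): everything goes over the wire.
first = [{"id": 1, "title": "Ferrari 360"}, {"id": 2, "title": "Porsche Boxster"}]

# Second query ("yellow sports cars") overlaps the first,
# so only the genuinely new hit needs to be transmitted.
second = [{"id": 2, "title": "Porsche Boxster"}, {"id": 3, "title": "Lotus Elise"}]

print(results_to_send(first, second))  # -> [{'id': 3, 'title': 'Lotus Elise'}]

Of course this only works if the client hangs on to the earlier results and the server knows (or is told) what the client already has, which is basically a per-query cache, and the savings are only as big as the overlap between queries.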
"The software realizes that all graphics, headers, etc. are already sent, so it only needs to send a small amount of additional data."

This is already being done. Check out Inktomi (INKT) and Akamai (AKAM) for two companies that provide caching services. It's a little different in that their caching is done at the "edge of the network," which translates roughly to being at major ISP points-of-presence. The downside is that graphics, etc. are still being transmitted over that last bit of line between you and the ISP, but gains are made by caching the same content for multiple users and keeping that data off the backbone for each and every hit. Check out CNN's site, for example; you'll see they're using Akamai to cache their graphics.

In my opinion, browsers are... well... so '90s. Spiking bandwidth demands won't come from more users making more Internet queries. They will come from a more pervasive use of the Internet as it becomes a natural extension or appliance and less of something we plug our PC into. Imagine, for example, video and audio on demand services, integration of your telephone (or whatever personal communications device replaces the telephone) such that local and long distance service is carried digitally, or integration of your car's navigation system with your personal preferences database back home so it knows which exit in that strange city will yield a Denny's. :-)

Take one look at Real Video and tell me we have satisfied, to the fullest extent, video on demand services on the Internet. Do the same with any live audio program and tell me we've satisfied, to the fullest extent, the demands of Internet radio. Man... I want nothing less than Digital Theater Sound coming across the 'net. I want 720p high-definition video on demand... and I want to pause it when I have to go to the bathroom!

Forget about the browser. We're just getting started. Is it cool to get on a browser at work when you're coming home late and turn on the outside lights at home? Maybe... Would it be cool to command your car by voice to turn on the outside lights at home 20 miles away? Yes! Can you imagine the bandwidth demands of every car during rush hour doing stuff like that?

Bandwidth is like wealth. No matter how much you get, you will find a use for it.

Sam