To: ahhaha who wrote (18120) 12/21/1999 11:12:00 AM From: Jay Lowe
>> based on observation, not on principle

Indeed? Can you give one single example of an information-processing resource that has not been capacity-bound? Oh, wait, that's the empiricist line. The principle is that with Excel, email, VoIP, and streaming video we are still chipping flint and cannot even begin to imagine how we will evolve. This wheel of yours is going to revolutionize transportation, I'm sure of it. But I also expect that wizards and merchants alike will invent loads beyond its capacity.

A terabit isn't infinite. It's not even "big" in any absolute sense. It's only "big" relative to past, expired, over-with, done, deceased, obsolete human experience.

>> tbit transfers FTTD are not too far away

How many tbit transfers to how many desktops? And how far is far? You throw these things around as if they were absolutes ... ignoring the entire evolutionary nature of the process ... the K * log(e)**pT wash-in process. ... and to whose desktop? The most common name in the world is Mohamed. I'm thinking about the economics.

>> I challenge you to figure out how to use that width.

Er, real HDTV community telepresence? Um, distributed genomic simulation? Give me a minute or twenty and I could think up half a dozen applications that would suck up a terabit link in a hot second. And then I would be right back to the priority issue.

I agree there are phases. 20 GB disks on desktop machines take a long time to fill up ... that was a recent 10x leap ... so it'll take a while to obsolete that level of capacity. FTTD won't happen in the general case overnight, nor will it happen in the pervasive case for 5-10 years. Instead, incremental improvements will happen ... 10x here ... 100x there ... leaps of 1000x once in a while.

If you can make a picture of pervasive terabit bandwidth being unchallenged ... just switch your timescale to include a 10-20 year lookahead ... and tell me if it's still unchallenged.
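The "wash-in" argument above can be made concrete. One plausible reading of the growth process being invoked is simple exponential wash-in, capacity ~ K * e^(p*T); the constants below (a 20 GB starting point, a 10x leap every 5 years) are illustrative assumptions of mine, not figures from the post:

```python
import math

def wash_in(K, p, T):
    """Capacity (or installed base) after T years, growing at rate p
    from a base K -- a minimal sketch of one exponential reading of
    the 'wash-in' process; K, p, T are hypothetical parameters."""
    return K * math.e ** (p * T)

# Assume a 10x leap roughly every 5 years, i.e. p = ln(10)/5,
# starting from the 20 GB desktop disk mentioned above.
p = math.log(10) / 5
for T in (0, 5, 10, 20):
    print(T, round(wash_in(20, p, T)))  # 20 -> 200 -> 2000 -> 200000
```

On this sketch, a 10-20 year lookahead multiplies demand by 100x-10000x, which is the point of the closing paragraph: today's "unchallenged" terabit is only a few doublings away from being the bottleneck again.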