Technology Stocks : LAST MILE TECHNOLOGIES - Let's Discuss Them Here

To: MikeM54321 who wrote (8870), 10/13/2000 7:34:17 PM
From: Frank A. Coluccio (of 12823)
 
Mike, to some degree the best-effort mentality engendered by the web has de-sensitized many users, so the absolute five-nines reliability criterion is not as important to many of today's users as one might imagine. For those applications that demand it, users will pay the extra cost and implement it, or put in the necessary backups to ensure it.

But for the majority of 'net-related purposes where high bandwidth is most useful to the non-financially-impacted user (m-m, entertainment, video-conferencing), momentary glitches are commonplace and more tolerable today than they were just a few years ago. It comes with the territory of the 'net, one could argue. This shift in tolerance, backed by the huge savings that I-R systems purport to offer, may be a primary contributor to uptake in this space. In the end, it could come down to cost-benefit: "making do" with something less reliable versus not making do at all.

A momentary fog overhang, say, might be likened to heavy congestion on the 'net: you wait an extra couple of minutes for that huge download (in the case of I-R, if the I-R system is rate-adaptive), as opposed to receiving it in a flash.
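That "extra couple of minutes" is simple arithmetic. A minimal sketch, where the 10 MB file size and the specific line rates are my own illustrative assumptions, not figures from any particular I-R product:

```python
# Illustrative only: ideal transfer times, ignoring protocol overhead
# and retransmissions. The 10 MB file and the two rates (full T1 vs.
# a fog-degraded, rate-adapted fractional T1) are assumptions.
def transfer_minutes(size_bytes, rate_bps):
    """Ideal transfer time in minutes at a given line rate."""
    return size_bytes * 8 / rate_bps / 60

size = 10 * 1024 * 1024                       # a 10 MB download
t1 = transfer_minutes(size, 1_544_000)        # clear air: full T1
fractional = transfer_minutes(size, 384_000)  # fog: adapted down

print(f"T1: {t1:.1f} min, fractional T1: {fractional:.1f} min")
```

Under these assumed numbers the same download stretches from under a minute to a few minutes: annoying, but a wait rather than an outage.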

Since fog and background-light interference can be tolerated to some extent by downshifting in line speed (there is an inverse relationship between their presence and achievable throughput), I would imagine that those systems that use rate-adaptive schemes would fare better than those that do not.

In my own client experiences, when heavy snowfall or fog was present the client would "manually" (rate adaptation had not hit the I-R space yet, back then) go from T1 speeds down to some fractional T1, or down to a subrate like 56 or 128 kb/s. If the weather disturbance didn't entirely obliterate line of sight, this technique would often allow the client to ride out the storm, albeit at lower speeds.
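That manual procedure is essentially what an automatic rate-adaptive scheme would do. A minimal sketch, assuming a ladder of the rates mentioned above and link-margin thresholds that are purely illustrative, not drawn from any real product:

```python
# Hypothetical rate-adaptation sketch: step down through a ladder of
# line rates until the current link margin (dB remaining after fog or
# ambient-light attenuation) meets the rate's requirement. All dB
# thresholds here are invented for illustration.
RATE_LADDER = [              # (rate label, required link margin in dB)
    ("T1 1544 kb/s", 10.0),
    ("768 kb/s", 7.0),
    ("128 kb/s", 4.0),
    ("56 kb/s", 2.0),
]

def select_rate(link_margin_db):
    """Pick the fastest rate the margin supports, or None if the path
    is effectively obliterated and the link must wait out the fog."""
    for rate, needed in RATE_LADDER:
        if link_margin_db >= needed:
            return rate
    return None

print(select_rate(12.0))  # clear air  -> full T1
print(select_rate(5.5))   # heavy fog  -> subrate
print(select_rate(1.0))   # dense fog  -> link down, wait it out
```

The "dense fog" case at the bottom of the ladder corresponds to the handful of times described below where the client simply had to wait for the fog to lift.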

The end result was longer transmission times, and overtime for the tech control staff. But the mail got through. At other times, when the fog was too dense (which only happened a handful of times, from my recollection), they had to wait it out until it lifted. But only on the longer shots.

Of course, in metro areas where banks and brokerages depend on reliable communications (and where the Fed will hit you with a multi-million-dollar penalty for reporting after deadline), there is no margin for unreliability. But in Silicon Alley, where developers are at play all day long, marveling at their new-media development work coming from their colo or host site, they could stand to wait a few extra minutes at times, or downshift to lower speeds.

FAC