Technology Stocks : MSFT Internet Explorer vs. NSCP Navigator


To: Charles Hughes who wrote (21676), 11/23/1998 11:32:00 AM
From: Gerald R. Lampton
You can definitely lower prices, and benefit the *average* consumer for a time, while absolutely wiping out all competition.

That's right, and that's what Microsoft has been able to do as a result of network effects and "lock in" as described in the government's complaint. It's called a natural monopoly.

I will leave the case that this is ultimately bad for the consumer for another time.

No. I want to hear it now.



To: Charles Hughes who wrote (21676), 11/23/1998 2:02:00 PM
From: rudedog
Chaz -
I think the important distinction is between competitors who get eliminated because they fail to understand the dynamics of their industry (both technical and business), and those who get eliminated by unfair practices and would otherwise have survived to provide better and more innovative products.

In that sense the DOJ's ground was poorly chosen. From the beginning of software, the tendency has been to find the most efficient, lowest-resource path between the function desired and the result. Probably the first real programmable machine was Whirlwind, where 'programming' meant replacing physical changes in the system with an abstract layer that simulated those key control changes. It was of course many years before the further abstraction became a reality: a machine-dependent layer that serviced common requests for hardware, and an independent layer that operated against virtual requests for those services. This was the step that created what McCracken called the 'perfect abstraction' of the programmer's world we know today, where a given piece of logic executes in exactly the same way, every time, given that the state of the machine is known at the beginning of execution.

A good analogy for understanding how the operating system can and should assume the properties of especially powerful applications, and why the process of creating new applications which 'stretch the box' is so important to the development of systems architecture, can be found in the development of database engines.

The notion of a general purpose operating system was well advanced before modern database engines hit the scene. Operating systems not only abstracted access to hardware (disks, consoles etc.) but also presented abstract file systems which organized the information in a generalized way that different applications could use.

But database applications had no use for common data stores - they needed to own their data absolutely. They also needed to know the state of every write to assure data integrity, and the sequence of operations to control journaling. As a result, the early database engines were really mini operating systems. They went around the file systems to get direct control of the disk. They went around the schedulers to assure deterministic sequencing. The resulting systems got the job done, but they were large, and porting the engines was very complex, requiring a deep understanding of every piece of hardware the engines targeted.

Individual UNIX vendors responded to these requirements in a variety of ways, but virtually all of them moved to asynchronous disk access routines which allowed a deterministic handshake between the application and the disks to assure in-order execution, and provided lower level thread management to allow multiple parallel threads to execute concurrently. The result was that more than half of the code in the original database engines moved into the OS.

A secondary effect was that a detailed knowledge of the machine internals was no longer required to write a good database engine, and smaller, faster, more flexible engines, which could be ported more easily (or in some cases required no porting at all) were developed.

The parallel with browsers is this - browsers were originally developed at a time when the world-wide web was a brand-new concept, when the ability to support multi-media constructs and advanced graphics was not really comprehended in operating systems, and, most importantly, when the internet was an 'external thing' that no other applications or operating system components knew about. As a result, browsers implemented all of that interface themselves, and owned access to the net.

But the internet as an abstraction has become more and more a part of every computer operation. Internal and even local web servers now handle help menus, application links and data management, usually invisibly to the user. Increasingly, the modern user interface includes most of the components of redirection, application and device independence, and generalized messaging that were once relegated only to a single arcane external link.

In order for an operating system to host this new class of design, most or all of the components of the early browsers need to be supported for every class of application. Application designers have less need to separate out the class of component, the location, or even the service domain of the resources they intend to use. This is good: it leads to more general application frameworks and smaller, lighter-weight applications.
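That location independence is easy to illustrate with Python's standard library, which absorbed exactly this browser-era machinery: one call resolves file://, http://, or ftp:// URLs alike, so the caller never special-cases local versus remote (a sketch, not a claim about any particular OS's services):

```python
import tempfile
import urllib.request

def fetch(url: str) -> bytes:
    """Open a resource by URL; the scheme (file://, http://, ftp://)
    selects the transport, so callers never care where the data lives."""
    with urllib.request.urlopen(url) as resp:
        return resp.read()

# The same call that would fetch an http:// page reads a local file:
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"hello, local web")
url = "file://" + urllib.request.pathname2url(tmp.name)
```

The application above contains no browser and no network stack of its own; the "browser components" it relies on are just generic services of the platform.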

But the other consequence is that the browser itself becomes just one of many possible windows into this messaging infrastructure, and a more and more trivial one at that. The integration of IE into the Windows environment (which, by the way, was hardly the first such integration: IBM was further along with some OS/2 variants, and Solaris does this as well) makes the browser a minor afterthought with little economic or utility value in itself. Browsers as a separate thing are quickly going to disappear, and the internet as a separate thing has already begun to be subsumed into the general fabric of intranets, extranets, and a host of other service domains.

In this sense, the DOJ is fighting the last war. It doesn't matter what happens to browsers. If any of the participants in the current farce ever get the current business direction or technical issues on the table, that will become obvious. But I don't expect that to happen; all of the tech players on both sides have a lot to lose and little to gain if the discussion moves into that terrain.