To: gnuman who wrote (57824) 6/12/1998 11:32:00 AM From: Harvey Allen
This growing complexity is a symptom of a larger problem. The decisions that the computer industry's main players make are based on several key assumptions that are well over two decades old:

The Uniprocessor Assumption

Today's PC architectures were designed over two decades ago, in an era when microprocessors were expensive. Today, microprocessors are commodities. Yet we're still using operating systems that can run on only one processor, even though multiprocessor configurations would deliver a new level of price-performance. As a result, products deliver less performance than is inherent in the hardware, at prices that are higher than they should be. And even if an application's customer uses a uniprocessor machine, the developer of that application, working many months earlier, would benefit from developing on a multiprocessor machine that delivers performance equivalent to next year's customer's uniprocessor machine.

OS Assumptions: Aging DNA

Many of the capabilities required by digital media simply weren't conceived of when today's PC architectures were created. Place yourself in the role of a systems designer twenty years ago. Real-time, high-bandwidth media? Interactive manipulation of megabytes of data? The Internet? Theories, maybe. Reality, no. The result is a cost and performance drain created by the need to bolt, shove, and cram these new pieces of functionality into architectures that simply can't handle them. Programming models are so riddled with rules and exceptions that new applications take years to come to market and consume ever more memory and processing power. The industry is attempting to deliver the next decade's solutions using architectures designed to solve the last decade's problems.

The Compatibility Assumption

Each year, advances in hardware significantly improve the speed -- and cut the cost -- of computer processors and peripherals. But the end user sees only a small fraction of the increased power delivered by new hardware. Software developers know what's happening: the power delivered by new, faster hardware is being lost in a morass of software overhead created by the "need" for backward binary compatibility. Backward compatibility isn't a bad thing -- but are we getting our money's worth striving for binary compatibility when network and data compatibility are far more important? The standards that increasingly matter today -- HTML, VRML, JPEG, MPEG, MIDI -- don't involve binary code. These are the standards that determine interoperability in a net-connected world.

be.com