To: Boplicity who wrote (3747) 1/2/1999 1:10:00 AM From: ahhaha
The house was one of ill repute. During the '50s its designers tried to create a house of the future. They spent $millions on it, the equivalent of what Gates' house cost now. It was supposed to be automated and convenient. Obviously they didn't have the technology to get anywhere close, but they had enough to make it convincing, for a while.

Then the problems started. When visitors came, various combinations of systems and sub-systems acted erratically. The automation was built from simple feedback control systems, and although each was designed to go it alone if another failed, part of what it did was still predicated on inputs from the downed component. People would be in the automated bathroom and the door would suddenly shut, locking them in. The toaster would shoot projectiles at people who entered the kitchen from the wrong door. The ice cube maker would not stop dumping ice cubes, which fell on the floor, froze it over, and tripped the temperature sensors into raising the heat. The excess heat would then melt the ice cubes and set off the fire alarms and other heat-sensitive devices.

After several years of struggling with this monstrosity, the designers realized that automation works best when systems are interconnected on a limited basis and otherwise kept disjoint, isolated, and organized so they can't be joined. They found that a maximum of three systems per dedicated application, what we would call a network appliance, was the limit of manageable complexity. It wasn't just a matter of storage, logic, processors, and so on; it was endemic in the nature of feedback that each factor contributed in unknown and exponentially complectifying ways. Eventually the house was boarded up, and I believe they tore it down to put something else in the spot. They might have redone it somewhere else on the property, but I'm not up on the final chapter.

Similar experiences can be seen in every area of automation. Windows has this kind of problem. The more you try to link together simple entities, the more complex the superstructure needed to control the aggregate. At some point error has to be introduced simply to keep the complexity from mechanically generating chaos, because error randomly introduces countervailing forces that steady the system complex. It may be that the greatest value of human intellect is its propensity to error. It is the saving grace that you can't build into machines, at least not yet. It would be difficult to program a machine to aperiodically throw a temper tantrum. On the other hand, though we don't program it, we all run Windows, so we know that MSFT is into that difficulty.
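A rough way to see why capping the interconnection at about three subsystems tames things: the number of couplings and joint up/down states a supervising controller has to account for grows combinatorially with the count of linked parts. The sketch below is illustrative arithmetic only, not anything from the house's actual designers; the function name and the simple up/down failure model are assumptions made for the example.

```python
# Illustrative only: count the interaction structure a controller for n
# interconnected subsystems would have to reason about.
from math import comb

def interaction_counts(n):
    pairwise_links = comb(n, 2)   # direct couplings between subsystems
    joint_states = 2 ** n         # every subsystem either "up" or "down"
    return pairwise_links, joint_states

for n in (3, 5, 10, 20):
    links, states = interaction_counts(n)
    print(f"{n:>2} subsystems: {links:>4} pairwise couplings, "
          f"{states:>9} joint up/down states")
```

At n = 3 there are only 3 couplings and 8 joint states to reason about; by n = 20 there are 190 couplings and over a million joint states, which is the "exponentially complectifying" growth the post describes.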