To: David Eddy who wrote (1094) 1/3/1999 1:25:00 AM
From: Cheeky Kid
David, I posted that because he was clearing up some stuff that Yourdon said in his book Fallback. Dick Mills makes GOOD POINTS. The second draft of Fallback has appeared; the title has changed to Time Bomb 2000, and chapter 5 became chapter 3. Here's part:

The reason I'm explaining this here is that Fallback's premise, that Y2K-caused problems might damage generators and other equipment and thus result in outages of a month or longer, is not a reasonable extrapolation. In my opinion, Fallback's statement, "The most likely scenario, in our opinion, is the blackout that lasts for a couple of days; a less likely scenario, but one we feel should not be ignored, is the one-month blackout. Why? Because it could take that long to fix whatever Y2000 problems are discovered in the hours after midnight on December 31, 1999; and it could take that long to restart the system," is unfounded. It shows ignorance of the difference between protection functions and regulating functions.

Here's another:

The book makes one serious misstatement in Chapter 5: "...the computer software for electrical generating units has been written more carefully, and tested more thoroughly, than the business software in most companies. But these systems do have date calculations embedded within them (e.g., to regulate electrical generation or distribution in accordance with traditional hourly, daily, or seasonal variations in demand)." The misstatement is important because it implies that date calculations are at the core of the principles of operation. Such is not the case.

Here's another:

I recall hearing a statement regarding the Apollo moon mission launches. It is true that a single loose wire or a weak rubber O-ring can cause disaster. On the other hand, the Apollo mission was so complex that if every component had been required to work correctly for the mission to succeed, only one launch in five million would have succeeded! It is misleading to emphasize the vulnerability to simple failures while failing to point out the overall robustness. This logical flaw seems common in the way many commentators on the Y2K problem have been approaching it.

Allow me to pick on a typical example from Fallback Chapter 13. There is a federal (US government) standard requiring a car's emissions control system to log any failure conditions of its components, for example a fuel injector being open longer than it is supposed to be due to some dirt in the fuel. The name for the logging system is OBD2 (On-Board Diagnostics 2).

This next is informed speculation. Suppose there is date-time logging for the failure. Suppose that the date routine for some of the software is (surprise) not Y2K compliant. Suppose the failure mode is either to lock up or to refuse to run the car (unlikely, but not impossible). I have seen statements in both directions, that some automotive engine control processors will or will not fail after The Day. It seems likely to me that those who know (because they wrote the software) are probably contractually bound to keep quiet. The rest of us are just guessing.

The supposition is that such a failure has a serious effect on the primary function of the car. The credibility is buttressed by the suggestion that corporate types are engaging in a conspiracy of silence. In many cases (not this one), it ends with an appeal to you to spread the word so that someone in authority might do something about it. The first thing that should strike you about this example is that it precisely follows the recipe for creating a successful urban legend.
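(An aside from me, not Mills: to make the two-digit-year speculation above concrete, here is a minimal sketch of the failure mode he is describing. The function and the fault-log scenario are my own illustration, not anything from real OBD2 firmware, which would be written in C or assembly anyway.)

# Hypothetical Python sketch of a non-Y2K-compliant date calculation.
# Nothing here comes from actual engine-control software.

def years_since_fault(fault_yy, current_yy):
    # Classic bug: only the last two digits of the year are stored.
    return current_yy - fault_yy

print(years_since_fault(98, 99))  # fault in '98, checked in '99: 1, fine
print(years_since_fault(98, 0))   # fault in '98, checked in '00: -98
# Any code that assumes the elapsed time is non-negative (a maintenance
# interval, a log sort, an age check) now misbehaves.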
Mills goes on:

I don't want to be pejorative, but I do want to point this out as one reason why the whole Y2K problem has trouble gaining credibility in some circles. Our anecdotal examples are very hard to distinguish from urban legends.

Here is something else I thought was good:

The logical flaw in the above example, and in many others, is that they implicitly invoke Murphy's Law to predict the behavior of complex systems. That's wrong. Briefly, Murphy's Law says: if anything can fail, it will. It was never intended to apply to forecasts of behavior, but rather as a design ground rule. Mr. Murphy's son recently clarified this in a public statement. He said:

"I would suggest, however, that Murphy's Law actually refers to the CERTAINTY of failure. It is a call for determining the likely causes of failure in advance and acting to prevent a problem before it occurs. In the example of flipping toast, my father would not have stood by and watched the slice fall onto its buttered side. Instead he would have figured out a way to prevent the fall, or at least ensure that the toast would fall butter-side up. Murphy and his fellow engineers spent years testing new designs of devices related to aircraft pilot safety or crash survival, when there was no room for failure (for example, they worked on supersonic jets and the Apollo landing craft). They were not content to rely on probabilities for their successes. Because they knew that things left to chance would definitely fail, they went to painstaking efforts to ensure success." EDWARD A. MURPHY III, Sausalito, Calif.

If Murphy's Law did apply to real complex systems, we never would have had a successful rocket, or airplanes, or computers to program in the first place. They would almost never work.

SEE: albany.net

Am I misreading his commentary?
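P.S. The one-in-five-million Apollo figure is just compounded probability: if a mission needs all n components to work and each works independently with probability p, the chance of success is p^n. A back-of-the-envelope sketch in Python (the part count and reliability are my own illustrative assumptions, not figures from Mills or NASA):

# Illustrative arithmetic only; both numbers are assumed, not sourced.
n = 5000000        # assumed count of components that all must work
p = 0.999997       # assumed per-component reliability (99.9997%)
print(p ** n)      # ~3e-7: success on the order of once in a few
                   # million launches, the same ballpark Mills quotes

The point survives the rough numbers: real systems work because redundancy and design margins mean not every part is single-point-critical, so p^n is the wrong model for a well-engineered system.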