>>>Remember I said this would be a benefit, updating all these old computers to faster computing power able to run high speed Internet connections. It's going to make business more effective and help the Internet grow at a faster pace.<<<
While someone might infer that updating old computers would make business more effective, as with any upgrade there are growing pains. Programmers and users must be trained on the new features, and workarounds must be found for any lost functionality. It will be some time before these improvements find their way into the applications.
When I decided to start working on Y2K projects about 3 years ago, this is one of the "Y2K myths" that I had to see for myself. I was hoping that management would use Y2K to help clean out the "dead wood" that exists in most IT systems. Unfortunately, it isn't happening. On the four Y2K projects that I have worked on, not one has implemented procedures to maintain the system and application inventories built for the effort. They were used once for the Y2K projects, then shelved.
I addressed some of the optimistic viewpoints in an earlier posting:
Message 11094567
Also, from: cnnfntech.newsreal.com
<<<...computer systems and applications have often evolved in isolation; corporate leaders often don't understand the technology they use, relying on "techies" who have different loyalties and who often don't talk even to one another.
Meanwhile, those dispersed and varied systems are tied to one another with ad hoc standards, roughly comparable to the different plug adapters used to make a hair drier work in a foreign hotel.
There often has been little oversight of technology departments within organizations. And computer departments have always been notoriously slow to meet deadlines, Kappelman said.
But the mammoth challenge to find the "bug" has changed all that, or at least so it appears, Kappelman said.
Top executives are finally getting a handle on the technology they use. Federal, state and local governments are engaged in a long-overdue housecleaning of their technology, and the public is becoming acutely aware of the ubiquitous nature of computers.>>>
Maybe Kappelman believes that the mammoth challenge of Y2K has resulted in a change in human nature, but it is not what I am seeing in the field. Computer systems and applications still evolve in isolation, corporate leaders still don't understand the technology they are using and "techies" still seldom talk to one another.
It seems to me that most Y2K projects set their scope to a size that can be accomplished on time while still trying to be "all encompassing". My biggest concern with all of the organizations that say they have completed their Y2K projects (or will complete them by December) is: what did they miss?
My current client is still doing Y2K time machine testing and still finding bugs. Just this past week a Y2K leap year bug was found in our remediated code. This code was deemed to be Y2K compliant and put back into production almost 2 years ago.
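I can't post the client's code, but the shape of this kind of bug is the same in any language. Here is a rough sketch in Python of the two classic leap year mistakes that only surface in 2000; the two-digit field and the 1900 expansion are my own illustration, not the client's actual logic:

    def is_leap_year_buggy(yy):
        # Hand-rolled leap year check on a two-digit year field (yy = 0..99).
        # Two classic mistakes, and both stay hidden until the year 2000:
        #   * the two-digit year gets "19" bolted on, so "00" becomes 1900
        #   * the century rule is half-remembered: "divisible by 100 is not a
        #     leap year" without the "divisible by 400 is" exception
        # Either way, Feb 29, 2000 is rejected as an invalid date.
        year = 1900 + yy                  # "00" silently becomes 1900
        if year % 100 == 0:               # missing the % 400 exception
            return False
        return year % 4 == 0

    def is_leap_year_fixed(year):
        # Correct rule on a four-digit year: check /400, then /100, then /4.
        if year % 400 == 0:
            return True
        if year % 100 == 0:
            return False
        return year % 4 == 0

    print(is_leap_year_buggy(0))     # False -- Feb 29, 2000 gets rejected
    print(is_leap_year_fixed(2000))  # True  -- 2000 really is a leap year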
The programmers and managers were confident that they had fixed all of their Y2K bugs. The remediated code had passed its regression testing and Y2K testing. It wasn't until we "aged" our data and started running Y2K time machine testing that the hidden, subtle bugs began to appear.
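For those who haven't seen the technique, "aging" just means shifting the dates in a copy of production data forward so they straddle the century boundary before the system is run with its clock set ahead. A simplified sketch of the aging step; the two-year offset and the leap day handling are illustrative only, not the actual procedure we run:

    from datetime import date

    AGE_BY_YEARS = 2  # chosen so the test data straddles 1999-12-31 / 2000-01-01

    def age_date(d, years=AGE_BY_YEARS):
        # Shift a date forward by a fixed number of years for time machine tests.
        # Feb 29 needs care: the aged year may not be a leap year, so fall back
        # to Feb 28 rather than aborting the whole data load.
        try:
            return d.replace(year=d.year + years)
        except ValueError:               # Feb 29 landing in a non-leap year
            return d.replace(year=d.year + years, day=28)

    # e.g. a 1998 transaction becomes a year-2000 transaction for the aged run
    print(age_date(date(1998, 3, 15)))   # 2000-03-15
    print(age_date(date(1998, 2, 28)))   # 2000-02-28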
In reviewing the code, I noticed that the problem program was using a windowing period different from the one used in the programs that I modified. It appears that different programmers chose different windowing periods to make the code Y2K compliant. Just another example of "techies" not talking to one another, systems and applications operating independently of one another, and management failing to grasp the size and complexity of IT systems.
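For anyone not familiar with windowing: the two-digit year stays in the data, and each program decides the century by comparing it to a pivot value. The pivot is a per-program choice, which is exactly where the inconsistency comes from. A small sketch (the pivot values here are made up, not the ones in our code):

    def expand_year(yy, pivot):
        # Windowing: two-digit years below the pivot are 20xx, the rest 19xx.
        # Each program that was remediated separately picks its own pivot.
        return 2000 + yy if yy < pivot else 1900 + yy

    # Program A was remediated with a pivot of 30, program B with a pivot of 50.
    # They agree on most years...
    print(expand_year(5, 30), expand_year(5, 50))    # 2005 2005
    print(expand_year(97, 30), expand_year(97, 50))  # 1997 1997
    # ...but silently disagree in the overlap, e.g. a year of "45":
    print(expand_year(45, 30), expand_year(45, 50))  # 1945 2045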
Organizations that do not do Y2K time machine testing are going to have a lot of unexpected problems start occurring next year. I hope that they exercise due diligence in reviewing the output from these systems next year, but I wouldn't bet on it.
B.K.