NEW REPORT AVAILABLE Y2KNEWSWIRE.COM has just completed a new report, available free of charge, entitled "Y2K Frequently Asked Questions." You can access the report at: y2knewswire.com
90% OF COMPANIES MISSING Y2K DEADLINES Cap Gemini, a large player in the financial services automation industry, has been tracking the Y2K progress of 127 public and private companies. Last week, it announced that 90% of the companies being tracked had missed Y2K-related deadlines. The trend is looking ugly: the figure was only 78% in April, and 44% have already experienced Y2K-related disruptions of some sort.
The story goes on to say: Local organisations have also seen deadlines blow out as the full scale of their projects is revealed, according to the Y2K industry program chief executive officer, Mr Graeme Inchley. "It is going to be tight right up until the final deadline as companies realise it is going to take much longer than first expected," Mr Inchley said.
"Blow out" is the right phrase here, of course, because that's what we're going to see approximately a year from now. Companies and federal agencies are currently making baseless assumptions about large-scale software remediation timelines, and as with all large software projects, the spokesperson can claim everything is on track right up until the day it needs to work: and then, there's a "sudden and unforeseen failure..." More on this below, with cited evidence.
Link at: afr.com.au
CONTINGENCY PLANNING For an excellent article on the need for contingency plans, visit: it.fairfax.com.au
And then remember, the IRS has no contingency plans.
SURVIVALISTS AWAIT Y2K This Associated Press story appeared yesterday in the North Carolina "News & Observer." It's worth a read: news-observer.com
GOVERNMENT MISSES THE GRADE The federal government's Year 2000 Progress chart, updated in September, shows us who's behind schedule and who isn't. Let's look at the good news first:
WHO MAY MAKE IT:
Social Security Administration
Small Business Administration
Dept. of Commerce
Environmental Protection Agency
FEMA
WHO IS IN THE "MAYBE" CATEGORY:
NASA
Dept. of Agriculture
Dept. of Treasury (includes the IRS)
Dept. of Transportation
WHO WON'T MAKE IT:
Dept. of Defense
Dept. of Labor
Dept. of Interior
Nuclear Regulatory Commission
HHS (Health and Human Services)
Dept. of Energy
Dept. of State
Dept. of Justice
Dept. of Education
In fact, the Dept. of Justice is scheduled to be compliant no later than the year 2030!
Remember, this is the claim by the federal government. Given that most large-scale software projects fall behind schedule by at least six months, and that various government agencies have proven their inability to tell the truth about Y2K compliance, we must always remain skeptical and vigilant about claims of compliance.
WHERE DO THE FIGURES COME FROM? Here's the scary part. The "grade" given to these agencies is based on information supplied by the departments themselves! To quote the report, "The primary determinant of grades is Mission-Critical Systems - specifically, the estimated completion date based upon agency self-reported current rate of progress."
The key phrase here is "self-reported." Effectively, this is equivalent to allowing a classroom of high school students to assign their own grades. "Yeah, we deserve an A+!" Any study based on self-reported figures should automatically be suspect.
But that's part of the problem: no agency runs around verifying compliance claims. So through 1999, as we begin to hear proclamations of compliance from both the private sector and government agencies, we have absolutely no way of knowing whether we're being lied to. And in many cases, as we've already seen, deceit is the default. This is unfortunate, because there will undoubtedly be some companies, and even a few government agencies, that are fully (internally) compliant, and yet trusting their compliance pronouncements will be extremely difficult.
BUT INTERNAL COMPLIANCE IS MEANINGLESS Remember, however, that isolated compliance is worthless. Unless the data interchanges are compliant, and unless the majority of the other systems in society are compliant -- including power, banking, and telecommunications -- isolated, internal compliance does no good. For example, suppose FEMA declares and verifies Y2K compliance sometime in 1999. Does that mean FEMA will then function at 100% after January 1, 2000? Of course not; not unless the rest of society is running at near-perfect levels as well. FEMA can't do much of anything without telecommunications and transportation.
HOW TO KNOW WHEN THEY'RE REALLY DONE No company is really done with Y2K repairs until they fire the programmers. That's right. If you hear a company claiming Y2K compliance, immediately ask, "Did you fire the programmers?" Because if they are TRULY finished, tested, implemented, and fully confident of the repairs, there's no chance those programmers will stay on the payroll (unless they're moved to a different project).
When a company declares Y2K compliance and doesn't fire the programmers (or move them back to non-Y2K projects), you can correctly assume they are lying. In fact, this is the best external compliance check we've come across.
TAKE A LOOK AT SOCIAL SECURITY The SSA began Y2K repairs in 1989! They had almost 400 programmers on staff, fixing millions of lines of code. In six years, they finished almost six million lines of code. The SSA is the "success story" of the Y2K repair debate. They started early, they hired the resources, and they plan to finish all critical systems by March of 1999. (Remember, though, they're STILL not finished!)
Now figure this: if the SSA took ten years to fix around 60 million lines of code, what makes these other government agencies think they can do it in two or three years? Most agencies didn't even begin repairs until 1997 or 1998, and many have a lot more code than the SSA.
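At the SSA's own pace, the arithmetic is grim. Here's a back-of-the-envelope sketch in Python, using only the SSA figures quoted above; the 30-million-line agency, the two-year deadline, and the linear scaling are our illustrative assumptions, not figures from any report:

```python
# Back-of-the-envelope, using the SSA figures quoted above.
# Assumption: repair rate scales linearly with headcount, which
# in practice it does not -- adding programmers adds overhead.
ssa_lines = 6_000_000        # lines of code the SSA finished
ssa_years = 6                # elapsed years for those repairs
ssa_programmers = 400        # programmers on the SSA project

rate_per_programmer_year = ssa_lines / (ssa_years * ssa_programmers)
print(round(rate_per_programmer_year))  # 2500 lines per programmer-year

# A hypothetical agency starting late, with 30 million lines
# and only two years left before the deadline:
agency_lines = 30_000_000
deadline_years = 2
programmers_needed = agency_lines / (rate_per_programmer_year * deadline_years)
print(round(programmers_needed))  # 6000 programmers, at the SSA's pace
```

And even this is optimistic: adding programmers to a late software project adds coordination overhead, so the real requirement would be higher still.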
Apparently, the "unlimited compression" theory mentioned yesterday is alive and well at federal agencies. They seem to think any process can be squeezed into whatever time is remaining. Imagine the questioning by a reporter:
Reporter: How long will it take your agency to finish Y2K repairs?
Agency spokesperson: Ummm... how long do we have left?
Reporter: Fourteen months.
Agency spokesperson: Ummm... yes, that's it. Fourteen months.
Reporter: The repairs will take fourteen months? Exactly fourteen months?
Agency spokesperson: Yep. Fourteen months. That's our schedule. I believe we will be fully compliant in fourteen months, and I'd like to add that public safety is our number one priority.
Reporter: Shouldn't Y2K be your number one priority?
Agency spokesperson: Well, that too.
.. and so on.
Link at: freedom.house.gov
TESTING DOESN'T MEAN THE REPAIRS ARE FINISHED Y2K remediation has often been described in "phases." You typically hear:
planning and assessment
repairing the code
testing the code
implementing the code
Problem is, this is a gross oversimplification. Understanding what actually happens in a large-scale software project is critical to understanding why the timelines offered by government agencies are entirely inadequate.
First of all, you never know if the code was "repaired" correctly until you test it. So when you enter the testing phase, you are going to -- without a doubt -- return to the repairing phase... probably hundreds (or thousands) of times.
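To see why testing inevitably sends you back to the repair phase, consider a common Y2K fix known as "windowing," which interprets two-digit years relative to a pivot. The sketch below is hypothetical -- the function name and the pivot value of 30 are our assumptions, not anything from the agencies' reports:

```python
# A hypothetical "windowing" repair: interpret a two-digit year
# relative to a pivot. The pivot of 30 is an assumed choice.
def expand_year(yy, pivot=30):
    """Interpret a two-digit year: 00-29 -> 2000s, 30-99 -> 1900s."""
    return 2000 + yy if yy < pivot else 1900 + yy

# Casual checks pass, so the "repair" looks finished...
assert expand_year(98) == 1998
assert expand_year(5) == 2005

# ...but a birth-date field from 1925 is now silently wrong:
print(expand_year(25))  # 2025 -- a defect only real testing will catch
```

The repair "works," passes its obvious checks, and still corrupts data -- which is exactly why the testing phase loops back into the repair phase.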
Remember, too, that this testing takes place in a simulated environment. At this point in the game, the system hasn't even been implemented in the "real world."
After the software works fine in the simulated testing environment, it is "implemented." This means replacing the "old" software on the live computer systems with the "new" software. This act alone will cause disruption: you can't replace the software without bringing down the systems. So Y2K "experts" who claim there will be no disruptions are simply clueless. The disruption may be minor, but it is very real.
Now you have the system implemented. It's up and running on the live hardware, and now supposedly crunching numbers in the real world. Now, if the testing environment was a perfect rendition of the real world, the new software will work perfectly. But in almost all cases, software that is implemented in the real world -- even after it has been rigorously tested -- still needs to be tweaked. There will be overlooked circumstances related to data, processing time, or hardware differences. Or the testing environment might have been a poor representation of the real-world operating environment.
In almost all cases, even after the software is implemented, it must continue to be repaired and tested. Thus, you don't have isolated phases; you have a loop. You repair, test, implement. Then you find a bug and repeat the whole process.
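The loop described above can be sketched as a toy model. Everything here -- the function name, the defect counts -- is an illustrative assumption, not data from any agency:

```python
# Toy model of the repair/test/implement loop described above.
# Counts full cycles until production stops surfacing new bugs.
def remediate(defects_in_lab, defects_only_in_production):
    cycles = 0
    while defects_in_lab or defects_only_in_production:
        cycles += 1
        defects_in_lab = 0                    # repair + lab testing clears
        if defects_only_in_production:        # implementation on live systems
            # real-world data exposes bugs the lab never saw,
            # sending the project back into the repair phase...
            defects_in_lab = defects_only_in_production
            defects_only_in_production = 0
    return cycles

print(remediate(12, 3))  # 2 -- lab bugs alone take one pass; real data forces another
```

The point of the model: declaring victory after the first pass counts the lab bugs but not the ones only production will reveal.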
In a large-scale software project, this whole process is enormously complex. Doing it right requires years , not months. This is how software really works.
AND HERE'S THE BOTTOM LINE: When companies and government agencies begin shouting, "We're compliant!" in 1999, you can be sure that in most cases, they will be announcing their completion of the FIRST round of code repair and testing. In most cases, they will not have implemented the systems in the real world, and they will not have been through the repair, test, and implement procedure more than once.
As a result, when you hear, "We're compliant!" ...you'll know they've only begun the real process of replacing the live software and letting the fixes churn on data in the real world.
WHAT EVIDENCE DO WE HAVE TO BACK THIS UP? Just yesterday, we released a report about how the FAA's new computer systems, designed to combat Y2K problems, are causing havoc over O'Hare International Airport in Chicago. The software is miscalculating airplane speeds and positions, and in more than one instance, pilots have had to take evasive action. The link was at: chicagotribune.com
You can bet this system looked "just fine!" when it was in the development phase. The code looked good, the tests worked fine. But then it hit the REAL WORLD. The real world is not some testing lab. It is not some software simulation of a mainframe computer with a random data set. The real world is unpredictable and inconsistent, and as the FAA is finding out, their new systems simply aren't compliant with the real world.
(Yet, that doesn't stop the FAA from claiming the system is "certified as safe!")
Y2KNEWSWIRE also reported a few weeks ago on a successful Y2K test of a Tomahawk missile. The test, which was conducted in a software-only format in a laboratory somewhere, was heralded as a demonstration of the military's ability to conquer the Millennium Bug. As Y2KNEWSWIRE pointed out, this test would be wonderful if wars were fought in simulated environments, but real missiles have to guide themselves in the real world, over real terrain. And until you actually launch missiles after January 1, 2000, you really don't know whether they're going to work.
Furthermore, most military missiles rely on the GPS system for guidance, and GPS works fine in 1998. But on August 22, 1999, the Global Positioning System is going to lose 1,024 weeks when its 10-bit "week" counter rolls over from 1023 back to zero. So any test of any GPS-dependent missile in 1998 is almost meaningless. How will those missiles fly in 2000 when a naive receiver thinks it's actually 1980? Their little missile-brains will be thinking, "Gee, these stars look funny..."
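The rollover is easy to sketch. GPS counts weeks since January 6, 1980 in a 10-bit field, so the broadcast week number wraps after 1,024 weeks. Here's a minimal Python sketch -- a simplified model of a receiver that naively assumes the counter never wrapped, not real receiver firmware:

```python
from datetime import date, timedelta

GPS_EPOCH = date(1980, 1, 6)   # start of GPS time
WEEK_BITS = 10                 # the broadcast week counter is 10 bits wide
ROLLOVER = 2 ** WEEK_BITS      # so it wraps after 1024 weeks

def broadcast_week(d):
    """Week number as transmitted: true week modulo 1024."""
    true_week = (d - GPS_EPOCH).days // 7
    return true_week % ROLLOVER

def naive_decoded_date(d):
    """Date a receiver infers if it assumes the counter never wrapped."""
    return GPS_EPOCH + timedelta(weeks=broadcast_week(d))

rollover_day = GPS_EPOCH + timedelta(weeks=ROLLOVER)
print(rollover_day)                          # 1999-08-22: first day of week "0"
print(naive_decoded_date(date(2000, 1, 1)))  # 1980-05-11: where a naive receiver thinks it is
```

A receiver that simply trusts the raw week counter on January 1, 2000 computes a date back in the spring of 1980 -- which is why lab tests run in 1998, before the rollover, prove so little.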
Be extremely wary of claims of compliance on systems that have not yet met the real world. Because in the end, only real-world implementation counts. Everything else is hype.
- Webmaster alert@y2knewswire.com