Technology Stocks : Zitel-ZITL What's Happening

To: Steven M. Kaplan who wrote (1282) 11/29/1996 5:19:00 PM
From: Steven M. Kaplan   of 18263
 
A Comparison of Procedural and Data Change Options for Century Compliance
by Andrew Eldridge, CBSI and Bob Louton, GTE

Many data processing organizations are beginning to deal with the issue of the
Year 2000 challenge within their business applications. It has become clear that
"millennium compliance" (sometimes called "millennium conversion", "century
compliance", or "the Year 2000 Problem") is one of the largest maintenance tasks
ever faced by an IS group. Every program, application, database, and line of
code is potentially affected. Since it is a crucial issue in the survival of
computer-dependent businesses and organizations, it is also one that must be
managed rather than simply delegated.
At the technical level, there are fundamentally two options for changing software
to prepare it for the new millennium. These are the "procedural" solution, and the
"data" solution. For those organizations still analyzing the problem, the choice of
the approach is an important decision which will impact every aspect of the
process, from the impact analysis, through to the conversion. Each approach has
its strengths, and the chosen approach will differ for each organization and
possibly each application. This article sets out to examine the issues, and evaluate
the strengths and weaknesses of each approach.

Defining The Issues

Firstly, it is necessary to define the Year 2000 issue. (Those of you familiar with
the basic issues may wish to skip the next two paragraphs.) Software applications
frequently manipulate dates represented by only 2 digit year values, e.g., 96 for
1996. When working with dates only from the 20th century, year arithmetic and
comparisons yield correct results, e.g., 20 years before the current year, 1996,
can be calculated by subtracting 20 from 96, yielding 76. Similarly, in a
comparison, 76 (representing 1976) is less than 96 (representing 1996), as we
would expect.
Now, let's say the current year is 2003. If we perform the same arithmetic,
subtracting 20 from the 2 digit year yields -17, not 83 as it should. Also, although
we know 83 (representing 1983) is an earlier year than 03 (representing 2003), in
a comparison 83 is NOT less than 03. Programs with these kinds of logic and
calculations will malfunction as systems begin processing dates for years beyond
1999. Strictly speaking, this mathematical limitation is the extent of the Year 2000
issue. It relates to the fact that data has "rolled over" into the next century.
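
To make the arithmetic concrete, here is a minimal sketch (written in Python purely
for illustration; no code appears in the original article) of the same subtraction and
comparison working for 20th century years and failing once the years roll over:

    # 2 digit year arithmetic: correct while both years fall in the 20th century...
    current = 96             # 1996
    print(current - 20)      # 76   -> 1976, as expected
    print(76 < current)      # True -> 1976 is earlier than 1996

    # ...but wrong once the 2 digit years roll over into the 21st century.
    current = 3              # 2003 stored as the 2 digit year 03
    print(current - 20)      # -17, not 83
    print(83 < current)      # False, although 1983 is earlier than 2003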

Data May Become Ambiguous

A secondary issue which arises with 2 digit years is that the data may ultimately
become ambiguous. For example, let's take a license expiry date, and assume that
the license expiry can never be more than 10 years in the future. If the current
year is 96, then an expiry year of 01 could only be interpreted as 2001, and the
century has therefore been inferred without ambiguity. If we kept a history of
these expiry dates however, and we kept the history for more than 100 years,
ultimately we wouldn't be able to tell the difference between 01 representing 2001
and 01 representing 2101. With historical data this may become an issue, and in
these cases there is no choice but to record the additional information, i.e., the
century digits, in order to make the data meaningful. The loss of meaning in the
data occurs when newly stored historical data contains the same values as dates
100 years before or after. If the application's earliest historical 2 digit year is 65
(for 1965), this won't be a problem until the year 2065 needs to be stored in the
same database, i.e., it does not have the same urgency as the Year 2000
problem. It is necessary to distinguish the issue of ambiguity caused by 2 digit
years from the Year 2000 rollover effect caused by 2 digit years.
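
A common procedural remedy is to infer the century from a window derived from a
business rule. The sketch below (a hypothetical Python illustration of the license
expiry example above, assuming expiry can never be more than 10 years in the
future) shows one way the century could be inferred without ambiguity:

    def infer_expiry_year(two_digit_year, current_year):
        # Expand a 2 digit expiry year, relying on the business rule that an
        # expiry date is never more than 10 years after current_year.
        century = current_year - (current_year % 100)
        candidate = century + two_digit_year
        if candidate < current_year:
            candidate += 100          # an already-passed year must be next century
        if candidate > current_year + 10:
            raise ValueError("expiry year outside the 10 year window")
        return candidate

    print(infer_expiry_year(1, 1996))   # 2001, not 1901
    print(infer_expiry_year(99, 1996))  # 1999
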
The terms "century compliant" and "millennium compliant" have been used to
describe applications which do not suffer processing errors when years from the
21st century are processed within the application. The important observation is
that it is possible to permanently correct an existing application either by changing
the data to include century digits in the years (the "data" approach), or by
changing the processing to deal with the century rollover complexities of 2 digit
years (the "procedural" approach). In other words, century compliance does not
mean necessarily that century digits must be explicit or that dates should follow a
specific format. For practical reasons, century compliance means simply that
Y2K events will not occur, however the software is designed or changed to
achieve this. Naturally, a 4 digit year is one means toward this end. Yet many
applications can operate successfully across the century change by inferring the
century from 2 digit years. When performed correctly, either approach results in
the desired century compliance, and the argument about whether the data was
wrong or the code was wrong is a moot point. Thus a definition of compliance
should permit both the "data" and "procedural" approaches.

Arguments supporting each approach typically presume that only one can be the
right answer. Supporters of the data solution will argue that the mathematics were
always correct, but the data was incomplete. Supporters of the procedural
solution will counter that the century is usually inferable and that the code should
have been written better. It matters little whether you determine that the bridge
was too low, or the load was too high, when altering either or both will get you
past the obstacle.
The choice between the data and procedural approaches should be a cost-benefit
trade-off rather than dogmatic adherence to either. Each offers advantages and
disadvantages. Rarely would either be categorically favorable for all of an
organization's software. Addressing the Year 2000 challenge is fundamentally a
software maintenance activity of worldwide proportions, and the optimal
approach for each application must take into account the condition of that specific
software and its role in the business process.

Triage: Priorities For Compliance

As the year 2000 approaches, companies will experience increasing pressure to
ensure that their systems will operate across the century change. Priorities may
change, and a more expedient approach to century compliance may become the
only solution. An IS department should have the following priorities in decreasing
order of importance:

1-Keep the company in business
2-Keep mission critical computing resources functioning and producing
correct results
3-Make all computing resources century compliant
4-Ensure that new computing resources are century compliant
5-Make all computing resources compliant with industry standards

This priority list suggests that if your organization has the time and money, the
computing resources should be made compliant with standards such as ANSI
X3.30 or ISO 8601. Other organizations might only be able to perform selective
century compliance on mission critical systems.

Myths About the Data Approach

If you change the data, you won't need to change the source code
At the very least you will need to change data declarations and recompile.
If the compiler or other horizontal software such as the DBMS or CICS is
now at a higher version than when the software was last compiled, the
software may operate differently than before. In addition, you will need to
remove special logic to deal with century rollover of 2 digit years. By far
the largest impact is reworking all the special year manipulation logic which
moves 4 digit years to 2 digit data items and vice versa. Furthermore, new
logic will be required on user interface screens to handle the entry and display
of 4 digit years.
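
As a hypothetical illustration (Python, not from the article) of the kind of year
manipulation logic that must be reworked under the data approach, consider code that
pads a hardcoded century onto stored 2 digit years, and the new expansion logic that
screens still accepting 2 digit input would require (the pivot value of 50 below is an
assumption made only for this example):

    # Before the data change: stored years are 2 digits, so display logic
    # pads on a hardcoded century. This logic must be found and removed.
    def format_stored_year(stored_year):
        return 1900 + stored_year

    # After the data change: stored years carry the century, but screens that
    # still accept 2 digit input need new logic to expand what the user types.
    def accept_user_year(typed_year, pivot=50):
        if typed_year >= 100:
            return typed_year                # a full 4 digit year was entered
        return 2000 + typed_year if typed_year < pivot else 1900 + typed_year

    print(accept_user_year(3))     # 2003
    print(accept_user_year(76))    # 1976
    print(accept_user_year(1996))  # 1996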

There will be no messy code required to infer the century from a 2
digit year

OS/VS COBOL and VS COBOL II do not support 4 digit years in the
system date, and applications accessing this date will still need to infer the
century. (This is a special case of internal bridge logic.) Bridge programs
accepting input from external systems may need to infer the century. If
users are allowed to enter 2 digit years, the century needs to be inferred
here also.

Changing to 4 digit years will eliminate all logic errors

Applications can contain date arithmetic, such as subtracting 1 from the
year portion of a Gregorian date. Changing to 4 digit years corrects the
problem of a negative result in the year 2000, but will not address the more
subtle condition caused by subtracting 1 from the year when the date is 29
February in a leap year.
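
A small Python sketch (added here for illustration; it is not part of the original text)
of that subtler condition: even with 4 digit years, naively subtracting 1 from the year
of 29 February produces a date that does not exist, so the logic needs an explicit
fallback:

    from datetime import date

    def one_year_earlier_naive(d):
        # Fails for 29 February: date(1999, 2, 29) does not exist.
        return d.replace(year=d.year - 1)

    def one_year_earlier_safe(d):
        try:
            return d.replace(year=d.year - 1)
        except ValueError:
            return d.replace(year=d.year - 1, day=28)   # fall back to 28 February

    print(one_year_earlier_safe(date(2000, 2, 29)))     # 1999-02-28
    # one_year_earlier_naive(date(2000, 2, 29))         # would raise ValueError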

Changing the data means fixing the problem once, whereas fixing
the code means fixing every program that accesses the data

Unfortunately, the data is key, and changing its structure has a large ripple
effect. When the data changes, so too must its copybooks. If the record is
referenced in a sort, the control statements may need to change. If the
record is stored in a VSAM file, the DEFINE must change, and so on.

Arguments For the Data Approach

1-Requires an easier code upgrade effort, and results in simpler date logic in
programs.
2-Eliminates 2 digit year ambiguity and the year 2000 rollover issue
simultaneously.
3-Eliminates overhead of documenting complex date logic, and hardcoded
century values.
4-As more and more interfacing systems carry 4 digit years, bridging
requirements decrease. Data is more portable when the majority of other
applications use 4 digit years.
5-Enables a consistency of approach with newly developed century
compliant systems.

Arguments Against the Data Approach

1-Changing data structures impacts the whole system: programs, copybooks,
sysins, jobs, DCBs, sorts, VSAM defines. Portions of the system with no
direct date functionality are also impacted, requiring recompiling and testing.
2-Flow-on effect to data warehouses and other duplicated data.
3-Every recompiled program must be tested, or the whole system must be
tested.
4-Data conversion process can be costly. Archived data may require
conversion or support by special logic. The cut-over is complex, potentially
needing to support both the new and old data formats simultaneously.
5-Increases space required and may impact processing volumes and speeds.
6-Exposes the system to a significant change with associated risk. The look
and feel of the system is changed.
7-Increases the complexity of configuration management with parallel
production enhancements.
8-During system testing and/or cut-over, disk requirements may peak at
twice the production data volume.
9-Test requirements are as for a major system change, and may require
special considerations such as dedicated test regions.