>> The MatriDigm Code Analyzer, the foundation for this MatriDigm Conversion Service, captures the complete business rules and data interactions of each application in the MatriDigm Relational Hyper-linked Database. This allows MatriDigm to analyze applications as a whole; as a result, MatriDigm is able to employ its unique data identification technique to find virtually all of the date fields and logic within an entire application. <<
REACTION: No systems analyst can lay out all the business rules within an application, yet Matridigm claims these can be captured just by scanning through the code ... but programs work through logical definitions; two fields FIELD-A and FIELD-B are meaningless until execution time, when input data determines their usage ... and I don't think Matridigm uses the actual physical files in its scanning process ... the other problem with business rules is that when you're dealing with code, you deal with switches, and with each switch you get an exponential number of possible combinations ... I seriously doubt this much detail can be captured ...
<< MatriDigm's recommended Year 2000 solution uses a set of MatriDigm Packed Binary subroutines to detect and process date fields at execution time, providing a unique 2-byte representation of a 4-digit century date field. (MatriDigm's solution also allows the expansion of date fields to a traditional 4-byte, 4-digit zoned decimal date field if desired.) This novel 2-byte "packed binary" approach enables both converted and non-converted applications and data files to coexist in the same production >>
REACTION: I can't believe they will try to patent a programming technique that's been used for years by programmers ... Often dates are stored as either character or packed data ... What Matridigm is proposing, I believe, is to append a century to a year field during execution and, for the purposes of storing it in the same number of bytes, manipulate it as 'packed binary', which is essentially nothing more than converting the century-year, e.g. '2015', to packed decimal format X'02015F', multiplying by 10 to get X'20150F', then dropping the last byte to get X'2015', a two-byte representation ... This technique is not at all 'novel' and I seriously doubt it would stand up to a patent challenge ... as for the date routines ... it will take any programmer worth his salt at most a month to come up with these; after all, these routines will do nothing more than validate dates, convert them to other formats, compare dates, and compute differences ...
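For illustration, the pack-multiply-truncate trick described above can be sketched in a few lines. This is my own reconstruction under the stated assumptions (the function names `encode_year`/`decode_year` are hypothetical, not MatriDigm's); the net effect is just storing each decimal digit of the year in one nibble:

```python
# Sketch of the 2-byte 'packed binary' century-year described above.
# Packed decimal of 2015 is X'02015F' (trailing sign nibble F);
# multiplying by 10 shifts the digits left, giving X'20150F';
# dropping the last byte leaves X'2015' -- one digit per nibble.

def encode_year(year: int) -> bytes:
    """Encode a 4-digit year, e.g. 2015, into the 2 bytes X'2015'."""
    if not 0 <= year <= 9999:
        raise ValueError("year must fit in 4 digits")
    return bytes.fromhex(f"{year:04d}")   # '2015' -> b'\x20\x15'

def decode_year(b: bytes) -> int:
    """Recover the 4-digit year from its 2-byte packed form."""
    return int(b.hex())                   # b'\x20\x15' -> '2015' -> 2015

print(encode_year(2015))                  # b' \x15', i.e. X'2015'
print(decode_year(encode_year(1998)))     # 1998
```

As the sketch makes plain, the result is ordinary binary-coded decimal, which supports the point that the technique is hardly novel.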
<< Speed < the MatriDigm system can find, fix and test code at least 20 times faster than any other known approach. This means that customer code maintenance can be frozen for a very short period of time, while MatriDigm completes the year 2000 code conversion, thus eliminating the need to reintegrate continuing code maintenance. >>
REACTION: Where did they get the 20X ??? Has there been an independent study done on this?
<< Flexibility < both converted and non-converted applications, programs and data files can coexist in the same production system, enabling a step-by-step conversion process if desired. This is facilitated by the MatriDigm Date Subroutines which, during program execution, identify converted and unconverted date fields. Unconverted fields are converted to the MatriDigm "packed binary" format >>
REACTION: This approach is being used now ... all it really means is that, since files are not expanded, programs that share the same file do not have to be converted at the same time ... this is not a unique solution ... One problem with files not being converted/expanded is that when dates are used as key fields, they affect the sorting sequence. I don't see any mention of Matridigm converting files, which I think is really better done in house because of security problems, physical data structure problems, and timing considerations (you obviously don't want to send them files for conversion if those files are being used in real time). So I believe the Matridigm approach will not work for the scenario above.
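The key-sequencing concern is easy to demonstrate. A minimal sketch (my own example, not MatriDigm code): unexpanded 2-digit year keys sort incorrectly across the century boundary, while expanded 4-digit keys do not.

```python
# Unconverted 2-digit year keys: 1998, 1999, 2000, 2001.
two_digit_keys = ["98", "99", "00", "01"]
print(sorted(two_digit_keys))     # 2000 and 2001 wrongly sort first

# Expanded 4-digit keys sort in true chronological order.
four_digit_keys = ["1998", "1999", "2000", "2001"]
print(sorted(four_digit_keys))
```

The same correct ordering would hold for the 2-byte packed century-year, since its bytes compare high-order first; the problem arises only for files left unconverted, which is exactly the coexistence scenario being sold.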
<< Automation < the MatriDigm Date Modifier employs a rules based approach to modifying the code - in no instance does a programmer modify the customer's source code directly. If an exception is encountered, the rules are modified and the source code is rerun through the system. >>
REACTION: Every rule has an exception, including this rule. At some point, the rules will be so complicated, they ought to write a program.
<< Testing --- not only is the date identification and conversion process automated, saving months of time and avoiding a tremendous number of errors; the MatriDigm system also uses an automated and proprietary testing process to ensure that converted code functions exactly as did the original. MatriDigm's testing process tests code fixes at the code fragment level, which enables the exercising of essentially all boundary conditions on the converted code, ensuring that a correct program is returned to the customer. As a result, only minimal system level testing is required. >>
REACTION: This is C.R.A.P. For this alone, installations shouldn't even consider Matridigm, because it recommends that unit tests not be done against each converted module. Matridigm DOES NOT really do TESTING in the same sense that normal data processing means it. How can you test unless you compile and link the code??? There is also no reason for Matridigm to include this feature unless it is to mislead the investing public. They use date routines, and since these are 'black boxes' they have presumably been exhaustively tested before becoming part of the conversion, hence there is really no need for their so-called 'fragment' test ...
CONCLUSION: Matridigm's product has its merits and may be a good tool, but it's not the giant leap for mankind that it's been hyped to be ...
DISCLAIMER: This is not a solicitation to short the company, just the opinion of a mainframe programmer on the loose ...