Technology Stocks : VIAS VIASOFT & THE Y2K PROBLEM

To: TEDennis who wrote (362) - 6/6/1997 11:50:00 AM
From: TEDennis   of 2067
 
All: Matridigm ... See post #364 for background information on the VIAS user conference and disclaimers regarding these posts. This particular post will focus on Matridigm and their involvement with the Y2K situation.

I already posted my initial reaction and a followup on the ZITL thread. The rest of this post is more technically oriented, and contains more background info for some of the comments I have already made.

There has been a lot of discussion on the SI threads and elsewhere regarding the price of ZITL stock and its reliance on the success or failure of Matridigm's Y2K solution. This note will not address those issues because I am not qualified to comment on them. If you're looking for investment advice, go on to the next post. The information contained herein is solely my opinion, and is the result of my technical analysis of the Matridigm presentation at the VIAS user conference.

I currently have no trading interest in either VIAS or ZITL.

I've been waiting for several months to get a first-hand look at what Matridigm has to offer. I finally got to see it because they had a booth set up at the VIAS conference, and were giving demos of the MAP2000 product. It is a very nicely constructed set of control and management modules: a very clean GUI wrapped around some well-thought-out functions that are requirements for a successful Y2K adventure.

The fellow who gave me the presentation (forgot to get his name) was pleasant enough, and said he was an "engagement manager", or some such thing. His function is to assist the Matridigm VARs during their onsite engagements. Wasn't quite sure whether he went on site with them, or if he just trained them so they'd know what to do when they got on site. He started to give me the walk-thru of the product processes, but since he had 4 windows open on the screen, I could immediately see what their purpose was and could envision how they would be used. Since I was really pressed for time, I opted to skip that part of the "show and tell". I wanted some real hard-core "techie" information, so I asked him to show me some converted code.

Since Matridigm has pretty much ditched their proprietary format (packed binary), what he showed me was a very simple COBOL program (maybe 25 lines of code?) that had a leap year calculation in it. This was post-conversion, which is what I wanted to see. Very impressive ... the product recognized the divide-by-4 as a leap year calculation, and adjusted the code to "correctly" calculate the leap year (the full rule is beyond the scope of this post, but it can get quite complex).
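
To give you an idea of why I put "correctly" in quotes: the divide-by-4 shortcut misses the century exceptions (years divisible by 100 aren't leap years unless they're also divisible by 400, so 1900 wasn't one and 2000 is). Here's a little sketch of my own showing what the full rule looks like in COBOL 85 ... roughly the size of the demo program, but my code, not Matridigm's output:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. LEAPCHK.
      *    My sketch of the full Gregorian rule ... not demo code.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-YEAR      PIC 9(4) VALUE 2000.
       01  WS-QUOT      PIC 9(4).
       01  WS-REM-4     PIC 9(4).
       01  WS-REM-100   PIC 9(4).
       01  WS-REM-400   PIC 9(4).
       01  WS-LEAP-FLAG PIC X.
       PROCEDURE DIVISION.
           DIVIDE WS-YEAR BY 4 GIVING WS-QUOT REMAINDER WS-REM-4.
           DIVIDE WS-YEAR BY 100 GIVING WS-QUOT REMAINDER WS-REM-100.
           DIVIDE WS-YEAR BY 400 GIVING WS-QUOT REMAINDER WS-REM-400.
      *    Leap year: divisible by 4 and not by 100, or by 400.
      *    The naive divide-by-4 test gets 1900 and 2100 wrong.
           IF (WS-REM-4 = 0 AND WS-REM-100 NOT = 0)
                   OR WS-REM-400 = 0
               MOVE 'Y' TO WS-LEAP-FLAG
           ELSE
               MOVE 'N' TO WS-LEAP-FLAG
           END-IF.
           DISPLAY 'YEAR ' WS-YEAR ' LEAP: ' WS-LEAP-FLAG.
           STOP RUN.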

He showed me another small code example that had a date comparison in it, and how the tool automatically inserted a "windowing" code snippet. Also very nice. Very clean adaptation of code. He bragged about how there was no call to a subroutine, so performance was not impacted. Well, I know better. Code was added, so performance was impacted ... just less so than if the tool had inserted a call to a subroutine which did the same function. Everything is relative ...
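
For those who haven't seen "windowing" before, here's a minimal sketch of my own of the kind of inline expansion he described (not the actual demo code ... the data names and the pivot year of 50 are my assumptions). Two-digit years below the pivot are treated as 20xx, the rest as 19xx, and the comparison is done on the expanded values, with no CALL required:

      *    Sketch only. Pre-conversion code compared raw two-digit
      *    years:
      *        IF WS-YY-A > WS-YY-B PERFORM A-IS-LATER.
      *    Post-conversion, with an assumed pivot of 50, the years
      *    are expanded inline before the compare:
           IF WS-YY-A < 50
               COMPUTE WS-CCYY-A = 2000 + WS-YY-A
           ELSE
               COMPUTE WS-CCYY-A = 1900 + WS-YY-A
           END-IF
           IF WS-YY-B < 50
               COMPUTE WS-CCYY-B = 2000 + WS-YY-B
           ELSE
               COMPUTE WS-CCYY-B = 1900 + WS-YY-B
           END-IF
           IF WS-CCYY-A > WS-CCYY-B
               PERFORM A-IS-LATER
           END-IF.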

Having built many demo systems myself, I immediately recognized the simplicity of the examples. Since I've been involved with this problem for several years, I know where the "gotchas" are that give most of the automated tools fits. I asked him to show me an example where the date comparison is nested in a series of "IF" statements. He didn't have any, and didn't know what the tool did in that event. That was disturbing to me. Perhaps the Matridigm specialists only have a cursory understanding of the Year 2000 problem and what their tool does to fix the glitches. In the nested "IF" situation, the likely fix would be to call a subroutine, which is what was specifically avoided in the prior example. I was also curious to see what the "structure" of the code looked like. Did it follow the current indenting technique, or was it just haphazardly placed inline?
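
To make the concern concrete, this is the kind of case I had in mind (an invented example, including the 'Y2KWIN' routine name): a date comparison buried inside period-terminated nested IFs. There's no clean spot to splice the expansion inline without disturbing the nesting, so the likely automated fix is the very subroutine CALL that was avoided above:

      *    My own invented example ... not from the demo.  A date
      *    comparison nested two levels deep, old-style
      *    period-terminated COBOL:
           IF WS-TRAN-TYPE = 'R'
               IF WS-REGION = 'WEST'
                   IF WS-EXP-YY > WS-TODAY-YY
                       PERFORM RENEW-POLICY
                   ELSE
                       PERFORM EXPIRE-POLICY.
      *    The likely automated fix: expand both years via a
      *    windowing subroutine (the name 'Y2KWIN' is mine) before
      *    the nest, then compare the expanded values:
           CALL 'Y2KWIN' USING WS-EXP-YY WS-EXP-CCYY.
           CALL 'Y2KWIN' USING WS-TODAY-YY WS-TODAY-CCYY.
           IF WS-TRAN-TYPE = 'R'
               IF WS-REGION = 'WEST'
                   IF WS-EXP-CCYY > WS-TODAY-CCYY
                       PERFORM RENEW-POLICY
                   ELSE
                       PERFORM EXPIRE-POLICY.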

Granted, I didn't have the time to explore in depth. Perhaps there is much more to their conversion technology than meets the eye. If I had the time, I would have asked to "drive" the demo myself ... get my hands on it, look for the underlying "gotchas", and see how the tool reacted to them. I got a very uneasy feeling from the demo about the extent of quality coverage. The tool is very pretty, and looks very functional. I'm a techie, though, and want function over form. I don't care what color it is, or how many scrolling windows it has, as long as it works and solves my problem. It seems to me that a lot of time was spent on the GUI aspects of the product, and not enough on solving the Y2K problem.

The VP of Development (Linda Duyanovich) gave the presentation (I saw the demo AFTER the presentation, by the way). Since I expected her to be very techie, my expectations for her public speaking skills were not very high. She fulfilled my expectations on the speaking side, but was surprisingly ineffective on the techie side. She seemed very tentative, and not very confident. She does know the functions of her product, and knows the processes involved in fixing the Y2K problem. She did have a product knowledgeable techie (Brent Ross) to support her when she was asked questions.

I was disappointed in her understanding of the environmental concerns. This is where the rubber meets the road. Lots of tools say they can "automatically" change code. There are many techniques to use. Some are better than others, but none are legitimate leaps in technology over the others (as was originally thought about the packed binary concept). But the products that can satisfy the needs of the majority of the marketplace are the ones that will make the most sales. If your tool doesn't work in my environment, don't bother calling me.

What surprised me the most was the lack of supported environments. Currently, only the COBOL 85 compiler is supported (which explains why code snippets can be added without subroutines ... they used the IF/END-IF constructs, which weren't available in prior versions of the compiler). They're working on supporting the COBOL 74 compiler, which is where the majority of legacy code was written. No estimate on availability. It will be tougher to do these automated changes with that version because of various syntax constructs.
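
A quick illustration of the compiler difference, using my own minimal example: the windowing snippet is itself an IF/ELSE, and COBOL 85's END-IF scope terminator is what allows it to be nested inside an existing branch without ambiguity:

      *    COBOL 85: END-IF closes the inserted IF/ELSE, so the
      *    outer branch can continue after the snippet:
           IF WS-DATE-PRESENT = 'Y'
               IF WS-YY < 50
                   COMPUTE WS-CCYY = 2000 + WS-YY
               ELSE
                   COMPUTE WS-CCYY = 1900 + WS-YY
               END-IF
               PERFORM CHECK-EXPIRY
           END-IF.
      *    COBOL 74 has no END-IF.  The inner IF/ELSE can only be
      *    terminated by a period, which also ends the outer IF, so
      *    the tool has to restructure the paragraph or generate a
      *    subroutine CALL instead.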

They don't support any of the common source library managers, such as PANVALET, LIBRARIAN, Endevor, Changeman ... the list goes on and on. They do support the standard IBM MVS PDS libraries. The true blue IBM background of the big wigs in the company probably had a lot to do with that. The support for PAN and LIB is "forthcoming". Having built interfaces to those in the past, I know that they're in for some interesting surprises. Particularly since those products run on mainframes, not on the Windows NT platform that MAP2000 executes on. They'll have to emulate EVERY function that those tools provide. Not an easy task. I'd be willing to bet that if you have one of the lesser-used library managers, you won't be able to use MAP2000 in time to fix your problem prior to 2000.

They currently support MVS batch, CICS, and IMS. So, all the primary IBM environments are supported. Not sure about DB2. No mention was made of any of the "also-rans", like IDMS, TOTAL, etc.

The testing process they use is quite thorough. For every line of code that needs modification, they extract it and all the data definitions it needs to a temporary file. Then they make it look like a mini COBOL program, compile and link it, and drive it with lots of pertinent test data. They do this with both the old and the new code and compare the results. Theoretically, this ensures that the converted code will execute the same when it is returned to the client. In the case where one statement has many sub-statements (a complex calculation, for example), the statement is broken up into simpler ones. I think there is a slight window of opportunity there where intermediate results might be incorrect (rounding or truncation of the intermediate values, for example), but it's not much to worry about. When the code is given back to the client, the client still has to run thorough testing with their own data and within their own environment. The statement-level testing should, however, eliminate any obvious goofs. Not sure how important that step is, but according to Linda it doesn't take very long, and is therefore recommended.
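
As I understood the process, the generated harness amounts to something like this (my own reconstruction ... the program name, data items, and driver loop are all invented, not MAP2000 output). The extracted statement and its data definitions are wrapped in a throwaway program, driven across the input range, and the DISPLAY output is compared against the same harness built around the old statement:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. STMTTST.
      *    My reconstruction of a statement-level test harness ...
      *    not actual MAP2000 output.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-YY        PIC 9(3).
       01  WS-CCYY      PIC 9(4).
       PROCEDURE DIVISION.
           PERFORM VARYING WS-YY FROM 0 BY 1 UNTIL WS-YY > 99
      *        The statement under test (here, the new windowing
      *        logic); the old-code harness holds the original
      *        statement in this slot:
               IF WS-YY < 50
                   COMPUTE WS-CCYY = 2000 + WS-YY
               ELSE
                   COMPUTE WS-CCYY = 1900 + WS-YY
               END-IF
      *        Emit input and result so the two runs can be diffed:
               DISPLAY WS-YY ' -> ' WS-CCYY
           END-PERFORM.
           STOP RUN.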

In summary, my hopes for a solid Y2K conversion solution were not met. From all the press this was getting, it sounded like they had a major jump on the rest of the industry. The skeptic in me didn't believe it before, and I certainly don't now. This product has a future, but there are many environmental concerns which must be met before it will be viable in a large mainframe shop, where idiosyncrasies abound. If you're a small, PDS-only IBM shop that has standardized on COBOL 85, then this tool will work for you today.

I recommend that Matridigm invest in several experienced mainframers to help them with their task. I think they have their GUI down pat, but GUI folks who grew up on Windows 3.1 don't understand the mainframe issues.

If you have any questions, please ask.

Regards,

TED