Jim, It wouldn't crash. - re: yr 2000 stocks
A. Back when JCL was a big deal (yes, I did do some, and RJE, etc.), IBM mainframe clocks weren't what they are now, and neither was Cobol. Yet that legacy code was still upward compatible. As I remember it, when we used to cold boot the things from register switches (every byte of machine code input through its own bit switches), we also put in the date, through the console or a date card. Back then I believe we still had to fake our way around leap day and problems like that. Hard to remember, after 30 years.
B. Also, the date from the clock is just data. Pick out the 2-digit year, discard the rest, save a couple of bytes, pat yourself on the back, write it to the file, and voila! A nasty problem. The hardware might be there to do it right, but, you know, leading horses to water, etc.
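Just to show the arithmetic (a little Python sketch, obviously not what the old Cobol looked like; the function name is made up):

    # Hypothetical illustration of the two-digit-year habit, not real legacy code.
    def years_between(start_yy, end_yy):
        # Fine as long as both years fall in 19xx...
        return end_yy - start_yy

    print(years_between(65, 99))  # 34 -- looks right
    print(years_between(65, 0))   # -65 -- the year-2000 failure mode

Same data, same code; the only thing that changed is the century you threw away.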
I'm not saying that the problem is the mainframe or Cobol per se. I'm saying that those boxes have been around for a very long time and a lot of the old code uses old techniques, so the mainframes have the problem more than anything else *because they have more old code.* Legacy code that people have ported forward without fixing for decades, in some cases. To some extent this is a side effect of an IBM management decision long ago to make Codd's relational syntax a limited sublanguage within Cobol rather than a stand-alone file and programming system right away, which is what he wanted.
Mainframe programs written with SQL DBs, like DB2 or Oracle, are not a problem, because you had more enforcement of using dates correctly, both from the system and from your DB administrator, who would fire you for doing it wrong if you refused to listen, assuming he/she was a pro. You wanted to do it right anyway, because the system would record millisecond-exact time in the same data item for you and let you add and subtract dates and times without coding date-arithmetic routines yourself. It also made all the logging and so on consistent, so you would have had to be a real masochist to try to defeat the system vis-a-vis dates.
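Here's a rough modern analogy in Python (not DB2 or Oracle syntax, just the same idea of letting a real date type do the calendar work for you):

    # Rough analogy: a real timestamp type does the calendar work for you.
    from datetime import datetime, timedelta

    ts = datetime(1999, 12, 31, 23, 59, 59)  # full timestamp, no truncated year
    later = ts + timedelta(days=1)           # rolls into 2000-01-01 correctly
    print(later.isoformat())
    print((later - ts).days)                 # date arithmetic, no hand-rolled routines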
Meanwhile, for file-oriented programmers, truncating the year to its last 2 digits was commonplace, like I said, especially with card systems. And believe it or not, there are still people out there using paper cards. Lots of people ported card programs to tape but kept the same file format. (By the way, cards are the reason for the RJE and JCL statement length limit.) So if you didn't want to exceed 80 bytes, you abbreviated things. Remember, you had to type a lot of that stuff into card punches. Anyway, this would just be carried forward from cards to tape to disk, because the programs worked. And programmers would keep thinking the same way and writing programs like that when they didn't need to.
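If you've never seen one of those card-image layouts, here's a made-up one, sketched in Python just to show how the 2-digit year got baked into the record (field names and widths are invented for illustration):

    # Hypothetical 80-byte card-image record: every field squeezed to fit one card.
    import struct

    # Invented layout: account(10) name(30) amount(8) date YYMMDD(6) filler(26) = 80
    RECORD = struct.Struct("10s30s8s6s26s")
    assert RECORD.size == 80

    card = b"0000123456" + b"SMITH J".ljust(30) + b"00012550" + b"991231" + b" " * 26
    account, name, amount, yymmdd, _ = RECORD.unpack(card)
    print(yymmdd[:2])  # b'99' -- two digits was all the card had room for

Port that record from cards to tape to disk and the six-character date comes along for the ride.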
Most of the other brands of computers came along after cards were dying out. Their programmers came fresh from school, without the bad habits in most cases - a different culture. So not as much of a problem, though there's still some of it on those machines.
Still, a good percentage of the ridiculous, screwed-up limitations in hardware and software are there because some guy decided you would never need more than 8 or 255 of something, or two digits, for something crucial like the maximum number of serial ports or processes or users on a system, and so, of course, this retarded mutant can save 2 whole bytes for something else. And thinks he's efficient.
Maybe someday we oughta have tests before people get to program these things for other people, eh? And a license of some kind. Gross stupidity in programming does cost the country billions a year.
Rant, rant, rant. Sorry. Forgot this was an investment forum for a minute there.