I know that a lot of the Y2K effort/scare was somehow centered on COBOL, deservedly or not. (Heck, I saw a minor Y2K bug in a Perl script that broke on 1/1/2000.)

What I'm interested in is this: was there something specific to COBOL as a language that made it susceptible to Y2K issues?

That is, as opposed to merely the age of most programs written in it, the need to skimp on memory/disk usage imposed by old hardware, and the fact that nobody anticipated those programs would survive for 30 years?

I'm perfectly happy if the answer is "nothing specific to COBOL other than age" - merely curious, knowing nothing about COBOL.

+1  A: 

It seemed to be more a problem of people not knowing how long their code would be used, so they chose to use 2-digit years.

So, nothing specific to COBOL; it is just that COBOL programs tend to be critical and old, so they were more likely to be affected.

James Black
A: 

It was much more related to storing the year in data items that could only hold values from 0 to 99 (two characters, or two decimal digits, or a single byte). That and calculations that made similar assumptions about year values.

It was hardly a Cobol-specific thing. Lots of programs were impacted.

Bob Riemersma
+3  A: 

Yes and no. In COBOL you had to declare variables such that you actually said how many digits a number had: a declaration like YEAR PIC 99 defined the variable YEAR so that it could only hold two decimal digits. So yes, it was easier to make that mistake than in C, where you would use an int, short, or char for the year and still have plenty of room for years greater than 99. Of course, that doesn't protect you from printf-ing "19%d" in C and still having the problem in your output, or from making other internal calculations that assume the year will never exceed 99.
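
To make that concrete, here is a minimal sketch (assuming a GnuCOBOL-style compiler for the *> inline comments; the field names are invented for illustration) of how a two-digit PIC field silently loses the century while a four-digit field does not:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. TWO-DIGIT-YEAR-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-YEAR-2    PIC 99.      *> two decimal digits: 00-99
       01  WS-YEAR-4    PIC 9(4).    *> four decimal digits
       PROCEDURE DIVISION.
       MAIN-PARA.
           MOVE 1999 TO WS-YEAR-2    *> truncated: stores 99
           ADD 1 TO WS-YEAR-2
               ON SIZE ERROR
                   DISPLAY "2-digit year cannot go past 99"
           END-ADD
           MOVE 1999 TO WS-YEAR-4    *> stored intact
           ADD 1 TO WS-YEAR-4        *> 2000, no problem
           DISPLAY "2-digit year: " WS-YEAR-2
           DISPLAY "4-digit year: " WS-YEAR-4
           STOP RUN.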

stonemetal
+1  A: 
voyager
Are you implying that non-COBOL programmers have an intrinsically longer-term worldview? :)
DVK
No. I'm implying that *programmers* have an intrinsically small worldview. Would *you* have foretold COBOL's survivability?
voyager
+1  A: 

Fascinating question. What is the Y2K problem, in essence? It's the problem of not defining your universe sufficiently. There was no serious attempt to model all dates, because space was more important (and the apps would be replaced by then anyway). In Cobol that mattered at every level: be efficient and don't overdeclare the memory you need, both in storage and at the program level.

Where efficiency is important, we commit Y2Kish errors... We do this every time we store a date in the DB without a timezone. So modern storage is definitely subject to Y2Kish errors, because we try to be efficient with space used (though I bet it's over-optimizing in many cases, especially at the enterprise overdo-everything level).

On the other hand, we avoid Y2Kish errors on the application level because every time you work with, say, a Date (in Java, let's say) it always carries around a ton of baggage (like timezone). Why? Because Date (and many other concepts) are now part of the OS, so the OS-making smart dudes try to model a full-blown concept of date. Since we rely on their concept of date, we can't screw it up... and it's modular and replaceable!

Newer languages with built-in datatypes (and facilities) for many things like date, as well as huge memory to play with, help avoid a lot of potential Y2Kish problems.

Yar
I like your answer, though I'm not sure "lack of complicated datatypes/libraries" can be counted as a COBOL-specific feature.
DVK
DVK, definitely not. I don't think that Assembler programs from the same time period avoided the Y2K bug more than Cobol programs of the time.
Yar
+1  A: 

It was two-part: 1) the age/longevity of COBOL software, and 2) the 80-character limit on data records.

First: most software of that age used only 2-digit numbers for year storage, since no one figured their software would last that long! COBOL had been adopted by the banking industry, which is notorious for never throwing away code. Most other software WAS thrown away, while the banks kept theirs!

Second: because COBOL was constrained to 80 characters per record of data (due to the size of punch cards!), developers were under even greater pressure to limit the size of fields. And because they figured "the year 2000 won't be here till I'm long since retired!", the 2 characters of saved data were huge!

Erich
The question is: was any of that reasoning COBOL-specific? E.g., was COBOL still constrained to 80-character data records once punch cards stopped being used?
DVK
The 80-character COBOL limit stuck around for a very long time after punch cards, both for backwards compatibility and because screens were made 80 characters wide as well! COBOL was, for some reason, just picked up by banks, mostly due to its (for the time) easy-to-use syntax and its power. The alternatives of the day (FORTRAN, C, etc.) were not very plentiful, were very confusing, and didn't lend themselves to the type of transactions required. The 80-character limit was a design decision made at a time when it made sense, and it stuck around long enough to cause this problem.
Erich
Was it somehow hard-coded into the language, or merely a non-enforced convention that everyone used? I assume the former.
DVK
80 columns had nothing to do with it. That was just the coding format, and code could be continued on the next line. The size of data records could be, and was, much larger.
Duck
@Duck - Thanks!
DVK
A: 

There were some things about COBOL that aggravated the situation.

  • it's old, so there was less use of library code and more homegrown everything
  • it's old, so it's pre-internet and pre-social-networking: more NIH, fewer frameworks, more custom stuff
  • it's old, so it's less abstract and more likely to have open-coded solutions
  • it's old, so go back far enough and saving 2 bytes might have, sort of, been important
  • it's old, so it predates SQL. Legacy operating software even had indexed record-oriented disk files to make rolling-your-own-database-in-every-program a little bit easier.
  • "printf" format strings and data type declarations were the same thing (the PICTURE clause), so everything had exactly n digits (see the sketch below)

I've seen giant Fortran programs with no actual subroutines. Really, one 3,000-line main program, not a single non-library subroutine, that was it. I suppose this might have happened in the COBOL world, so now you have to read every line to find the date handling.
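
On that last bullet: in COBOL the PICTURE clause is both the declaration and the output format, so the digit count is baked into the field everywhere it is used. A small illustrative sketch (hypothetical field names, GnuCOBOL-style inline comments assumed):

       IDENTIFICATION DIVISION.
       PROGRAM-ID. PIC-AS-FORMAT-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      *    In C the type (int year) and the format ("%d") are
      *    separate; in COBOL the PICTURE clause is both at once.
       01  WS-YEAR         PIC 99.      *> stored as exactly 2 digits
       01  WS-DATE-LINE.
           05  FILLER      PIC XX VALUE "19".  *> hard-coded century
           05  WS-YEAR-OUT PIC 99.      *> printed as exactly 2 digits
       PROCEDURE DIVISION.
       MAIN-PARA.
           MOVE 99 TO WS-YEAR
           MOVE WS-YEAR TO WS-YEAR-OUT
           DISPLAY WS-DATE-LINE          *> shows "1999"
           MOVE 00 TO WS-YEAR            *> year 2000 in a 2-digit field
           MOVE WS-YEAR TO WS-YEAR-OUT
           DISPLAY WS-DATE-LINE          *> shows "1900": the Y2K symptom
           STOP RUN.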

DigitalRoss
To summarize - it's the age, stupid! :)
DVK
+8  A: 

It was 80% about storage capacity, pure and simple.

People don't realize that the capacity of their laptop hard drive today would have cost millions in 1980. You think saving two bytes is silly? Not when you have 100,000 customer records and a hard drive the size of a refrigerator holds 20 megabytes and requires a special room to keep it cool.

Duck
Having programmed on (1) a Sinclair Z80 with a cassette tape as the only storage device; (2) an 8088 Zenith laptop with two 750K floppies as the only storage device; (3) an early IBM mainframe knockoff with 4K memory chips and punched cards/tape for storage - having done all that, I can honestly say that I fully understand the sentiment of storing 2 digits per year :)
DVK
100% correct. Add to this that when the decision was made to use a 2-digit year, they knew there would be issues in the year 2000; but when most of this was coded (in the 1980s, '70s, and even '60s), the thought was that the application being written would not still be in use 20, 30, or 40 years later.
tonyriddle
A: 

COBOL never came with any standard date handling library.

So everyone coded their own solution.

Some solutions were very bad vis-a-vis the millennium. Most of those bad solutions did not matter, as the applications did not live 40+ years. The not-so-tiny minority of bad solutions caused the well-known Y2K problem in the business world.

(Some solutions were better. I know of COBOL systems coded in the 1950s with a date format good until 2027 -- must have seemed forever at the time; and I designed systems in the 1970s that are good until 2079).
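
For illustration only (this layout and the windowing fix are invented here, not taken from any particular system), a typical homegrown date of the era, and the kind of ad-hoc remediation it later needed, looked something like this:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. DATE-WINDOW-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  ORDER-DATE.               *> a typical roll-your-own date
           05  ORDER-YY   PIC 99.    *> two-digit year: the Y2K trap
           05  ORDER-MM   PIC 99.
           05  ORDER-DD   PIC 99.
       01  ORDER-YYYY     PIC 9(4).  *> remediated four-digit year
       PROCEDURE DIVISION.
       MAIN-PARA.
           MOVE 05 TO ORDER-YY
           MOVE 07 TO ORDER-MM
           MOVE 14 TO ORDER-DD
      *    A common remediation was "windowing": guess the century
      *    from the 2-digit year (here, 00-49 is taken to mean 20xx).
           IF ORDER-YY < 50
               COMPUTE ORDER-YYYY = 2000 + ORDER-YY
           ELSE
               COMPUTE ORDER-YYYY = 1900 + ORDER-YY
           END-IF
           DISPLAY "Windowed year: " ORDER-YYYY
           STOP RUN.

Every shop picked its own layout and its own window (or none at all), which is exactly why remediation had to be done program by program.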

However, had COBOL had a standard date type....

03 ORDER-DATE     PIC DATE.

....industry-wide solutions would have been available at the compiler level, cutting the complexity of any remediation needed.

Moral: use languages with standard libraries.

Sunanda
A: 

COBOL 85 (the 1985 standard) and earlier versions didn't have any way to obtain the current century**, which was one factor intrinsic to COBOL that discouraged the use of 4-digit years even after 2 bytes of extra storage space was no longer an issue.

** Specific implementations may have had non-standard extensions for this purpose.
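
To illustrate the point: the standard COBOL 85 way to ask for "today" was ACCEPT ... FROM DATE, which yields only a YYMMDD value, while the intrinsic function CURRENT-DATE (added by the 1989 intrinsic-functions addendum and carried into later standards) finally exposes a four-digit year. A minimal sketch, assuming a compiler that supports both (field names are invented):

       IDENTIFICATION DIVISION.
       PROGRAM-ID. CENTURY-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-DATE-YYMMDD   PIC 9(6).   *> all COBOL 85 gives you
       01  WS-NOW           PIC X(21).  *> CURRENT-DATE result
       01  WS-YEAR-4        PIC X(4).
       PROCEDURE DIVISION.
       MAIN-PARA.
      *    COBOL 85 and earlier: only a two-digit year is available.
           ACCEPT WS-DATE-YYMMDD FROM DATE
           DISPLAY "ACCEPT FROM DATE (YYMMDD): " WS-DATE-YYMMDD
      *    1989 intrinsic functions and later: a four-digit year.
           MOVE FUNCTION CURRENT-DATE TO WS-NOW
           MOVE WS-NOW(1:4) TO WS-YEAR-4
           DISPLAY "CURRENT-DATE year: " WS-YEAR-4
           STOP RUN.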