Jed Rothwell wrote:
Stephen A. Lawrence wrote:
Take the Y2K problem. As I said before, it was quite real. It was a
financial disaster. Society was forced to spend billions of dollars
in emergency repairs that should have been taken care of cheaply
during routine maintenance 20 years earlier.
I disagree. When you talk about "routine maintenance" in this
context I think you are imagining a world that did not exist.
I was programming computers in 1978,
Me too.
and I assure you, 90% of the work was routine maintenance of existing
COBOL programs.
OK, OK, I should know better than to argue with you. I was never part
of the COBOL world so I have a distorted view of it. In the part of the
universe where I worked _nothing_ was routine.
I wrote many date routines, such as input and Julian date conversions,
and I made darn sure they would work past the year 2000. In many
industries year-2000 dates were already in use. For example banks had
records of 20- and 30-year mortgages and leases, which terminated
after 2000.
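For anyone curious what such a conversion looks like, here is a sketch in C
(not my original routines, which were in COBOL) of the published Fliegel &
Van Flandern integer formula for Gregorian date to Julian Day Number. With
full four-digit years, dates past 2000 are no harder than dates before it:

```c
/* Gregorian calendar date to Julian Day Number, using the
 * Fliegel & Van Flandern (1968) all-integer formula. Valid for
 * Gregorian dates from 1 March 4800 BCE onward, so it handles
 * year-2000 dates (and far beyond) with no special cases. */
long gregorian_to_jdn(int year, int month, int day) {
    long a = (month - 14) / 12;          /* -1 for Jan/Feb, 0 otherwise */
    return (1461L * (year + 4800 + a)) / 4
         + (367L * (month - 2 - 12 * a)) / 12
         - (3L * ((year + 4900 + a) / 100)) / 4
         + day - 32075;
}
```

For example, gregorian_to_jdn(2000, 1, 1) is 2451545, and the number of days
between two dates is a plain subtraction -- which is why mortgage and lease
arithmetic across the century boundary was never hard if you stored full years.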
The Y2K problems were entirely, or almost entirely, software problems.
They were entirely software problems. I have never heard of a
hardwired date routine.
"20 years earlier" than year 2000 was year 1980. Remember what
things were like? A typical personal computer was still an Apple ][
-- the Lisa (remember the Lisa?) hadn't even appeared yet.
Personal computers have nothing to do with this problem. Most of them
were fixed in the 1990s at little or no expense. In any programming
language or system developed after 1990 it would have been difficult
to cause a Y2K problem. You would have to write your own date routines
instead of using the prepackaged ones, and you would have to be both
incompetent and inventive, because the routines in programming books
included year-2000 features. Some of them worked for thousands of
years in either direction, as I recall. They were used for astronomy
and archaeology.
Then Microsoft qualifies as both. Windows had some collection of Y2K
problems which were fixed via last-minute service packs a few months
before the big day.
Now, what "routine maintenance" was going on in the software world?
Darn little, because almost nothing was "routine" in those days.
EVERYTHING was routine! It was so predictable, I could write manuals
blindfolded, and figure out most of the program by glancing at the
data structures.
The only established software base with anything like a "routine"
associated with it was the IBM mainframe world, where the universe
still ran on COBOL and PL/1.
In 1980 trillions of dollars of commerce in the US ran on COBOL. A
huge chunk still does. Actually, COBOL is a lot better for business
apps than C++, in my opinion. C++ is the third most popular language,
and COBOL is #13, but it is gaining.
#13?? There are 9 computer languages more popular than COBOL and less
popular than C++?
And there are two others more popular than C++?
I feel ignorant. I can't even think of 13 languages in common use
today. After Fortran, Java, C/C++, and COBOL I kind of run out of
steam. Let's see, there's Ada, is that still in common use? ... um...
Most of our software today seems to be written in C. Remember C in
1980? 6 character variable names on globals (that's right, just
six) . . .
Yup. A real nightmare. That's why I preferred COBOL or Pascal.
Unfortunately Pascal self-destructed. Which really was too bad; it was
turning into a nice language before the big implosion in ... um ...
1986, I think. The Pascal standards meeting abruptly did an about-face
and threw out all the extensions which had been under discussion for
months, and returned the language to its roots, leaving Pascal as a
largely useless academic language, rather than evolving it into a
systems implementation language (a SIL) which could compete head-on
with C, as we had expected. Very odd. I
wasn't at the meeting where it happened, but I worked with someone on
the committee and I heard about it shortly afterwards. It was quite a
shocker for us, actually, since we were a Mac shop at the time, and we
were pretty heavily tied into the Pascal world. He said it was a
"feeding frenzy" -- they reviewed the list of proposed extensions, one
by one, and killed every one.
After that meeting, it was immediately obvious that the language was
dead, of course -- you couldn't even write a storage manager in
un-extended Pascal; in consequence, every serious commercial
implementation included non-standard extensions, which meant that nearly
every implementation was different. If it was to remain a serious
contender, that _had_ to get fixed, by extending the standard definition
of the language; with the conscious decision not to extend it, the
committee killed the language.
By the way, it's tough to write a storage manager without casts, and
it's tough to write a "printf" replacement without varargs; those were
two of the long-anticipated extensions which were unexpectedly killed.
I don't recall all the others.
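For contrast, here is roughly what C made trivial and un-extended Pascal
could not express at all: a variadic "printf"-style wrapper. (A sketch;
vsnprintf shown here is C99, later than the period discussed, but the
varargs idiom via <stdarg.h> is the same one C had all along.)

```c
#include <stdarg.h>
#include <stdio.h>

/* A variadic formatting wrapper: a few lines in C thanks to
 * <stdarg.h>, and impossible in standard (un-extended) Pascal,
 * which had no varargs. Older C code used vsprintf; vsnprintf
 * (C99) adds the buffer-size check. */
int log_msg(char *buf, size_t size, const char *fmt, ...) {
    va_list ap;
    va_start(ap, fmt);
    int n = vsnprintf(buf, size, fmt, ap);
    va_end(ap);
    return n;
}
```

Usage is exactly like printf: log_msg(buf, sizeof buf, "year=%d", 2000)
leaves "year=2000" in buf.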
Getting a large C program to work _at_ _all_ was a major challenge in
those days; worrying about how well it would work 20 years later was
totally outside the picture.
As I said, banking software actually did have to work with Y2K dates
back then. So did computers in many industries with long-term
inventory and projects, such as building houses or bridges, or leasing
farm equipment.
Asserting that "routine maintenance" of software in 1980 should have
included fixing Y2K problems is a little absurd, I think.
It did include that, and I personally performed it. This was for
things like municipal billing systems and first-generation grocery
scanners.
In 1980? I thought they came later. Oh, well, my memory isn't what it
used to be (and in fact it never was, either).
- Jed