Andy Wood wonders:

> Maybe they were just optimists and figured that by the 
> time they really needed it, the operating system would 
> have been updated to provide it.

As I said, I don't know where Poughkeepsie got the idea.
Maybe they thought of it all by themselves. We did not
care at the time.

I also don't know when it first appeared in the response 
to a TIME macro in MVS. But it was a fairly common idea,
at least in the financial (read: banking) community where
debt instruments had maturity dates past 2000. The issue
that eventually came to be called "the Y2K problem" was,
to put it bluntly, old and boring news by the time the 
rags, mags, pundits and self-appointed experts discovered
it. The issue was being raised by SAS customers as early
as the Fall of 1978. I remember testing date INFORMATs 
and FORMATs for yyyy > 1999 when I was finishing SAS 79.1. I'm
sure those tests were cursory, but everything that should
have worked did. 

> If the century indicator concept existed well before 
> then, it leaves me wondering where they got the actual 
> value from if the operating system did not provide it 
> (other than perhaps via the TOD/STCK value).

We didn't get it from anywhere. It was an open question
and much discussed (with certain interested parties). 
More than one OEM software vendor (in other words, not
just SAS Institute) and high-end customers were involved.
There were lots of ideas. Many folks contributed. It was
not a "big deal" and did not occupy hours of meetings or
result in conflicting requirements. It was just a long-
range issue that kept coming up when the right parties
were gathered around (SCIDS, MVS Steering Committee [I
think Hank Harrison was one of the instigators of this],
dinner, coffee after dinner, etc.).

There was some question as to which representation would
be best for dates > 1999: yyyydddF or 0cyydddF. The
several IBMers with whom folks like us at GUIDE discussed
the issue on some of the futures task forces eventually
decided that 0cyydddF was the best approach because code
to interpret the century nybble or byte could be put into 
place sooner rather than later, and would work correctly 
with operating systems that did not yet return 0cyy or
ccyy (whereas if MVS started returning 19yy, then lots
of code was going to break). It was ugly, but it kept
old code running and new code working on old operating
systems. And so forth. Thus, the new algorithm for date
interpretation would be: Year = 1900 + (100*c) + yy (where
ddd, of course, remained interpreted as it always had
been).
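
For the curious, here is a minimal sketch in C of what that
interpretation amounts to. This is my illustration, not anyone's
production code; it assumes the date arrives as the four-byte
packed-decimal value 0cyydddF (digits 0,c,y,y,d,d,d plus the
sign nybble F):

    #include <stdio.h>
    #include <stdint.h>

    /* Unpack a 0cyydddF packed-decimal date into a four-digit
       year and a day-of-year. Returns 0 on success, -1 if the
       sign nybble is not the expected F. */
    static int unpack_0cyydddF(const uint8_t date[4],
                               int *year, int *ddd)
    {
        int c  = date[0] & 0x0F;                  /* century nybble */
        int yy = (date[1] >> 4) * 10 + (date[1] & 0x0F);
        int d  = (date[2] >> 4) * 100 + (date[2] & 0x0F) * 10
               + (date[3] >> 4);

        if ((date[3] & 0x0F) != 0x0F)             /* sign must be F */
            return -1;

        *year = 1900 + 100 * c + yy;              /* the agreed rule */
        *ddd  = d;
        return 0;
    }

    int main(void)
    {
        /* c=0 keeps old dates working: 0085365F -> 1985, day 365.
           c=1 carries past 1999:       0100001F -> 2000, day 001. */
        uint8_t old_date[4] = { 0x00, 0x85, 0x36, 0x5F };
        uint8_t new_date[4] = { 0x01, 0x00, 0x00, 0x1F };
        int year, ddd;

        if (unpack_0cyydddF(old_date, &year, &ddd) == 0)
            printf("Year = %d, day = %03d\n", year, ddd);
        if (unpack_0cyydddF(new_date, &year, &ddd) == 0)
            printf("Year = %d, day = %03d\n", year, ddd);
        return 0;
    }

Note that code written against the old 00yydddF form (century
nybble always zero) still computes 1900 + yy correctly, which is
exactly why 0cyydddF let old code keep running on old systems.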

So that's how it got put into all the stuff I did,
starting in the late 1970s and early 1980s. GUIDE
was not the only organization discussing internal
issues such as this (related to what came to be 
called the "Y2K" problem), and this was not any
sort of secret. Lots of software vendors got their
ducks in a row long before it actually hit the 
fan as far as the morons were concerned. 

In other words, it was simply common knowledge, as
early as 1981 I think, that MVS was going to use the
0cyydddF or ccyydddF date representation (we were not
sure which, but the heavy hitters at least would be 
ready whenever IBM got around to documenting it). 

Of course, to some people it wasn't "common knowledge."
But folks were no more interested in hearing about the
two-digit year problem in 1981 than they were in 1995.
Nobody (but some banks and a lot of software vendors)
cared. It would not hit the fan until much later, as I
am sure everyone now well remembers.

--
WB
