JUST A LITTLE MORE INFO on the Y2K problem, note letter at the bottom by a
software engineer


Subject: The "Y2K-bugs-are-not-just-a-legacy-problem" FAQ
From: Gary Lawrence Murphy <[EMAIL PROTECTED]>
Date: 29 Mar 1998 23:43:11 -0500
Just when you thought it was safe ... I found this on the USENET this
evening and thought it might be of interest:
From: Zooko Journeyman <[EMAIL PROTECTED]>
Subject: The "Y2K-bugs-are-not-just-a-legacy-problem" FAQ
Date: 30 Mar 1998 02:24:53 +0200
Organization: XS4ALL Internet B.V.
[Greetings, Usenetters.  This is an article i've just hacked up
that i intend to send to reporters and publications when i read
that the year 2000 bug is caused by legacy programs.  I argue
that a "sizeable fraction" of y2k bugs were written in the
1990's, and i attempt to start a trend of calling them "y2k
bugs" instead of "The Y2k Bug" since there are many of them and
they come in many different flavors and they must be fixed
individually.  All flames, compliments, comments, criticisms,
questions and answers are welcome at "[EMAIL PROTECTED]".  --Z]
Dear Sir or Madam:
I am writing you in reference to a recent article of yours
which propagated a common misperception about year 2000 bugs.
I am a professional software engineer, and my motivation in
writing this correction is solely to help inform the public
about this very important issue.  Permission is granted to
reproduce, distribute, and use this article in any way.
It is often stated (even by knowledgeable engineers, analysts,
and reporters) that year 2000 bugs are caused by programs
written in the 1960's, 1970's and 1980's.  This misperception
is dangerous, as it encourages people who depend only on modern
programs to think that they are not at risk.  In fact, year
2000 bugs abound in programs from all eras, including programs
written during the 1990's.
JavaScript, JScript, Netscape Navigator, Netscape Communicator,
and Microsoft Internet Explorer suffer from y2k bugs
themselves, and they also make it complicated for a programmer
to write y2k-safe code using those systems.
American Megatrends was shipping its widely used PC BIOS
software with year 2000 bugs until as late as July 1995 (or
even later?  The web pages don't precisely state.)
Award was shipping _its_ widely used PC BIOS software with year
2000 bugs until as late as November 1996.
(I bought a brand new PC in June of 1997 and its BIOS had a
year 2000 bug.)
A y2k bug was discovered in a BeOS application (BeOS was first
made public in the second half of the 1990's).
The popular AltaVista search engine uses 2-digit year fields to
constrain the dates of your search.  If there are any scripts
out there which use AltaVista to e.g. find all articles on a
certain topic posted during the last week, those scripts will
break during the first week of the year 2000.
These are only a few examples.  Almost certainly there are
millions of year 2000 bugs that remain undiagnosed or
unreported, and by my estimate a sizable fraction of them were
written during the 1990's (since the majority of code in use
today was written during the 1990's).
You may well ask "Why the heck would someone writing code in
the 1990's create a year 2000 bug?".  There are 3 reasons:

1.  Legacy data, legacy interfaces.  Often new programs are
written, and the first task of these new programs is to read in
the data from the old programs that they are replacing.  This
means that the new programs usually use the same data formats.
Also, new programs are sometimes required to interoperate with
old programs, which means that they often use the same data
formats as well.
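For illustration, here is a minimal JavaScript sketch of the
situation (the YYMMDD record layout, the function name, and the
pivot of 50 are all invented for this example): a new program
that faithfully reads a legacy record format inherits its
two-digit years, and with them the ambiguity.

```javascript
// Hypothetical sketch: a 1990's program reading a legacy record
// whose date field is stored as YYMMDD (layout invented for the
// example).  Preserving the legacy format preserves the ambiguity:
// "00" could mean 1900 or 2000.
function readLegacyRecord(line) {
  const yy = Number(line.slice(0, 2));
  const mm = Number(line.slice(2, 4));
  const dd = Number(line.slice(4, 6));
  // A pivot of 50 is an assumption; real programs must pick a pivot
  // that suits the data they actually hold.
  const year = yy < 50 ? 2000 + yy : 1900 + yy;
  return { year: year, month: mm, day: dd };
}
```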

2.  It's just a bug, like all bugs.  All bugs appear stupid
once they are identified, but when you are actually
constructing a complex system, many a bug will be generated out
of misunderstandings or mistakes.  Year 2000 issues are not
nearly as simple as the media tends to indicate.
For example, let's say that you are writing a program in 1997
in a modern programming language such as JavaScript or Perl.
You invoke the standard routine to return the current year.  It
returns "97".  Now you want to use this information for your
own calculations, and export the information for the benefit of
the user of your program.  What should you do?  Leave it as
2 digits?  Prepend the string "19" to the year?  Add the number
1900 to the year?  Check whether the year is less than 50,
adding the number 1900 to it if it is 50 or greater and the
number 2000 to it otherwise?
The answer varies depending on which language you are using,
which _version_ of the language (in the case of JavaScript),
and what the users of your program expect to see.
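To make the ambiguity concrete, here is a hedged JavaScript
sketch of the sort of normalization a programmer might attempt
(the function name and the pivot of 50 are assumptions, not a
prescription; in later versions of JavaScript, getFullYear()
sidesteps the problem entirely):

```javascript
// Sketch of the century-windowing heuristic described above.
// getYear() historically returned year - 1900 in some engines and a
// bare two-digit year in others; getFullYear() avoids the ambiguity.
function normalizeYear(y) {
  if (y >= 1000) return y;        // already a four-digit year
  if (y >= 100) return 1900 + y;  // getYear()-style offset from 1900
  // Two-digit year: pick a century with a pivot of 50 (an assumption;
  // the right pivot depends on what your users expect to see).
  return y < 50 ? 2000 + y : 1900 + y;
}
```

Note that every branch of this function is a guess about what
the caller meant, which is exactly why the right answer varies
by language, by version, and by audience.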

3.  And perhaps most important: human/computer interface.
Humans routinely use two digits to indicate a year.  Think back
to the last time you wrote a date onto a paper document or
typed a date into a computer.  More than likely, you wrote
"98".  When other humans read that number, they will use
context to determine whether you meant 1898, 1998, 2098, or 98.
But computers are very bad at using context to make such
distinctions.  All software has to interact with humans at some
points, and almost all software which uses dates allows the
humans to enter 2-digit dates.  It then has to infer somehow
whether that date belongs in the 20th century or the 21st (or
the 19th or others).  This decision is as much a problem for
programs written in 1998 as it was for programs written in
earlier decades.
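As an illustration of that inference, here is a hypothetical
JavaScript sketch of a routine that accepts a user-typed date
and guesses the century (the M/D/Y format, the names, and the
pivot of 50 are all invented for the example):

```javascript
// Hypothetical sketch: inferring the century of a user-typed date.
// Accepts "M/D/YY" or "M/D/YYYY" (format invented for this example).
function parseUserDate(text) {
  const m = /^(\d{1,2})\/(\d{1,2})\/(\d{2,4})$/.exec(text);
  if (!m) return null;
  let year = Number(m[3]);
  if (m[3].length <= 2) {
    // The computer cannot use context the way a human reader can,
    // so it guesses: a pivot of 50 here, but a guess all the same.
    year += year < 50 ? 2000 : 1900;
  }
  return { month: Number(m[1]), day: Number(m[2]), year: year };
}
```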
I hope that this letter has been informative to you.  I am
making a habit of sending this letter via e-mail to each
publication or reporter who unwittingly propagates the "Y2k is
a legacy problem" myth.  Feel free to contact me for more
information.

Zooko, Journeyman Engineer
