-Caveat Lector-

What is the inventor of ASCII and coinventor of COBOL doing to prepare?

Well, he bought two $200 water filters from Denmark, for one.


Link: http://www.washingtonpost.com/wp-srv/WPlate/1999-07/18/

 * * * * * * * * *

We are knocking at the door of a high-rise apartment in Baileys Crossroads, with a question so awful we are afraid to ask it. We do not wish to cause a heart attack.

A woman invites us in and summons her husband, who shuffles in from another room. She is 78. He is 82. They met in the 1960s as middle-management civil servants, specialists in an aspect of data processing so technical, so nebbishy, that many computer professionals disdain it. He was her boss. Management interface blossomed into romance. Their marriage spans three decades. They are still in love.

"You know how we use Social Security numbers alone to identify everyone?" she says. She points proudly to her husband. "That all started with this kid!"

The kid has ice cube spectacles and neck wattles. He has been retired for years. Some of his former colleagues guessed he was deceased. His phone is unlisted. We located him through a mumbled tip from a man in a nursing home, followed up with an elaborate national computer search. Computers--they're magic. . . .


Here is what we have to ask him: Are you the man who is responsible for the greatest technological disaster in the history of mankind? Did you cause a trillion-dollar mistake that some believe will end life as we know it six months from now, throwing the global economy into a tailspin, disrupting essential services, shutting down factories, darkening vast areas of rural America, closing banks, inciting civic unrest, rotting the meat in a million freezers, pulling the plug on life-sustaining medical equipment, blinding missile defense systems, leaving ships adrift on the high seas, snarling air traffic, causing passenger planes to plummet from the skies?

Obligingly, he awaits the question. . . .

Technology has been the propulsive force behind civilization, but from time to time technology has loudly misfired. In the name of progress, there have been profound blunders: Filling zeppelins with hydrogen. Treating morning sickness with Thalidomide. Constructing aqueducts with lead pipes, poisoning half the population of ancient Rome. Still, there is nothing that quite compares with the so-called "Millennium Bug." It is potentially planetary in scope. It is potentially catastrophic in consequence. And it is, at its heart, stunningly stupid. It is not like losing a kingdom for want of a nail; it is like losing a kingdom because some idiot made the nails out of marshmallows. . . .

Never has a calamity been so predictable, and so inevitable, tied to a deadline that can be neither appealed nor postponed. Diplomacy is fruitless. Nuclear deterrence isn't a factor. This can't be filibustered into the next Congress. . . .

The search for a culprit is an honored American tradition. It nourishes both law and journalism. When things go bad, we demand a fall guy. A scapegoat. A patsy.

Today we'll search for one, and find him. . . .

 The Y2K problem wasn't just foreseeable, it was foreseen.

Writing in February 1979 in an industry magazine called Interface Age, computer industry executive Robert Bemer warned that unless programmers stopped dropping the first two digits of the year, programs "may fail from ambiguity in the year 2000."

This is geekspeak for the Y2K problem.
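
A minimal sketch of what that "ambiguity" does to everyday arithmetic, written in Python purely for illustration (the programs Bemer was describing were, of course, written in nothing of the kind):

    # With only two digits stored, a program cannot tell 2000 from 1900,
    # so simple interval math breaks at the rollover.
    birth_year = 65              # stored as "65", meaning 1965
    current_year = 99            # 1999: the subtraction still works
    print(current_year - birth_year)   # 34, correct

    current_year = 0             # 2000, stored as "00"
    print(current_year - birth_year)   # -65: the "ambiguity in the year 2000"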

Five years later, the husband-wife team of Jerome T. and Marilyn J. Murray wrote it much more plainly. In a book called "Computers in Crisis: How to Avoid the Coming Worldwide Computer Systems Collapse," they predicted Y2K with chilling specificity.

Few people read it. The year was 1984, and to many, the book seemed very 1984-ish: a paranoid Orwellian scenario. ComputerWorld magazine reviewed it thus:

"The book overdramatizes the date-digit problem. . . . Much of the book can be overlooked."

 How could we have been so blind?

Basically, we blinded ourselves, like Oedipus. It seemed like a good idea at the time. . . .

Why didn't people realize earlier the magnitude of the problem they were creating?

And when they did realize it, why was the problem so hard to solve? We sought the answer from the first man to ask the question.

Robert Bemer, the original Y2K whistleblower, lives in a spectacular home on a cliff overlooking a lake two hours west of a major American city. We are not being specific because Bemer has made this a condition of the interview. We can say the car ride to his town is unrelievedly horizontal. The retail stores most in evidence are fireworks stands and taxidermists.

In his driveway, Bemer's car carries the vanity tag "ASCII." He is the man who wrote the American Standard Code for Information Interchange, the language through which different computer systems talk to each other. He also popularized the use of the backslash, and invented the "escape" sequence in programming. You can thank him, or blaspheme him, for the ESC key.
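
For readers who have never peeked under the hood, a small illustration, in Python rather than anything Bemer himself wrote, of what those contributions mean in practice:

    # ASCII gives every character a standard numeric code, which is what
    # lets unlike machines exchange text at all.
    print(ord('A'), ord('\\'))   # 65 92
    # The escape character (ASCII 27) opens an "escape sequence" -- here,
    # the ANSI sequence that makes terminal text bold.
    ESC = '\x1b'
    print(ESC + '[1m' + 'bold' + ESC + '[0m')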

 In the weenieworld of data processing, he is a minor deity.

We had guessed Bemer would be reassuring about the Y2K problem.

Our first question is why the heck he recently moved from a big city all the way out to East Bumbleflop, U.S.A.

It's a good place to be next New Year's Eve, he says. From a kitchen drawer he extracts two glass cylinders about the size of the pneumatic-tube capsules at a drive-through teller. Each is filled with what appears to be straw.

"They're Danish," he says. "They cost $500. We ran water with cow[poop] through them and they passed with flying colors."

They're filters, to purify water. If Y2K is as bad as he fears, he says, cocking a thumb toward his backyard, "we can drain the lake."

Bemer is 79. He looks flinty, like an aging Richard Boone still playing Paladin.

He has started a company, Bigisoft, that sells businesses a software fix for the Y2K problem. So, for selfish reasons, he doesn't mind if there is widespread concern over Y2K, though he swears he really thinks it is going to be bad. That's why he has requested that we not mention the town in which he lives. He doesn't want nutballs descending on him in the hellish chaos of Jan. 1, somehow blaming him.

 Who, then, is to blame?

Bemer rocks back in his chair and offers a commodious smile.

 In one sense, he says, he is.

 Binary Colors

In the late 1950s, Bemer helped write COBOL, the Esperanto of computer languages. It was designed to combine and universalize the various dialects of programming. It also was designed to open up the exploding field to the average person, allowing people who weren't mathematicians or engineers to communicate with machines and tell them what to do. COBOL's commands were in plain English. You could instruct a computer to MOVE, ADD, SEARCH or MULTIPLY, just like that.

It was a needed step, but it opened the field of programming, Bemer says, to "any jerk."

"I thought it would open up a tremendous source of energy," he says. "It did. But what we got was arson."

There was no licensing agency for programmers. No apprenticeship system. "Even in medieval times," Bemer notes dryly, "there were guilds." When he was an executive at IBM, he said, he sometimes hired people based on whether they could play chess.

There was nothing in COBOL requiring or even encouraging a two-digit year. It was up to the programmers. If they had been better trained, Bemer says, they might have known it was unwise. He knew.

He blames the programmers, but he blames their bosses more, for caving in to shortsighted client demands for cost-saving.

"What can I say?" he laughs. "We're a lousy profession." . . .


The longer a program is used, the larger the database and supporting material that grow around it. If, say, a program records and cross-references the personnel records in the military, and if the program itself abbreviates years with two digits, then all stored data, all files, all paper questionnaires that servicemen fill out, will have two-digit years. The cost of changing this system goes way beyond the cost of merely changing the computer program.

It's like losing your wallet. Replacing the money is no sweat. Replacing your credit cards and ATM card and driver's license and business-travel receipts can be a living nightmare.
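
That is why so much of the later repair work settled for guessing the century at read time rather than re-keying decades of stored records. The sketch below is hypothetical, not drawn from the article: a Python version of the "windowing" trick many Y2K fixes used, with an arbitrary pivot of 30.

    PIVOT = 30   # hypothetical cutoff: 00-29 read as 2000s, 30-99 as 1900s

    def expand_year(two_digit_year):
        """Guess the century for a legacy two-digit year ("windowing")."""
        if two_digit_year < PIVOT:
            return 2000 + two_digit_year
        return 1900 + two_digit_year

    # Legacy records, with years stored in two digits as described above.
    print([expand_year(y) for y in (45, 67, 99, 1)])   # [1945, 1967, 1999, 2001]

The stored files, and the paper forms behind them, still carry only two digits; the window merely postpones the ambiguity to a different pair of years.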

And so, even after computer memory became cheaper, and data storage became less cumbersome, there was still a powerful cost incentive to retain a two-digit year. Some famously prudent people programmed with a two-digit date, including Federal Reserve Chairman Alan Greenspan, who did it when he was an economics consultant in the 1960s. Greenspan sheepishly confessed his complicity to a congressional committee last year. He said he considered himself very clever at the time. . . .


A group did adopt a written standard for how to express dates in computers.

 We are looking at it now.

It is a six-page document. It is so stultifying that it is virtually impossible to read. It is titled "Federal Information Processing Standards Publication 4: Specifications for Calendar Date." It is dated Nov. 1, 1968, and took effect on Jan. 1, 1970, precisely when Brooks says the lines on the graph crossed, precisely when a guiding hand might have helped.

On Page 3, a new federal standard for dates is promulgated. . . .

Federal Information Processing Standards Publication 4, Paragraph 4 and Subparagraph 4.1, is another of those statements. Here it is, in its entirety:

Calendar Date is represented by a numeric code of six consecutive positions that represent (from left to right, in high to low order sequence) the Year, the Month and the Day, as identified by the Gregorian Calendar. The first two positions represent the units and tens identification of the Year. For example, the Year 1914 is represented as 14, and the Year 1915 is represented as 15.

 Ah.

 The Y2K problem.

 Set in stone.

 By the United States government.
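
For concreteness, a small sketch (again Python, illustrative only) of the six-position code the standard prescribes, and of the collision it builds in at the century boundary:

    from datetime import date

    def fips4_code(d):
        """Six consecutive positions, high to low order: YYMMDD."""
        return "%02d%02d%02d" % (d.year % 100, d.month, d.day)

    print(fips4_code(date(1914, 7, 18)))   # "140718" -- "the Year 1914 is represented as 14"
    print(fips4_code(date(2014, 7, 18)))   # "140718" -- the same code, a century apart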

FIPS 4, as it was called, was limited in scope. It applied only to U.S. government computers, and only when they were communicating from agency to agency. Still, it was the first national computer date standard ever adopted, and it influenced others that followed. It would have affected any private business that wanted to communicate with government computers. It might have been a seed for change, had it mandated a four-digit year. . . .
