REH forwarded:
> Questioning the calendar
>    By Stephen Jay Gould
...
> MYSTERIES OF THE CALENDAR
> Why do we base calendars on cycles at all? Why do we recognize a
> thousand-year interval with no tie to any natural cycle?

Being "a distinguished professor of zoology", Stephen Jay Gould should know
why:  We count in the decimal system because we have 10 fingers.  Thus, we
have partitioned the time scale into decades, centuries and millennia of
10, 100 and 1000 years.  If we had 8 fingers, we would count in the
octal system, and a millennium would have 8^3 = 512 years.  Computers have
2 'fingers', so to speak (voltage states 0 and 1), so they count in the
binary system.  Decimal numbers and calculations have to be programmed,
and that's where the millennium bug enters the game.  Merely 4 bytes would
be enough to encode the seconds of 136 years (starting e.g. from 01-01-1904)
and to calculate the _4-digit_ year, month, day, hour, minute and second
from this 4-byte value.  Actually, that's how Macintosh computers handled
time data from the start -- that's why Macs don't have the Y2K bug.
Unfortunately, the programmers of IBM et al. didn't have this idea, so they
wasted 2 bytes for the _2-digit_ year only, and that's why we'll be in a mess
pretty soon.  It would only be fair to use a part of Gates' $80 billion to
clean up the mess...
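The arithmetic above is easy to check.  Here's a minimal Python sketch (the
function name is my own invention, not a real Mac OS API) showing that an
unsigned 4-byte seconds counter starting at 01-01-1904 spans about 136 years
and trivially yields full 4-digit-year dates past 2000:

```python
from datetime import datetime, timedelta

# The classic Macintosh epoch, as mentioned above.
EPOCH = datetime(1904, 1, 1)

# An unsigned 32-bit (4-byte) counter holds 2**32 distinct seconds.
span_years = 2**32 / (365.25 * 24 * 3600)
print(round(span_years))  # -> 136

def mac_seconds_to_date(secs: int) -> datetime:
    """Turn a 4-byte seconds count into a full date (hypothetical helper)."""
    return EPOCH + timedelta(seconds=secs)

# 96 Julian years' worth of seconds after the 1904 epoch lands on 2000-01-01,
# with the 4-digit year intact -- no two-digit rollover anywhere.
print(mac_seconds_to_date(3_029_529_600))  # -> 2000-01-01 00:00:00
```

The point is simply that one binary counter plus a conversion routine gives
unambiguous dates for well over a century, whereas storing the year as two
decimal digits throws that information away.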

--Chris
