Steve,

Thanks for the interesting story. It seems the same catfight is going on today about whether to kill the whole leap-second idea. I'll see your astronomical story and raise you another.

The ancient Maya were fairly good astronomers and got the year right to within a quarter day of 365. But, the base-20 number system came out even with 18 months of 20 days, or 360 days, leaving five extra days (the Wayeb) in the 365-day haab. The astronomers battled with the priests, who kept a sacred calendar (the tzolkin) of 13 cycles of 20 days, or 260 days. The least common multiple of 365 and 260 days is 18,980 days, exactly 52 haab years, called the Calendar Round. So what to do about the Wayeb? Apparently, they partied, slaughtered a neighboring tribe and had a good time. Far as I know, the Wayeb is still in the Maya calendar and has been since before CE 800. Our leap second is only 33 years old.
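The Calendar Round arithmetic checks out in a few lines of Python. This sketch uses the conventional cycle lengths (365-day haab, 260-day tzolkin); the constant names are my own, not anything from the Maya literature.

```python
from math import gcd

# Conventional cycle lengths (names here are illustrative only)
HAAB_DAYS = 365      # solar calendar: 18 months of 20 days + 5 Wayeb days
TZOLKIN_DAYS = 260   # sacred calendar: 13 cycles of 20 named days

# The Calendar Round is the least common multiple of the two cycles:
# the two calendars realign only after this many days.
calendar_round = HAAB_DAYS * TZOLKIN_DAYS // gcd(HAAB_DAYS, TZOLKIN_DAYS)

print(calendar_round)              # 18980 days
print(calendar_round / HAAB_DAYS)  # 52.0 haab years
```

Since gcd(365, 260) = 5, the cycles drift apart for exactly 52 haab years before repeating.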

Yes, I know about the bad old days prior to 1972. Every few months the retrospective frequency deviation between UT1 and the atomic timescale was published in Proc. IEEE. If you didn't have a cesium clock and microstepper, you didn't play the game. I don't remember when the original WWV/H timecode format came into use, but it was well before Pub. 432 in 1979. The current format was adopted when the station timecode generators were replaced circa 1986.

Dave

Steve Allen wrote:
David L. Mills wrote:

The DUT1 does take on negative and positive values as the TAI offset
wanders; however, the broadcast timecodes of WWV/H and WWVB have only
four bits, one of them the sign. There is no provision for a delete
second. So, if UT1 moves north instead of south, more than 0.7 s
would be lost.
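The four-bit limit described above can be sketched in a few lines. This is an illustrative sign-magnitude encoding under the stated constraints (one sign bit, three magnitude bits, 0.1 s units), not the actual WWV/WWVB bit layout; the function name and return convention are my own.

```python
def encode_dut1(dut1_seconds: float) -> tuple[int, int]:
    """Encode DUT1 as (sign_bit, magnitude) in 0.1 s units.

    Illustrative only: with three magnitude bits the largest
    representable magnitude is 7 tenths, i.e. +/-0.7 s.
    """
    tenths = round(abs(dut1_seconds) * 10)
    if tenths > 7:
        raise ValueError("DUT1 outside the +/-0.7 s encodable range")
    sign = 1 if dut1_seconds < 0 else 0
    return sign, tenths

print(encode_dut1(0.3))    # (0, 3)
print(encode_dut1(-0.7))   # (1, 7)
```

Anything past 0.7 s in either direction raises an error, which is the disconnect the rest of the thread turns on: the field simply cannot represent a larger offset, and there is no delete-second provision to bring DUT1 back inside the range from the other side.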


In that 0.7 s limit is an interesting story of disconnect between
international scientific unions.  In 1969 the working group of the
CCIR (ancestor of the ITU-R) responsible for broadcast time signals
ignored calls by scientific unions to hold an interdisciplinary
conference to address problems with the existing scheme of elastic
seconds and millisecond leaps.  The working group recommended leap
seconds, and in early 1970 the CCIR plenipotentiary assembly agreed to
implement leap seconds less than two years later.

At the IAU General Assembly in the summer of 1970 it was revealed that
the CCIR had not even sent a letter to the IAU informing them of the
impending change in broadcast time signals.  Therefore the IAU had no
standing on which to make any official response to the CCIR.  The
proceedings of the 1970 IAU GA are very interesting historical
reading, for it reveals that the supposedly genteel and orderly
interaction of international agencies had failed utterly.  Despite the
attempts to filter the proceedings suitably for publication, they give
the impression that there was quite a bit of shouting going on.

In the absence of input from the astronomers, the physicists and radio
scientists apparently presumed that it would be possible to hold UTC
within 0.7 seconds of UT1, and the NBS created the scheme we still
have with WWV which cannot encode larger differences.  However in 1972
the earth's crust was spinning almost as slowly as it had in 1913
(ironically the crustal rotation has accelerated almost uniformly
since 1972).  Leap seconds were needed about every 350 days, but the
measurement, data processing, and prediction of earth rotation was
quite immature (by current standards).  There was no way to ensure
that DUT1 could be kept under 0.7 s.

The leap second scheme of the CCIR went into effect in 1972, and the
value of DUT1 almost immediately went outside the specified limits.
In 1973 at the next IAU GA the astronomers offered a number of
improvements to the rules for UTC, and the CCIR incorporated them in
the 1974 version of the UTC standard.  For operational purposes those
are the same rules still in effect today.

Is there a moral in this story which is relevant to the current
efforts to redefine UTC?


_______________________________________________
questions mailing list
[email protected]
https://lists.ntp.isc.org/mailman/listinfo/questions
