On Dec 18, 2005, at 3:58 PM, Markus Kuhn wrote:

Why is it important that our clocks give a +/- 30 minutes
approximation of local astronomical time?

Unimportant for some purposes.  Important for others.  Who ranks the
relative merit?

The key issue is surely a question of interoperability.  As it stands
now (and has for all of human history), the fundamental standard is
the Earth itself (a mirror image of the Sun in the sky).  This was
true for local apparent time and is true for mean standard time
zones.  The ITU proposal would replace an extremely portable and
durable and recoverable standard (as simple as measuring noon in a
particular location on a given date) with a completely ad hoc
relationship to some remote ensemble of hyper-technical devices.  A
bright middle school student could synchronize a clock against mean
solar time - from first principles.  On the other hand, a Nobel
laureate might botch the recovery of the monotonic count that is TAI
should that count ever become lost.
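To make that "first principles" recovery concrete: time the Sun's meridian transit (local apparent noon), then subtract the equation of time to read off local mean time.  A hedged sketch - the equation-of-time formula below is a common low-precision approximation (good to roughly a minute), and the function names are mine, not any standard API:

```python
import math

def equation_of_time_minutes(day_of_year: int) -> float:
    # Classic low-precision approximation of apparent-minus-mean
    # solar time, in minutes; B is centered near day 81.
    b = 2 * math.pi * (day_of_year - 81) / 365.0
    return 9.87 * math.sin(2 * b) - 7.53 * math.cos(b) - 1.5 * math.sin(b)

def local_mean_time_at_apparent_noon(day_of_year: int) -> float:
    # At the instant the Sun crosses the meridian (apparent noon),
    # local mean time reads 12h minus the equation of time.
    return 12.0 - equation_of_time_minutes(day_of_year) / 60.0
```

Add the observer's longitude (4 minutes per degree) and you have mean time in any standard zone - no remote ensemble of hyper-technical devices required.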

Sure, there seem to be clear advantages in having midnight happen when
most people are asleep, or at least outside extended business
hours. So having everyone on UT is not very attractive for those
living more than +/-3 hours from the prime meridian. But since most
of us sleep at least 6 hours and are not (supposed to be ;-)
working for at least 15 hours each day, such a simple requirement
could still be achieved with just 3-5 timezones worldwide.

Sure, why not?  But this is completely orthogonal to the question of
the relative importance of leap seconds.  No matter how wide we make
the time zones, no matter how large an amplitude we allow for
daylight saving/"summer" time, these are still periodic (or purely
constant) effects.  A leap jump - whether a second or an hour -
remains a secular effect.

Which is bigger?  An hour per year?  Or two milliseconds per century?

Ill-posed question.  It ain't an hour per year versus an hour after
600 years.  Spring forward AND fall back - what it REALLY amounts to
is zero hours (+1-1) per year.  As small as 2 ms per 100 years is,
zero per anything is smaller.  The periodic jumps are smaller and
more negligible than the need for leap seconds.
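The periodic-versus-secular distinction is just arithmetic, and a toy sketch may make it concrete.  The function names and the one-jump-per-year figure below are illustrative assumptions of mine, not data:

```python
def dst_net_shift_hours(years: int) -> int:
    # Daylight saving: +1 hour each spring, -1 hour each fall.
    # The jumps are periodic, so the net secular shift is zero
    # no matter how many years elapse.
    return sum(+1 - 1 for _ in range(years))

def leap_net_shift_seconds(jumps_per_year: float, years: int) -> float:
    # Leap seconds are all inserted in the same direction, so they
    # accumulate without bound - a secular effect, however small
    # each individual step is.
    return jumps_per_year * years * 1.0
```

Six hundred years of DST nets to zero; six hundred years of one-way one-second jumps nets to ten minutes.  Periodic and secular effects simply aren't commensurable quantities.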

The crudest approach would probably be

  a) N+S America:               use local time of Cuba          (~ UT - 5.5 h)
  b) Europe/Africa/Middle east: use local time of Poland/Greece (~ UT + 1.5 h)
  c) Asia + Australia:          use local time of Thailand      (~ UT + 6.5 h)

Sure, the hours of darkness would vary substantially within each of
these zones. But they do already *today* for much of the world,
thanks to summer/winter time. China understood this a long time ago.

I like the chutzpah of it!  The pure political theater of trying to
convince Washington to keep Havana time, or the serious surrealism of
the Senegalese Assemblee Nationale debating the adoption of
Peloponnesian Mean Time.

Whatever China understands, it amounts to a constant offset, not the
slope of a trend line.  We aren't talking about apples and oranges,
we're talking about apples and the rate of change of kumquats.  In
fact, it is remarkable that the existence of a significant
acceleration (second derivative or quadratic effect) in the need for
leap seconds is being asserted as a bogus justification for not
issuing leap seconds at all.
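As a back-of-the-envelope sketch of that acceleration - the 2 ms/century figure is the one quoted above; the constants and function name are my own illustrative assumptions, not authoritative geophysics:

```python
DAYS_PER_CENTURY = 36525.0   # Julian days per century
LOD_EXCESS_RATE = 0.002      # assumed: day lengthens ~2 ms per century

def accumulated_offset_seconds(centuries: float) -> float:
    # If the excess length of day grows linearly at the rate above,
    # the average excess over the interval is half the final value,
    # so the total accumulated clock offset grows quadratically.
    mean_excess = 0.5 * LOD_EXCESS_RATE * centuries   # seconds per day
    return mean_excess * DAYS_PER_CENTURY * centuries
```

Under these assumptions the offset after two centuries is four times the offset after one - the quadratic (second-derivative) signature.  That the effect accelerates is an argument for a scheduling mechanism like leap seconds, not against one.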

Rob Seaman
National Optical Astronomy Observatory
