Re: Epsilon proposal vs integer UTC-TAI

2003-01-30 Thread Mark Calabretta
On Thu 2003/01/30 18:52:51 -, Ed Davies wrote
in a message to: [EMAIL PROTECTED]

>But, would it be practical to change now?  For example, is it not
>likely that there are many existing systems which assume UTC-TAI
>is an integer?

Yes, this is what I meant when I said that the burden will fall on
clockmakers and timekeeping software, and why I expect a decadal
lead-in.

However, the French Revolutionaries did it faster!
http://webexhibits.org/calendars/calendar-french.html
http://www.ce.ufl.edu/~chiep/metric.html

Mark Calabretta
ATNF



Re: Calabretta's 86400 s + epsilon day proposal

2003-01-30 Thread Mark Calabretta
On Thu 2003/01/30 15:39:26 -, Markus Kuhn wrote
in a message to: [EMAIL PROTECTED]

>If you make every UTC day 86400 + epsilon(date) seconds long, then life
>gets more difficult for people who broadcast standard frequencies such
>as 50 Hz TV sync signals, because now you can't simply say that
>you start a new TV frame exactly at the start of a new second (with UTC
>you can, even across a leap second!).

An interesting point.  I don't claim to have any expertise in this area,
but I can think of several possible answers:

1) If it's not important for such signals to be synchronized with the
   start-of-day (i.e. mean solar day, and I can't see straightaway why
   it would be), then just tie them to TAI.

2) If it is important, then eventually the lengthening length-of-day
   will matter.  If the aim is to have an integral number of cycles in
   one mean solar day (again, I can't imagine why) then the simplest
   solution would be to adjust the frequency ever so slightly.  This
   could either go up or down so that up to 10ms was added or subtracted
   (half the 50Hz period).  The relative frequency change is 10ms in
   86400+Epsilon, where Epsilon only increases, i.e. less than 1 part
   in 8,640,000.

> Note that the vast majority of
>critical frequencies we have are an integer number of hertz. It is very
>convenient to adjust these by triggering an oscilloscope from the 1
>pulse-per-second output of an atomic clock or time signal receiver and
>then observe the phase wander of your oscillator. Such adjustments would
>break down if UTC clocks would start to have now 1 pulse-per-(something
>that is an SI second most of the time).

From point (1) above, there would be a predictable glitch once per day -
it doesn't seem too bad to me.

From point (2) above, the phase drift would be at most 180 degrees
per 86400 s, or 0.125 deg per min.
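Both figures check out numerically; a quick back-of-envelope script (mine, not part of the thread):

```python
# Verify the two numbers quoted above for a 50 Hz signal tied to an
# 86400 s (+ Epsilon) day; Epsilon only makes the bound tighter.

MAINS_HZ = 50.0
HALF_PERIOD = 0.5 / MAINS_HZ      # 10 ms: the largest shift ever needed
DAY = 86400                       # seconds, ignoring Epsilon

# Point (2): relative frequency change to absorb half a period per day
rel_change = HALF_PERIOD / DAY
assert abs(rel_change - 1 / 8_640_000) < 1e-12   # "1 part in 8,640,000"

# Uncorrected phase drift: half a cycle (180 degrees) per day
deg_per_min = 180 / (DAY / 60)
assert deg_per_min == 0.125       # degrees per minute, as stated
```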

>Even if, a procedure would still have to be put into place to define the
>function epsilon(date). How long in advance would IERS have to announce
>this function?

Yes, the current leap-second machinery could easily be adapted.  Epsilon
updates would not be as frequent as the current leap-second announcements.
However, as I mentioned originally, if non-secular variations in the
Earth's rotation were ignored, thus potentially allowing UT1 and UTC
to depart by several seconds over short periods (which I personally think
is reasonable), then Epsilon would be predictable because the Earth's
secular deceleration is accurately known.

>If I understood you correctly, all you propose is to replace the full
>leap second at the end of a day every few years by a couple of leap
>microseconds at the end of every UTC day, right? That sounds

A couple of thousand leap-microseconds, right.  This is what the
insertion of Epsilon = 2000us would look like (500us ticks), conventional
view on the left and my preferred, but equivalent view on the right:

   UTC (conventional)        UTC (preferred)

   23:59:59.999000           23:59:59.999000
   23:59:59.999500           23:59:59.999500
   23:59:60.000000           24:00:00.000000   <- Epsilon insertion begins
   23:59:60.000500           24:00:00.000500
   23:59:60.001000           24:00:00.001000
   23:59:60.001500           24:00:00.001500
   00:00:00.000000           00:00:00.000000   <- Epsilon insertion ends
   00:00:00.000500           00:00:00.000500
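A minimal sketch of the equivalence between the two columns (the function names and tuple representation are mine, purely illustrative):

```python
# Map a conventional-view tick (which may read 23:59:60.uuuuuu during
# Epsilon insertion) to the equivalent extended view (24:00:00.uuuuuu),
# and back.  Times are (hour, minute, second, microsecond) tuples.

def to_extended(h, m, s, us):
    """Conventional view -> extended view."""
    if (h, m, s) == (23, 59, 60):
        return (24, 0, 0, us)
    return (h, m, s, us)

def to_conventional(h, m, s, us):
    """Extended view -> conventional view."""
    if h == 24:
        return (23, 59, 60, us)
    return (h, m, s, us)

# The 500us ticks inside a 2000us Epsilon, as in the table above:
assert to_extended(23, 59, 60, 500) == (24, 0, 0, 500)
assert to_conventional(24, 0, 0, 1500) == (23, 59, 60, 1500)
assert to_extended(0, 0, 0, 0) == (0, 0, 0, 0)  # outside Epsilon: unchanged
```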

>slightly more complicated to me and doesn't fundamentally eliminate the
>problem that there still will exist UTC time stamps after 23:59:59.999...

It does eliminate several fundamental problems, as originally listed,
and introduces others which are not so fundamental, as originally listed.
On balance, I think the wins exceed the losses, and I am thinking
particularly of the longer term.  I don't see that it matters that UTC
timestamps might exceed 24:00:00.

>You could alternatively stretch the length of the UTC second at the end
>of each day, and this way your proposal would be similar to UTS, except
>that you make the correction every day whereas I prefer to limit it to
>the vicinity of each current UTC leap second. But pulse-per-second
>signals would experience a phase jump at the end of every day ... :-(

UTS seems like a practical solution, or perhaps "bandaid" might be a
better description, for systems which don't handle leap-seconds properly.
Given that at least some of us don't want to see any changes made to
UTC in the short term I would endorse it as an interim fix for such
systems.

Mark Calabretta
ATNF



Re: "names for points in time"

2003-01-30 Thread Ken Pizzini
On Fri, Jan 31, 2003 at 12:28:28AM +, [EMAIL PROTECTED] wrote:
> >   I think
> > the historical fact that people used to treat the earth's rotation
> > as the ultimate reference for the passage of time is not the most
> > relevant factor in the determination of how we should measure time
> > going forward, though history and tradition do need to be honored
> > and not gratuitously thrown away.
>
> Here we may have to agree to disagree a bit.

Perhaps, but I am still not sure that we really disagree... my main
goal in statements like the above is to pick at the question of
"why is spatial orientation important"; sometimes the only way I
know of to pry at latent assumptions, trying to shed light on them,
is to belligerently question some closely related irregularity.

I'm not trying to say that tying time to earth orientation is
fundamentally wrong; I am trying to ask that the reasons that people
feel that the relationship should exist should be made explicit.
_Maybe_ with the reasons made explicit, they will be seen to be
spurious, or maybe they will be seen to be important, whether for
technical or political/cultural reasons.  But I can't guess which
without prying the assumptions out into the light of day.


> Much of the driving force behind calendar/time reform has been driven
> by increased understanding of the spatial orientation of the Earth.

Right, that is the history of how we got where we are.  It is a
description of the tradition.  It does not illuminate any technical
merits or demerits of the situation.

> This increased tie between "names for points in time" and "spatial
> orientation of Earth" has been one of the fundamental factors behind
> calendar/time reform for a very good reason: it mattered to key people.

And the key people called upon their experts to solve specific
identified problems.  My beef with this is that I know how easy it
is to answer the question asked without questioning what the real
motive behind the question is.  I have several times experienced a
situation where I would answer a question as-asked, and later find out
that the querent could have both saved a lot of effort and obtained
a better result if only they had asked me "the question before".
What I mean by that is: they had hit a problem and figured out how
they could solve the problem if they could solve some subproblem;
I was asked a question about the subproblem rather than about the
first-experienced problem.  Sometimes I'm paying attention and notice
a strange "why would you want to do that?" aspect to such second-order
questions and seek out what it is that they're _really_ trying to do;
this leads to much happier outcomes, and is very similar to what I
am attempting to do here on LEAPSECS --- understand what the basic
problems that need solving are, rather than trying to just come up
with a fix for the challenge du jour.


> And the vast majority of the rest put up with years not having
> > 360 days, or lunar months not staying in sync with the solar year,
> mean solar vs local solar time, time zones, or leap days ... some
> objected when these reforms were introduced (a few still object)
> but most people have come around to the same idea.

There are some assumptions built into our calendar system and
its leap year rules as well (such as: the vernal equinox should fall
within two days of March 21st).  There are tradeoffs involved in
any attempt to make any pair of incommensurate methods of counting
(such as SI seconds and days, or days and years) peacefully co-exist.
The keeping of tradition is a valid consideration, I just want the
requirements of a system, and the reasons for why the requirements
are deemed to be important, to be clearly laid out.

> Now we have some people who seem to dislike uncertainty, wanting to
> reduce the relationship between "names for points in time" and "spatial
> orientation of Earth".  I think it would be an unfortunate step backward
> if our calendar/time became less tied to the "spatial orientation of
> Earth", IMHO.

Okay.  I disagree with the "step backward" part of that statement
(even though it *is* a statement of opinion), but I can accept that
you would find it unfortunate to break the connection between the two.
So if I can hazard a restatement of your opinion into a form suitable
to what I'm after:
  A time standard should maintain a meaningful relationship between its
  measure of a day and the orientation of earth with respect to the sun
  because: on a technical level the choice of maintaining or breaking
  this coupling is a purely arbitrary one, and the choice of keeping
  the two coupled gives continuity with long standing tradition.

Is that accurate?  Did I miss something?  I find it a pretty good
argument myself, unless someone can show that the claim about the
choice being arbitrary from a technical point of view is wrong.
And while I don't personally see any reason why that claim should
be wrong, at 

UTC, leap seconds, and the BIPM, IAU, ITU et.al.

2003-01-30 Thread Neal McBurnett
Thanks to Steve for the reference:

  http://danof.obspm.fr/IAU_resolutions/Resol-UAI.htm

I reproduce below a resolution of more specific relevance to this
discussion that was made at that meeting of the International
Astronomical Union.  I'm glad the IAU is looking at a wider range of
options than the International Telecommunication Union
Radiocommunication Study Group is.

The IAU working group is due to report in mid-2003.  Who knows more
about this IAU working group and their discussions?

This brings up the broader question of who is involved in this
decision making process.

ITU CCIR Recommendation 460-4 (1986) is available at
  http://www.cl.cam.ac.uk/~mgk25/volatile/ITU-R-TF.460-4.pdf

It is "RECOMMENDATION 460-4 STANDARD-FREQUENCY AND TIME-SIGNAL
EMISSIONS" and contains this language:

   UTC is the time-scale maintained by the BIPM, with assistance from
   the International Earth Rotation Service (IERS), which forms the
   basis of a coordinated dissemination of standard frequencies and
   time signals.

That leads me to believe that while the ITU wrote the recommendation
on how to *disseminate* UTC, the actual legal basis for determining UTC
rests with the BIPM, which is part of the General Conference on
Weights and Measures, which also handles the International System of
Units (SI).

  http://www.bipm.fr/enus/

Neal McBurnett http://bcn.boulder.co.us/~neal/
GPG/PGP signed and/or sealed mail encouraged.  Keyid: 2C9EBA60

IAU Resolutions Adopted at the 24th General Assembly (Manchester, August 2000)

 Resolution B2 Coordinated Universal Time

 The XXIVth International Astronomical Union General Assembly,

 Recognising

   1. that the definition of Coordinated Universal Time (UTC) relies
   on the astronomical observation of the UT1 time scale in order to
   introduce leap seconds,
   2. that the unpredictability of leap seconds affects modern
   communication and navigation systems,
   3. that astronomical observations provide an accurate estimate of
   the secular deceleration of the Earth's rate of rotation,

 Recommends

   1. that the IAU establish a working group reporting to Division I
   at the General Assembly in 2003 to consider the redefinition of
   UTC,
   2. that this study discuss whether there is a requirement for leap
   seconds, the possibility of inserting leap seconds at
   pre-determined intervals, and the tolerance limits for UT1-UTC, and
   3. that this study be undertaken in cooperation with the
   appropriate groups of the International Union of Radio Science
   (URSI), the International Telecommunications Union (ITU-R), the
   International Bureau for Weights and Measures (BIPM), the
   International Earth Rotation Service (IERS), and relevant
   navigational agencies.

On Thu, Jan 30, 2003 at 04:25:22PM -0800, Steve Allen wrote:
> The resolutions that have established the change have happened over
> the past four meetings of IAU General Assembly.
>
> The most recent set of resolutions which switched things from the old,
> non-inertial definitions to the new ones is visible at
> http://danof.obspm.fr/IAU_resolutions/Resol-UAI.htm



Re: "names for points in time"

2003-01-30 Thread ut1-mail
> But "names for points in time" can have a meaningful relationship
> to the flow of time in the physical universe without necessarily
> having anything to do with spatial-orientation-of-earth.

Yes, one can construct meaning out of non-spatial-orientation-of-earth
related "names for points in time".  TAI is a nice time scale
that I'm rather fond of in certain aspects.  :-)

But what should "carry the day" (pun intended) in terms of things
like civil time, UTC, etc?

>   I think
> the historical fact that people used to treat the earth's rotation
> as the ultimate reference for the passage of time is not the most
> relevant factor in the determination of how we should measure time
> going forward, though history and tradition do need to be honored
> and not gratuitously thrown away.

Here we may have to agree to disagree a bit.

Much of the driving force behind calendar/time reform has been driven
by increased understanding of the spatial orientation of the Earth.
This increased tie between "names for points in time" and "spatial
orientation of Earth" has been one of the fundamental factors behind
calendar/time reform for a very good reason: it mattered to key people.
And the vast majority of the rest put up with years not having
360 days, or lunar months not staying in sync with the solar year,
mean solar vs local solar time, time zones, or leap days ... some
objected when these reforms were introduced (a few still object)
but most people have come around to the same idea.

Now we have some people who seem to dislike uncertainty, wanting to
reduce the relationship between "names for points in time" and "spatial
orientation of Earth".  I think it would be an unfortunate step backward
if our calendar/time became less tied to the "spatial orientation of
Earth", IMHO.

For those who feel that "names for points in time" should be less
tied to the "spatial orientation of Earth", use something TAI-like.
For those who have problems with the chaotic / complex motions of
things such as the Earth, use something TAI-like.  For those who worry
about the unpredictable nature of leap seconds (or even anti-leap
seconds), use something TAI-like.

For me personally: my computers and my precision clocks deal with
leap seconds rather well.  And I have no problem with accepting that
future "names for points in time" are uncertain.  I think that is a
fair cost for using a time scale that is tied into the "spatial
orientation of Earth".

All the above is just my humble opinion ... your opinion may vary.
I accept and respect that.

chongo () /\oo/\



Re: Telescope pointing and UTC

2003-01-30 Thread Steve Allen
On Thu 2003-01-30T17:04:30 -0700, Neal McBurnett hath writ:
> I searched the IAU web site, and the Astrometry commission at
> http://center.shao.ac.cn/IAU_COM8/ and found no reference to recent
> action.  Can you send a pointer with more specifics?
>
> I think sending it to the whole list would be helpful.

The resolutions that have established the change have happened over
the past four meetings of IAU General Assembly.

The most recent set of resolutions which switched things from the old,
non-inertial definitions to the new ones is visible at
http://danof.obspm.fr/IAU_resolutions/Resol-UAI.htm
Note the many instances of the date 1 January 2003.

Some resolutions before that are at
http://danof.obspm.fr/t5nwslt1.html

Some generally useful links

about ICRS
http://hpiers.obspm.fr/webiers/results/icrf/Icrs.html

about ICRF
http://www.iers.org/iers/products/icrf/
http://hpiers.obspm.fr/icrs-pc/

about HCRF
http://astro.estec.esa.nl/Hipparcos/catalog.html

about ITRS
http://www.iers.org/iers/earth/itrs/itrs.html

about ITRF
http://www.iers.org/iers/products/itrf/
http://lareg.ensg.ign.fr/ITRF/

--
Steve Allen  UCO/Lick Observatory   Santa Cruz, CA 95064
[EMAIL PROTECTED]  Voice: +1 831 459 3046 http://www.ucolick.org/~sla
PGP: 1024/E46978C5   F6 78 D1 10 62 94 8F 2E49 89 0E FE 26 B4 14 93



UTC reference

2003-01-30 Thread Rob Seaman
I don't have time at the moment to comment on the more substantial
discussions.  A quick clarification:

I said:

>> Leap seconds are discussed very prominently in a very short document.

Ken Pizzini asks:

> Are you referring to the GPS World article which triggered the
> current discussion, or some other document that I should know about?

I was referring to the definition of UTC:

"Standard-frequency and Time-signal Emissions",
CCIR Recommendation 460-4 (1986)

This is available (for a price) from http://www.itu.int.  It has been
asserted that this document is the fundamental controlling international
agreement.  Since different countries still tie their civil timescales
to other standards such as GMT, it is arguable whether a civil time
standard exists at all.

Rob Seaman
National Optical Astronomy Observatory



Re: History of IEEE P1003.1 POSIX time

2003-01-30 Thread ut1-mail
> Thanks for giving us this history.

You are most welcome.

The POSIX time dance was an interesting lesson in damage mitigation
vs. the desire to do something better than existing common practice.

During the process some of the active POSIX P1003.1 members ended up
learning a lot more than they expected to learn about time.  :-)
I once gave a 45-minute BOF/evening talk on time scales and the
history of time in an effort to convince some people that time
was not a simple topic.

Of course the story of how Coordinated Universal Time came to
be abbreviated UTC is another interesting lesson in politics.  :-)

> So the 32-bitness of time_t was at that time wired into Posix?

Nearly all implementations made time_t an int32_t, though this was not
required.  There was at least one implementation (Cray?) that used
int64_t.  Some implementations tried to get away with u_int32_t.  All
that you could portably depend on was 31 unsigned bits.

Even today many implementations use a 'long int' which is a 32 bit signed
value on many architectures.
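The 31-usable-bit limit is easy to illustrate: a signed 32-bit count of (leap-second-free) seconds since 1970 runs out in 2038. A small Python check (Python's UTC conversion, like POSIX, ignores leap seconds):

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1   # largest signed 32-bit time_t value
overflow = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(overflow)         # 2038-01-19 03:14:07+00:00
```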

> Quite so, although if the relationship is vague enough (i.e. the synodic
> month vs. the civil month), then we can sometimes force clean behavior.

Yes, we agree.  Leap years are one such example of forcing a cleaner
behavior.  IMHO, leap seconds are another.

=-=

Can anyone hazard a guess as to what the status of leap seconds
and time scales will be in the next few years?  Do people think
that the relevant standards bodies will make a change?  If so,
what do you think they might do?  Does any one particular proposal
have "an edge" where it matters - decision-wise?

chongo () /\oo/\



Re: History of IEEE P1003.1 POSIX time

2003-01-30 Thread John Cowan
[EMAIL PROTECTED] scripsit:

> As one of the main people who worked on time related aspects of the
> IEEE P1003.1 POSIX standard, I'd like to make a few comments about how
> it came to be the way it is now.

Thanks for giving us this history.

>This proposal was largely accepted.  However the formula taking
>into account 100/400 leap year rule was rejected as being not
>necessary since 32 bit signed timestamps would run out years
>before the year 2100.

So the 32-bitness of time_t was at that time wired into Posix?

> People should be used to our chaotic Universe.  If they want their names
> for "points in time" to have some vague relationship to their physical
> universe, then they should not demand that the process to determine the
> names for "points in time" be nice, clean, and infinitely predictable.

Quite so, although if the relationship is vague enough (i.e. the synodic
month vs. the civil month), then we can sometimes force clean behavior.

--
John Cowan  [EMAIL PROTECTED]  www.reutershealth.com  ccil.org/~cowan
Dievas dave dantis; Dievas duos duonos  --Lithuanian proverb
Deus dedit dentes; deus dabit panem --Latin version thereof
Deity donated dentition;
  deity'll donate doughnuts --English version by Muke Tever
God gave gums; God'll give granary  --Version by Mat McVeagh



Re: Leap seconds in the European 50.0 Hz power grid

2003-01-30 Thread Steve Allen
On Thu 2003-01-30T22:05:51 +, Markus Kuhn hath writ:
> > But the question arises as to why the spec
> > can't easily be changed to indicate that it is per TAI day.
>
> As long as UTC is as it is currently, you don't want to do this:

But I think that the further answer is this:
Should it be decided that civil time shall track TAI, there will be no
technical problem (and indeed, possibly great rejoicing) in changing
the specification.

For the purposes of power grids:

The current forms of UTC and TAI are both acceptable.
Ongoing changes in the length of a second are not acceptable.
Fractional leap seconds are not acceptable.

Here I perceive pieces of a checklist of civil time strategies vs.
technologies affected by them.  This could be valuable.

> Plus remember the remarks above that UTC was for a long time far more
> easily available than TAI in Europe.

I believe that is true almost everywhere, and is a root of the problem
at hand.

--
Steve Allen  UCO/Lick Observatory   Santa Cruz, CA 95064
[EMAIL PROTECTED]  Voice: +1 831 459 3046 http://www.ucolick.org/~sla
PGP: 1024/E46978C5   F6 78 D1 10 62 94 8F 2E49 89 0E FE 26 B4 14 93



History of IEEE P1003.1 POSIX time

2003-01-30 Thread ut1-mail
As one of the main people who worked on time related aspects of the
IEEE P1003.1 POSIX standard, I'd like to make a few comments about how
it came to be the way it is now.

First I'd like to tell you about the early consensus on time, before
any of the POSIX time specs were written.  I'm not saying I agree or
disagree with that consensus.  These were just the cards that
were "glued to the table" that our time sub-committee was forced
to deal with:

Initially there was a proposal to require the timestamps returned from
system calls such as time(2) to yield the exact number of seconds
since the time known as the Un*x epoch.  This calculation included
the known leap seconds at the time.

Unfortunately at the time nearly all Un*x implementations did NOT
take leap seconds into account.  For example, there were a non-trivial
number of file system i-nodes and dump tapes that used a "trivial
seconds since the Epoch" calculation.

Distributing leap second tables, and requiring hosts to be "on-line"
to receive them, was not considered realistic in those days (when
UUCP over slow-baud modems was common).  Expecting vendors and/or
system administrators to maintain a complete leap second table
(including leap seconds announced in advance) was likewise considered
unrealistic.

The POSIX decision was to preserve existing practice: the vast
majority of time-stamp calculation methods ignored leap seconds.
The decision was to continue to use a "trivial seconds since the
Epoch" calculation that ignored leap seconds.

It was also decided that IEEE P1003.1 POSIX should not demand that
a host's clock be accurate.  In those early days the system time was
typically set by the system administrator's watch.  Your typical
Un*x hardware clock drifted like hell.  Many hosts used nothing
more than the occasional system administrator adjustment to fix
a really bad clock.  Eyeballing a second hand of a wall clock
was typical.  Use of rdate (*ick*), let alone the much better
NTP protocol, was not a common practice.

The committee did not want the fact that the system clock may be
poorly set and/or rather inaccurate to make the system non-conforming.
They did not want to require vendors to fix or even improve their
clock systems.

In addition to these "glued to the table" cards, there were a number
of unfortunate attitudes:

"Don't confuse people with UTC.  Everyone uses GMT and knows
what it means".

"Let's not complicate things by worrying about the fact that
the year 2100 is not a leap year."

"You mean the year 2000 is, but 2100 is not a leap year?"

"Everyone knows there are only 60 seconds in a minute."

"I'm lucky if my system's clock is accurate to the minute, so
 I could care less about something as small as a leap second".

"It takes hours, sometimes days, for my EMail message to
 reach most people.  Why should I worry about something as
 small as a second?"

"What matters to me is just that POSIX systems produce the
 same ctime(3) string (i.e., "Wed Jun 30 21:49:08 1993\n") when
 given the same time(2) time-stamp."

"SI?  TAI?  UT1?  I'm having trouble with using UTC instead
 of good old GMT!".

 Given these cards:

1) We produced a formula that calculated "seconds since the Epoch"
   without respect to leap seconds.  I.e., the formula did not
   take into account known leap seconds when converting
   something like a 'struct tm' (where time is expressed in
   year, day, hour, minute, second) into a time-stamp.

   This proposal was largely accepted.  However the formula taking
   into account the 100/400 leap year rule was rejected as not
   necessary, since 32 bit signed timestamps would run out years
   before the year 2100.

2) We defined the epoch as "1970 Jan 1, 00:00:00 UTC".

   This was defeated and UTC was replaced with GMT.

3) We defined things like tm_sec (the number of seconds after
   the minute) as ranging from 00 thru 60.

   This was defeated and the limit of 59 was restored.
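The formula from point (1) can be sketched as below; this is the familiar POSIX-style expression without the rejected 100/400-year terms (my rendering, for illustration, not the normative text). Fields follow struct tm: tm_year counts years since 1900, tm_yday days since January 1 (0-based).

```python
import calendar

def seconds_since_epoch(tm_sec, tm_min, tm_hour, tm_yday, tm_year):
    """Trivial "seconds since the Epoch": leap seconds ignored, and every
    year divisible by 4 treated as a leap year (no 100/400 rule)."""
    return (tm_sec + tm_min * 60 + tm_hour * 3600 + tm_yday * 86400
            + (tm_year - 70) * 31536000 + ((tm_year - 69) // 4) * 86400)

# Cross-check against the stdlib for the ctime example quoted earlier,
# Wed Jun 30 21:49:08 1993 (UTC): tm_yday 180, tm_year 93.
assert seconds_since_epoch(8, 49, 21, 180, 93) == \
    calendar.timegm((1993, 6, 30, 21, 49, 8, 0, 0, 0))
```

Both sides agree because neither counts leap seconds; the formula only diverges from the calendar in 2100, past the 32-bit horizon.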

Sometime later, when POSIX was embroiled in a big fight over signals
and job control, our sub-committee:

1) Requested that the POSIX epoch be changed from GMT-based to a
   "1970 Jan 1, 00:00:00 UTC" Epoch.

   We argued that UTC was an international standard and that
   using UTC would therefore be required if POSIX were to
   become an ISO-level standard.

   The draft standard was changed to use UTC in place of GMT.
   A comment was put into the standard that "GMT and
   UTC were effectively the same thing" (not MY choice
   of words ... but at least POSIX now said UTC!).

   It took a few drafts to change all of the GMT's to UTC's.
   Some GMT's came back by force of habit.

   And we did have to explain a few times why "Coordinated
   Universal Time" was abbreviated UTC and not CUT.

Re: What problems do leap seconds *really* create?

2003-01-30 Thread Clive D.W. Feather
John Cowan said:
>> Fact 2 is that the old 1980s pre-POSIX Unix manuals talked about GMT and
>> not UTC. This strongly suggests that the authors were unfamiliar with
>> both TAI and UTC. The "seconds" they refer to behave more like UT1
>> seconds than like TAI/SI seconds, i.e. they are Earth rotation angles
>> and not Caesium oscillations.
> Where do you see any reference in the old documentation to the rotation of
> the Earth?  The authors of those man pages were engineers, and they knew
> perfectly well what a second was and is (since 1967), and they certainly
> knew the difference between measuring/counting and encoding.

I was around, though on the margins, when the first POSIX standard was
being written. If there had been an awareness of the difference between UTC
and GMT, I am the sort of person who would have leaped on it in an attempt
to win a Weirdnix prize. I offer this as weak evidence in support of
Markus - nobody was discussing stuff at this level of detail.

--
Clive D.W. Feather  | Work:  <[EMAIL PROTECTED]>   | Tel:  +44 20 8371 1138
Internet Expert | Home:  <[EMAIL PROTECTED]>  | Fax:  +44 870 051 9937
Demon Internet  | WWW: http://www.davros.org | Mobile: +44 7973 377646
Thus plc||



Re: Leap seconds in the European 50.0 Hz power grid

2003-01-30 Thread John Cowan
Markus Kuhn scripsit:

> I doubt that this is really the case. UCPTE is happy if it can guarantee
> that the grid time remains within 20 seconds of UTC.

What are the long-term guarantees?

--
Even a refrigerator can conform to the XML  John Cowan
Infoset, as long as it has a door sticker   [EMAIL PROTECTED]
saying "No information items inside".   http://www.reutershealth.com
--Eve Maler http://www.ccil.org/~cowan



Re: Leap seconds in the European 50.0 Hz power grid

2003-01-30 Thread Markus Kuhn
Steve Allen wrote on 2003-01-30 20:58 UTC:
> On Thu 2003-01-30T12:54:09 +, Markus Kuhn hath writ:
> > The UCPTE specification says that the grid phase vectors have to rotate on
> > long-term average exactly 50 * 60 * 60 * 24 times per UTC day.
>
> Obviously the grid frequency shift after leap seconds is annoying, and
> it is undoubtedly one of the reasons contributing to the notion of
> stopping leap seconds.

I doubt that this is really the case. UCPTE is happy if it can guarantee
that the grid time remains within 20 seconds of UTC. Leap seconds are
only a relatively minor reason for the power grid clock to deviate from
UTC temporarily. Remember that in a national or continental distribution
grid, power is transferred whenever there are phase differences between
parts of the grid. So if demand rises in one area, it will fall behind
in phase relative to the others and thereby slowly pull the frequency
of the entire grid down until control loops detect this and compensate
for the deviation from the target frequency by pulling rods a few
centimeters out of nuclear reactors all across the continent. First you
keep the short-term frequency constant, then you keep the voltage
constant, then you keep the power transfers in line with the contracts,
and only after you have fulfilled all these targets, you use what
degrees of freedom are left in the control space to keep the grid clock
synchronized, i.e. the long-term frequency accurate.
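A toy simulation of that last control target (entirely my own sketch, not a UCPTE algorithm): the grid clock error is the integral of the relative frequency deviation, and a small trim on the target frequency slowly drives that integral back toward zero.

```python
NOMINAL = 50.0   # Hz

def step(grid_time_err, load_pull_hz, gain=1e-4, dt=1.0):
    """One control interval: load demand sags the frequency; the
    controller trims it slightly to shrink the accumulated clock error."""
    freq = NOMINAL + load_pull_hz - gain * grid_time_err * NOMINAL
    grid_time_err += (freq - NOMINAL) / NOMINAL * dt  # seconds gained/lost
    return grid_time_err

err = 0.0
for _ in range(3600):              # one hour of 1 s steps, constant sag
    err = step(err, load_pull_hz=-0.05)
assert -20 < err < 0               # stays inside the 20 s band noted above
```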

> But the question arises as to why the spec
> can't easily be changed to indicate that it is per TAI day.

As long as UTC is as it is currently, you don't want to do this:

Firstly, there are zillions of clocks that use the power grid as their
reference oscillator, and you want them to run locked roughly to UTC,
because they are supposed to display local civilian time and not
something linked to TAI.

Secondly, in Europe, exact UTC-based civilian time was available for a
long time via LF transmitters such as DCF77, MSF, HBG, etc., not to
forget BBC-style beeps before news broadcasts and telephone speaking
clocks. TAI on the other hand has only relatively recently become
reasonably easily available automatically through GPS and NTP extensions
and would otherwise have to be manually looked up from tables. So TAI
was just far less practical, and in addition simply unknown to most
engineers.

My point was that leap seconds are not a problem in the power grid and
for power-grid controlled clocks.

About power-grid controlled clocks:

Around 1990, West Berlin was temporarily connected to what was then the
East European grid into which East Germany was integrated, which did not
provide a grid time that was kept long-term aligned with UTC. Customers
in West Berlin started to complain that their clocks suddenly needed to
be adjusted regularly. If the average frequency for a week was only
49.95 Hz, your alarm clock would go 10 minutes late by the end of the
week, which is definitely noticeable, especially if the same clock
before never needed any adjustment between power outages. The problem
persisted until East Germany (and now also its neighbors) was integrated
into the UCPTE.

> My power company cannot supply me with a reliability of 0.9997, so I can
> never see leap seconds from my household clocks.  I don't really
> believe that other power companies achieve it either

Unfortunately, I can't confirm that my supplier here in Cambridge can
either. However, in the urban centers of Bavaria where I grew up, power
outages were certainly far less frequent than leap seconds. Of the few
we ever had there, most outages were announced a week in advance by mail
because of local network work. I am being told that the North American
power grid does not have a particularly good reputation among
Continental power distribution engineers, so you probably shouldn't
assume that its reliability represents a high standard in international
comparison. (E.g., even solar wind has been known to drive transformers
in the US/CA grid into catastrophic saturation and bring the entire grid
to a collapse, something that UCPTE regulations have prevented by
requiring the installation of capacitors that eliminate continental DC
loops).

> So what is the value obtained by a specification like this?

Grid-powered clocks that in practice do not have to be adjusted, for
example. Note that these were around long before DCF77 and GPS receivers
became low-cost items. Even though embedded DCF77 receivers/antennas now
cost less than 15 euros and GPS receivers less than ~50-100 euros, they
still can't beat, cost-wise, a few 10 Mohm resistors for a voltage
divider straight from the 230 volt line to a spare input pin of a
clock microcontroller.

Plus remember the remarks above that UTC was for a long time far more
easily available than TAI in Europe. Only *very* recent power plants
have GPS receivers in the control system and could therefore use TAI as
a reference in theory, if they wanted. (My brother happens to set up on

Re: Subtracting two UTC/time_t timestamps under POSIX

2003-01-30 Thread Markus Kuhn
John Cowan wrote on 2003-01-30 20:36 UTC:
> > As far as I remember in the code, and in practice,
> > time_t was just a cute way to encode a date;
> > nothing more.
>
> You didn't routinely subtract two time_t values to get elapsed time in seconds?

For that purpose, having a leap-second-free second count is not the only
requirement. The far more important requirement is that you use a clock
that can't be adjusted by the sys-admins or by time daemons, and that is
available from startup, even before the system learns about UTC over
some external channel.

For this reason, POSIX offers today two separate clocks:

  CLOCK_REALTIME   is supposed to give the system's best available
   approximation of UTC (that's what you want to
   convert to hh:mm:ss timestamps and local time)

  CLOCK_MONOTONIC  is supposed to progress with the system's best
   available approximation of the SI second (that's
   what you want to use for measuring time intervals)

If you know TAI from boot-time, you can feed that into CLOCK_MONOTONIC,
but it can equivalently also just be the number of seconds since
power-up. If you have a reference clock, you can frequency control
CLOCK_MONOTONIC, but you must not phase-control it.

Literature:

  http://www.opengroup.org/onlinepubs/007904975/basedefs/time.h.html

Markus

--
Markus Kuhn, Computer Lab, Univ of Cambridge, GB
http://www.cl.cam.ac.uk/~mgk25/ | __oo_O..O_oo__



Re: Leap seconds in the European 50.0 Hz power grid

2003-01-30 Thread Steve Allen
On Thu 2003-01-30T12:54:09 +, Markus Kuhn hath writ:
> VERDIN phase tracking is perhaps a somewhat pathological case.

True, but I know of someone who built a household clock to use it, and
for someone living in a Navy base town during the early years of the
Reagan era that seemed like a prudent harbinger.

>The UCPTE
> specification says that the grid phase vectors have to rotate on
> long-term average exactly 50 * 60 * 60 * 24 times per UTC day.

Obviously the grid frequency shift after leap seconds is annoying, and
it is undoubtedly one of the reasons contributing to the notion of
stopping leap seconds.  But the question arises as to why the spec
can't easily be changed to indicate that it is per TAI day.  My power
company cannot supply me with a reliability of 0.9997, so I can
never see leap seconds from my household clocks.  I don't really
believe that other power companies achieve it either, so what is the
value obtained by a specification like this?

My power reliability is more like 0.999, and various folks in my
region recently experienced outages lasting from hours to weeks.  My
recent outage was 8 hours.  In order for a household device with
battery backup and internal clock to keep phase with the grid while it
was offline it would have needed an oscillator which would drift only
1 second in 20 days.

Are there any battery-backed devices, let alone household ones, with
internal clocks of this caliber which rely solely on power grid phase
locks to keep SI time?

--
Steve Allen  UCO/Lick Observatory   Santa Cruz, CA 95064
[EMAIL PROTECTED]  Voice: +1 831 459 3046 http://www.ucolick.org/~sla
PGP: 1024/E46978C5   F6 78 D1 10 62 94 8F 2E49 89 0E FE 26 B4 14 93



Re: Telescope pointing and UTC

2003-01-30 Thread Markus Kuhn
Steve Allen wrote on 2003-01-30 20:17 UTC:
> The specifications for the automatic telescope call for an object to
> appear within 10 arcsec of the field center after a slew.  This is
> congruent with what the telescope engineers can do with the flexure
> and hysteresis, but it obviously requires UT1 good to about 0.66 s for
> targets on the equator.  Therefore we do need DUT1, but not to more
> accuracy than it is provided.  Higher cost telescopes may be able to
> demand tighter specifications.

In addition, if you have a readily aligned telescope, DUT1 to 100 ms
should be more than exact enough to locate a bright guide star. Then let
the system make a quick CCD exposure of that and derive DUT1 with the
needed precision by looking at the coordinates of the brightest peak on
this image.

Even amateur equipment with a CCD tracker does all that today fully
automatically, including figuring out the telescope's alignment:

  http://www.meade.com/catalog/lx/8_10_lx200gps.html

In the various surveys among professional observatories that have been
reported here, have the manufacturers of microprocessor-controlled
amateur telescopes (which today typically come with integrated GPS
receivers) been asked what |UT1-UTC| > 0.9 s would mean for the many
thousands of systems that they have already sold?

Markus

--
Markus Kuhn, Computer Lab, Univ of Cambridge, GB
http://www.cl.cam.ac.uk/~mgk25/ | __oo_O..O_oo__



Re: What problems do leap seconds *really* create?

2003-01-30 Thread John Cowan
Tom Van Baak scripsit:

> I worked on System III UNIX at Bell Labs and the
> only support for time zones was the TZ environment
> variable - which only supported signed integer hour
> offsets.

Be glad, then, that the early adopters of Unix in Ozland were on the coasts.

> After working on a mainframe that displayed
> time only in fractional millihours,

Gyukhhh.

> how cool it was that UNIX used a double-register
> (32-bit) integer to encode the time of day, had a
> TZ variable, and displayed time as HH:MM:SS.
> And all this at 2400 baud instead of 300. It was
> very clever.

Much nicer than my first system, the DEC PDP-8, whose operating system
didn't even know the time, only the date.  The kernel stored the date
as a 5-bit day, 4-bit month, and 5-bit year, but the file system only
kept the 3 low order bits of the year, in order to make the whole thing
fit in a 12-bit word.  The epoch was 1970-01-01, by chance the same as Unix's.
Of course, 2001-12-31 was the End Of All Things.

> As far as I remember in the code, and in practice,
> time_t was just a cute way to encode a date;
> nothing more.

You didn't routinely subtract two time_t values to get elapsed time in seconds?

--
You escaped them by the will-death  John Cowan
and the Way of the Black Wheel. [EMAIL PROTECTED]
I could not.  --Great-Souled Samhttp://www.ccil.org/~cowan



Re: Telescope pointing and UTC

2003-01-30 Thread Steve Allen
On Thu 2003-01-30T02:36:24 -0800, Ken Pizzini hath writ:
>   Even if you
> assume that UTC will exist on past the life of the system, don't you
> expect that someday better DUT1 estimates will be available than the
> 0.1s signals available in WWV, and that future applications might
> find these better estimates to be useful?

The specifications for the automatic telescope call for an object to
appear within 10 arcsec of the field center after a slew.  This is
congruent with what the telescope engineers can do with the flexure
and hysteresis, but it obviously requires UT1 good to about 0.66 s for
targets on the equator.  Therefore we do need DUT1, but not to more
accuracy than it is provided.  Higher cost telescopes may be able to
demand tighter specifications.

> And this once again brings me back to my perennial question: wouldn't
> it be more useful to the astronomical community for time broadcasts to
> include an approximation for sidereal-time-at-the-prime-meridian than
> an approximation of UT1?

It is already being done.  The GPS constellation and the orientation
of the earth that it uses is expressed in an inertial reference frame.
By decree of the IAU, 30 days ago astronomers abolished the
three-millennium tradition of the Vernal Equinox in favor of an
inertial reference frame for all future astrometry.

--
Steve Allen  UCO/Lick Observatory   Santa Cruz, CA 95064
[EMAIL PROTECTED]  Voice: +1 831 459 3046 http://www.ucolick.org/~sla
PGP: 1024/E46978C5   F6 78 D1 10 62 94 8F 2E49 89 0E FE 26 B4 14 93



Re: What problems do leap seconds *really* create?

2003-01-30 Thread Tom Van Baak
> Fact 1 is that there is a *lot* of Unix code out there that depends on
> 1 day being represented by a 86400 increment in time_t, and that is what

This is correct. And add DOS, Windows, and NT.

> Fact 2 is that the old 1980s pre-POSIX Unix manuals talked about GMT and
> not UTC. This strongly suggests that the authors were unfamiliar with
> both TAI and UTC. The "seconds" they refer to behave more like UT1

This is correct.

> and not Caesium oscillations. There are in the long run as many UT1
> seconds as there are UTC-non-leap seconds. The authors of Unix also
> assumed that local civilian time zones and GMT differ only by an integer
> number of minutes (see struct timezone in man gettimeofday(2)). Today,
> most local civilian times and UTC have exactly that connection.

Your comment about the word GMT vs. UTC is
correct. I would add further that struct timezone
and gettimeofday are much later BSD additions
and were not part of any early versions of Bell
Labs UNIX. (You can tell because only the Berkeley
guys used a compiler that supported long names;
we were limited to 8 characters).

I worked on System III UNIX at Bell Labs and the
only support for time zones was the TZ environment
variable - which only supported signed integer hour
offsets.

This was all we needed. I worked on GCOS prior
to that and it ran on local time, even if you were
a dial-up timeshare user. UNIX was really cool
because each dial-up user could set his own time
zone. After working on a mainframe that displayed
time only in fractional millihours, I can't tell you
how cool it was that UNIX used a double-register
(32-bit) integer to encode the time of day, had a
TZ variable, and displayed time as HH:MM:SS.
And all this at 2400 baud instead of 300. It was
very clever.

As far as I remember in the code, and in practice,
time_t was just a cute way to encode a date;
nothing more. There was no more thought about
leap seconds on our PDP 11/34 than on my wrist
watch. After all, we usually turned the computers
off every night when we left work. The thought of
a computer, back then, being a continuous count
of sacred SI seconds is ridiculous. That's what
WWV was for. When the computer time was too
wrong, you just set it right. Remember also that
none of these computers had battery clocks so
you had to reset the clock every time you booted.
I remember how cool the first DOS PC I saw was
because it kept time across a boot. It was very
clever.

> Therefore we can conclude very convincingly that UTC non-leap seconds
> represent the best modern interpretation of what the authors of Unix had
> in mind. And POSIX.1-2001 wisely does specify now exactly that.
>
> I rest my case.
>
> Are there any more interesting problems with leap seconds than
> misinterpretations of old Unix manuals?
>
> Markus

So, I think I'm agreeing with you, Markus.

The point is, anyone that thinks 1970's UNIX was
some kind of sophisticated operating system with
a modern timescale algorithm just wasn't there.

/tvb
http://www.LeapSecond.com



Epsilon proposal vs integer UTC-TAI

2003-01-30 Thread Ed Davies
Mark Calabretta wrote:

> I note that, as yet, we have not heard a reasoned argument against my
> proposal for UTC to measure the true length of day in SI seconds.

Personally, I really like this proposal and wish that it had
been adopted instead of using leap seconds.  The main advantages
seem to me to be that the "odd" case is available for testing every
day and that the adjustment is tiny and can be ignored in even
more cases than leap seconds can be ignored.  The only problem
I can see with it is the one Markus notes: time signals would
glitch when used as frequency standards over UTC midnight.

Also, admitting that days aren't 86'400 seconds long generalises
to other planetary bodies better than some of the bonkers ideas
around for Mars - the Red Mars example being mentioned earlier
in this discussion.

But, would it be practical to change now?  For example, is it not
likely that there are many existing systems which assume UTC-TAI
is an integer?

If I read the GPS SPS Signal Specification correctly then GPS
can handle non-integer offsets but I bet a lot of other systems
can't.



Trivial leap second "problem" example

2003-01-30 Thread Ed Davies
Markus Kuhn wrote:
>...
> Who needs to know about a leap second more than half a year in advance
> but has no access to a time signal broadcasting service (the better ones
> of which all carry leap second announcement information today)?
>...

Here's an example of a real but trivial nuisance, rather than an actual
problem as such, caused by leap seconds.

A company which I do work for from time to time manufactures logging
devices used primarily in gliding but also in other aviation and
sport activities.  The device is based around a Hitachi HD6303
processor (like a Motorola 6800) with RAM, a real-time clock, pressure
transducer (for altitude measurement) and serial port used for upload
and to receive input from a "domestic" GPS receiver.

Somewhere around 2000 of the loggers have been sold.  It originally
was created purely as a barograph (pressure altitude recorder) in
late 1980s then extended to record position information from a GPS
in 1993.

Every so-many seconds (user settable) the device records the pressure
altitude and listens for position and other information in NMEA 0183
format from the GPS.  If it gets a position it records it with the
altitude.  The basic timing of the trace recording is done using the
logger's own clock.

In order to "calibrate" the logger's clock, every half hour it also
records the UTC date and time as reported by the GPS receiver.

Of course, the UTC date/time output by the GPS receiver is delayed
slightly, by nearly a second with modern receivers and by nearly
two seconds for older receivers like the Garmin GPS-100 (as
verified by comparing the NMEA output with the display of a clock
driven by a broadcast time standard).  But so is the position
information and for gliding competition purposes the exact time of
a position is more likely to be important than the exact time of an
altitude measurement.  Also, the pressure altitude is recorded
before the logger starts listening for a position NMEA sentence so
tends to be, on average, half a second earlier than transmission
time of the corresponding position and therefore pretty close to
synchronous with its validity time.

Later, when the trace is uploaded the differences between the GPS
derived UTC date/times and the logger's own date/times are calculated
and the average is taken across the whole trace.  This average is
then used to correct the times of all the samples to UTC.

If a leap second happened in the middle of the trace then its effect
would be averaged out in this process - not too much of a loss.

As a double check, I also put in some code to report a warning if
the differences between logger clock time and GPS reported UTC varied
too much throughout a trace.  The question arises: how much is "too
much".  Well, early GPS receivers (Garmin GPS-100) only output
position information every two seconds so we had to allow for the
possibility of some date/time samples "catching" a slightly earlier
or later beat from the GPS.  Also the clock of the logger might drift
slightly - much less than one second in the length of a flight.  This
implies a tolerance of three seconds.  But if a leap second happened
there'd be another second to consider, too.  Hmm, anything else?
Dunno, better make it five seconds then.

In other words, consideration of leap seconds makes us reduce the
sensitivity of a check for other potential problems (like the logger
clock speed being significantly out or somebody trying to fiddle the
trace in some way).

You might ask, why not take leap seconds into account in doing the
calculations after upload?  Well, that would be possible but this
information is not readily available on the scruffy old DOS PCs
typically floating about in gliding club club-houses.  We could ask
the users to enter the information but it seems like an awful lot
of hassle for tiny gain.

More significantly, I haven't tested the system when a leap second
happens.  I did record the output of my Garmin 100 over a leap
second a few years ago just to see what happens and could try playing
that back into a logger but it doesn't follow that the exact timing
of the output from the GPS would be right.

In case anybody's interested, the GPS was reporting even or odd
numbered seconds before midnight then a few seconds after midnight it
switched to reporting the opposite.

At least with the epsilon proposal the unusual case would be
available once a day for testing - as well as having a tiny enough
effect to ignore compared with the inaccuracies of most clocks in
widespread use.

Ed Davies.



Long live UT1

2003-01-30 Thread Ed Davies
Ken Pizzini wrote:

> Right, but in its way UT1 is "king" because that is the measure of
> earth-position time which is used in the definition of our current
> time standard, UTC.

UT1 might be "king" but TAI is the "parliament" and this is a
constitutional monarchy.  I guess that makes UTC the government,
looking to everybody like it's in charge but actually kicked
back and forth between the other two.



Re: What problems do leap seconds *really* create?

2003-01-30 Thread John Cowan
Markus Kuhn scripsit:

> Fact 1 is that there is a *lot* of Unix code out there that depends on
> 1 day being represented by a 86400 increment in time_t, and that is what
> POSIX.1:2001 now explicitly requires the system to provide.

In short, Posix has blessed what was always bad practice (viz. manipulating
time_t directly).

> Fact 2 is that the old 1980s pre-POSIX Unix manuals talked about GMT and
> not UTC. This strongly suggests that the authors were unfamiliar with
> both TAI and UTC. The "seconds" they refer to behave more like UT1
> seconds than like TAI/SI seconds, i.e. they are Earth rotation angles
> and not Caesium oscillations.

Where do you see any reference in the old documentation to the rotation of
the Earth?  The authors of those man pages were engineers, and they knew
perfectly well what a second was and is (since 1967), and they certainly
knew the difference between measuring/counting and encoding.

> There are in the long run as many UT1
> seconds as there are UTC-non-leap seconds. The authors of Unix also
> assumed that local civilian time zones and GMT differ only by an integer
> number of minutes (see struct timezone in man gettimeofday(2)).

The second argument to gettimeofday(2) is a Berserkeley hack that doesn't
and never did belong in the kernel, and doesn't even do anything in Linux,
Solaris, or modern BSDs.  It was introduced in 4.2BSD and already dead
in NET/2.  All other kernels, rightly, know nothing of time zones.

> I rest my case.

So do I: and the rest is silence.

--
John Cowan  [EMAIL PROTECTED]
http://www.ccil.org/~cowan  http://www.reutershealth.com
Thor Heyerdahl recounts his attempt to prove Rudyard Kipling's theory
that the mongoose first came to India on a raft from Polynesia.
--blurb for _Rikki-Kon-Tiki-Tavi_



Re: Unix notion of "Seconds since the Epoch"

2003-01-30 Thread John Cowan
Markus Kuhn scripsit:

> It also provides the formula that defines the encoding of UTC into that
> integer, leaving no doubt about the exact semantics.

I agree that this is clear, and I continue to believe that it is a regrettable
and unjustified change from existing older practice, not a mere clarification
of it.

--
"May the hair on your toes never fall out!" John Cowan
--Thorin Oakenshield (to Bilbo) [EMAIL PROTECTED]



Re: What problems do leap seconds *really* create?

2003-01-30 Thread Markus Kuhn
John,

Fact 1 is that there is a *lot* of Unix code out there that depends on
1 day being represented by a 86400 increment in time_t, and that is what
POSIX.1:2001 now explicitly requires the system to provide. This excludes
in practice the option to make time_t an encoding of TAI+10s.

http://www.cl.cam.ac.uk/~mgk25/volatile/posix-secs-since-epoch-2001.pdf

Fact 2 is that the old 1980s pre-POSIX Unix manuals talked about GMT and
not UTC. This strongly suggests that the authors were unfamiliar with
both TAI and UTC. The "seconds" they refer to behave more like UT1
seconds than like TAI/SI seconds, i.e. they are Earth rotation angles
and not Caesium oscillations. There are in the long run as many UT1
seconds as there are UTC-non-leap seconds. The authors of Unix also
assumed that local civilian time zones and GMT differ only by an integer
number of minutes (see struct timezone in man gettimeofday(2)). Today,
most local civilian times and UTC have exactly that connection.
Therefore we can conclude very convincingly that UTC non-leap seconds
represent the best modern interpretation of what the authors of Unix had
in mind. And POSIX.1-2001 wisely does specify now exactly that.

I rest my case.

Are there any more interesting problems with leap seconds than
misinterpretations of old Unix manuals?

Markus

--
Markus Kuhn, Computer Lab, Univ of Cambridge, GB
http://www.cl.cam.ac.uk/~mgk25/ | __oo_O..O_oo__



Re: Unix notion of "Seconds since the Epoch"

2003-01-30 Thread Markus Kuhn
John Cowan wrote on 2003-01-30 16:35 UTC:
> Markus Kuhn scripsit:
>
> > As I noted earlier, de facto and "de jure" (meaning POSIX.1:1996,
> > section 2.2.2.113), any real world Unix file system (and that's where
> > the term "seconds since the epoch" comes from in this context) uses a
> > timestamp that counts "non-leap seconds since the epoch".
> >
> > http://www.cl.cam.ac.uk/~mgk25/volatile/posix-2-2-2-113.pdf
>
> That page shows a definition of the term "seconds since the Epoch", viz.
> "[a] value to be interpreted as the number of seconds between a specified
> time and the Epoch."

It also provides the formula that defines the encoding of UTC into that
integer, leaving no doubt about the exact semantics.

The 1996 version of POSIX has now been superseded by the 2001 version,
and I have scanned in the equivalent page from IEEE Std 1003.1-2001 for
you on

  http://www.cl.cam.ac.uk/~mgk25/volatile/posix-secs-since-epoch-2001.pdf

This clarification was made after exactly the same discussion that we
have here took place within POSIX. The new definition now says (slightly
abbreviated)

  A value that approximates the number of seconds that have elapsed
  since the Epoch. A UTC name (specified in terms of seconds, minutes, etc.)
  is related to a time represented as seconds since the Epoch according to
  the expression below.

followed by the usual conversion formula, this time even including
the full Gregorian leap-year correction that was still missing in the
1996 version.
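
Written out, the POSIX.1-2001 expression is as below (tm_year counts years
since 1900, tm_yday days since January 1; note that no term anywhere
accounts for leap seconds):

```c
/* The POSIX.1-2001 "Seconds Since the Epoch" formula: an *encoding* of
 * a UTC broken-down time into an integer, not a physical count of
 * elapsed SI seconds. */
long long seconds_since_epoch(int tm_sec, int tm_min, int tm_hour,
                              int tm_yday, int tm_year)
{
    return tm_sec
         + tm_min  * 60LL
         + tm_hour * 3600LL
         + tm_yday * 86400LL
         + (tm_year - 70)  * 31536000LL
         + ((tm_year - 69)  / 4)   * 86400LL   /* 4-year leap days         */
         - ((tm_year - 1)   / 100) * 86400LL   /* minus century exceptions */
         + ((tm_year + 299) / 400) * 86400LL;  /* plus 400-year exceptions */
}
```

For example, 2000-01-01T00:00:00Z (tm_year = 100, tm_yday = 0) encodes to
946684800, in agreement with the usual published value.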

This reformulation of the standard should finally remove any doubt about
what exactly the term "seconds since the epoch" means under Unix and
makes clear that it is meant to encode a UTC clock value and not to
represent any exact number of physical SI seconds since the epoch.

There is additional wording that now explicitly leaves the exact
relationship between the actual time of day and the "seconds since the
epoch" value undefined, in order to allow implementors to smooth out the
behaviour near leap seconds as in UTS or in BSD's adjtime(2). This
should help convince people that the unfortunate
step-back-by-1-second implementation of leap seconds in the Linux NTP
kernel PLL is not required by POSIX.

Markus

--
Markus Kuhn, Computer Lab, Univ of Cambridge, GB
http://www.cl.cam.ac.uk/~mgk25/ | __oo_O..O_oo__



Re: What problems do leap seconds *really* create?

2003-01-30 Thread John Cowan
Markus Kuhn scripsit:

> > > They have always counted the non-leap seconds since 1970-01-01.
>
> > The Posix interpretation is only a few years old, and a break with Unix
> > history.  Before that, time_t ticked SI seconds since the epoch (i.e.
> > 1970-01-01:00:00:00 GMT = 1970-01-01:00:00:10 TAI).
>
> Sorry, you just make this up.

You know that that is not true.  (This is the lie direct in the
third degree, also known as the Countercheck Quarrelsome; see
http://www.bartleby.com/81/4209.html .  I would not go so far on this
list, had you not gone further already.)

> Unix machines ticked the seconds of their
> local oscillator from boot to shutdown. Local oscillator seconds differ
> from SI seconds by typically +/- 10^-5 s or worse. Unix time had
> multiple or fractional inserted and deleted leap seconds whenever the
> administrator brutally readjusted the local clock using the
> settimeofday(2) system call closer to UTC.

You are confusing "what you got" with "what you were supposed to get".  It's
true that timekeeping practices in those days were deplorable at both the
hardware and administrative levels.  However, the *intent* was to return
*the number of seconds since the Epoch*, as specified by the manual -- the
point of a man page entry is to document how the application or system call
is supposed to behave (except in the BUGS section).

The Posix interpretation was a *reinterpretation*.

> > The time(2) man
> > page in the Sixth Edition (unchanged in the Seventh) of Research
> > Unix says:
> >
> > .I Time
> > returns the time since 00:00:00 GMT, Jan. 1, 1970, measured
> > in seconds.
>
> Today we distinguish between "civilian" (UTC non-leap) seconds and
> physical (SI) seconds. The authors of that manual very obviously didn't
> make that distinction and you should not misrepresent them by claiming
> that they did.

I have every reason to think that when they spoke of seconds, they spoke
of SI seconds, and that when they spoke of measuring, they meant measuring,
not encoding.

> In practice, the source code shows that time_t values are converted to
> UTC clock displays without a leap second table, therefore they clearly
> are just meant to be an encoding of UTC clock display values and nothing
> else.

So now you think that implementation defines intent?  What is the use of
having standards at all, in that case?

--
Some people open all the Windows;   John Cowan
wise wives welcome the spring   [EMAIL PROTECTED]
by moving the Unix. http://www.reutershealth.com
  --ad for Unix Book Units (U.K.)   http://www.ccil.org/~cowan
(see http://cm.bell-labs.com/cm/cs/who/dmr/unix3image.gif)



Re: What problems do leap seconds *really* create?

2003-01-30 Thread John Cowan
Markus Kuhn scripsit:

> As I noted earlier, de facto and "de jure" (meaning POSIX.1:1996,
> section 2.2.2.113), any real world Unix file system (and that's where
> the term "seconds since the epoch" comes from in this context) uses a
> timestamp that counts "non-leap seconds since the epoch".
>
> http://www.cl.cam.ac.uk/~mgk25/volatile/posix-2-2-2-113.pdf

That page shows a definition of the term "seconds since the Epoch", viz.
"[a] value to be interpreted as the number of seconds between a specified
time and the Epoch."

Well, it may be "to be interpreted" as such a value, but it is *not* such
a value.  As of this writing, its value on my machine is 1043944376, but
that does not mean, or so it seems, that 1043944376 s have elapsed since
the Epoch.

--
De plichten van een docent zijn divers, John Cowan
die van het gehoor ook. [EMAIL PROTECTED]
  --Edsger Dijkstra http://www.ccil.org/~cowan



Re: What problems do leap seconds *really* create?

2003-01-30 Thread William Thompson
Ken Pizzini wrote:
>
> On Wed, Jan 29, 2003 at 07:04:23PM +, Markus Kuhn wrote:
> > Who really needs to maintain a full list of leap seconds and for what
> > application exactly?
>
> If a file storage system stores timestamps as "SI seconds since
> some epoch", and a legal question arises about whether a given
> document was stored on that system before or after midnight on
> some given date, the logic which converts the timestamp to a
> UTC date will need to know how many leap seconds to adjust for,
> which in the general case means that the logic converting these
> file-timestamps into UTC will need a full list of leap seconds.

In fact, that is exactly what we do in the software used to command the CDS
instrument aboard the SOHO spacecraft.  The user always works in UTC time, but
the times are stored internally as double precision floating point values
representing the number of TAI seconds since 1-Jan-1958.  The same procedure is
used in the catalog of observations taken.

William Thompson



Re: Calabretta's 86400 s + epsilon day proposal

2003-01-30 Thread Markus Kuhn
Mark Calabretta wrote on 2003-01-30 00:58 UTC:
> I note that, as yet, we have not heard a reasoned argument against my
> proposal for UTC to measure the true length of day in SI seconds.

If you make every UTC day 86400 + epsilon(date) seconds long, then life
gets more difficult for people who broadcast standard frequencies such
as 50.0 Hz TV sync signals, because now you can't simply say that
you start a new TV frame exactly at the start of a new second (with UTC
you can, even across a leap second!). Note that the vast majority of
critical frequencies we have are an integer number of hertz. It is very
convenient to adjust these by triggering an oscilloscope from the 1
pulse-per-second output of an atomic clock or time signal receiver and
then observe the phase wander of your oscillator. Such adjustments would
break down if UTC clocks started to emit one pulse per (something
that is an SI second most of the time).

The technical importance of pulse-per-second and integer-Hertz signals
was a strong reason for why we moved from smooth UTC to leap-second-UTC
in 1972, and your proposal would reverse that improvement. That's why I
don't like it.

Even so, a procedure would still have to be put into place to define the
function epsilon(date). How far in advance would IERS have to announce
this function?

If I understood you correctly, all you propose is to replace the full
leap second at the end of a day every few years by a couple of leap
microseconds at the end of every UTC day, right? That sounds
slightly more complicated to me and doesn't fundamentally eliminate the
problem that there still will exist UTC time stamps after 23:59:59.999...

You could alternatively stretch the length of the UTC second at the end
of each day, and this way your proposal would be similar to UTS, except
that you make the correction every day whereas I prefer to limit it to
the vicinity of each current UTC leap second. But pulse-per-second
signals would experience a phase jump at the end of every day ... :-(
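Stretching the last part of the day, as discussed above, can be sketched numerically. The 1000-second window and uniform stretch factor below are my own illustrative assumptions for the sketch, not the published UTS parameters:

```python
# Illustrative sketch of end-of-day leap-second smoothing ("smearing").
# Assumption: the last 1000 display seconds of a day containing an
# inserted leap second are uniformly stretched, so 1001 SI seconds map
# onto the final 1000 display seconds. This is NOT the exact UTS
# definition, just the general idea.

DAY = 86400      # display seconds in a nominal day
WINDOW = 1000    # smoothing window before midnight, in display seconds

def smoothed_day_seconds(si_elapsed):
    """Map SI seconds elapsed since 00:00 of a leap-second day
    (which is 86401 SI seconds long) to smoothed display seconds."""
    start = DAY - WINDOW              # smoothing begins at 23:43:20
    if si_elapsed <= start:
        return float(si_elapsed)      # clock runs at the SI rate
    # Inside the window the display second is stretched by 1001/1000,
    # i.e. a rate offset of 0.1%, well below the "less than a percent"
    # figure mentioned elsewhere in this thread.
    return start + (si_elapsed - start) * WINDOW / (WINDOW + 1)

print(smoothed_day_seconds(86401))    # 86400.0: midnight lands exactly
```

With a daily epsilon-style correction of only microseconds, the same construction would need a rate offset that is many orders of magnitude smaller, which is why the phase jump at the end of every day would also be correspondingly tiny.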

Markus

--
Markus Kuhn, Computer Lab, Univ of Cambridge, GB
http://www.cl.cam.ac.uk/~mgk25/ | __oo_O..O_oo__



Re: What problems do leap seconds *really* create?

2003-01-30 Thread Markus Kuhn
John Cowan wrote on 2003-01-30 13:01 UTC:
> Markus Kuhn scripsit:
>
> > Unix timestamps have always been meant to be an encoding of a
> > best-effort approximation of UTC.
>
> Unix is in fact older than UTC.

This is getting slightly off-topic, but Unix slowly evolved and was
reimplemented various times during the first half of the 1970s and the
early versions probably didn't have a clock. It didn't exist in practice
outside Bell Labs before 1976. Gory details are on:

  http://www.bell-labs.com/history/unix/firstport.html

> > They have always counted the non-leap seconds since 1970-01-01.

> The Posix interpretation is only a few years old, and a break with Unix
> history.  Before that, time_t ticked SI seconds since the epoch (i.e.
> 1970-01-01:00:00:00 GMT = 1970-01-01:00:00:10 TAI).

Sorry, you just made that up. Unix machines ticked the seconds of their
local oscillator from boot to shutdown. Local oscillator seconds differ
from SI seconds by typically +/- 10^-5 s or worse. Unix time had
multiple or fractional leap seconds inserted and deleted whenever the
administrator brutally readjusted the local clock closer to UTC using
the settimeofday(2) system call. Only much later in the 1980s
did the Berkeley Unix version add the adjtime(2) system call to allow
smooth manual adjustment towards UTC by changing the length of the Unix
second relative to the local oscillator second by IIRC up to 1%. The
entire question of the relation of Unix time to UTC and TAI only came up
at roughly the same time as POSIX in the late 1980s, when people started
to get interested in time synchronization over LANs and (in Europe)
DCF77 radio receivers.

> The time(2) man
> page in the Sixth Edition (unchanged in the Seventh) of Research
> Unix says:
>
> .I Time
> returns the time since 00:00:00 GMT, Jan. 1, 1970, measured
> in seconds.

Today we distinguish between "civilian" (UTC non-leap) seconds and
physical (SI) seconds. The authors of that manual very obviously didn't
make that distinction and you should not misrepresent them by claiming
that they did.

> IOW, it is a count of elapsed time since a certain moment, measured in
> SI seconds, and not an encoding of anything.

In practice, the source code shows that time_t values are converted to
UTC clock displays without a leap second table, therefore they clearly
are just meant to be an encoding of UTC clock display values and nothing
else. Implementations that do anything else are purely experimental, not
widely used, and can cause serious disruption in practice.

> Even today, you can install the ADO (and probably GNU) packages in
> either of two ways:  "posix", in which there are no leap seconds and
> time_t's get the POSIX interpretation you reference; and "right", in
> which there are leap seconds and time_t is a count of seconds.
> Try setting your TZ to "right/" and see what you get.

The so-called "right" mode in Olson's timezone library, which makes
time_t an encoding of TAI+10s instead of UTC, and Dan Bernstein's
libtai are both commonly regarded as experimental implementations and
not recommended for general use. I don't know anyone who uses TAI+10s on
Unix in practice, and it violates various standards. The reasons why
it shouldn't be used have been discussed in great detail on Olson's tz
mailing list. You have completely misunderstood Unix time if you think
that the Olson "right" configuration has anything to do with it.

Markus

--
Markus Kuhn, Computer Lab, Univ of Cambridge, GB
http://www.cl.cam.ac.uk/~mgk25/ | __oo_O..O_oo__



Re: What problems do leap seconds *really* create?

2003-01-30 Thread Markus Kuhn
Ken Pizzini wrote on 2003-01-30 06:06 UTC:
> On Wed, Jan 29, 2003 at 07:04:23PM +, Markus Kuhn wrote:
> > Who really needs to maintain a full list of leap seconds and for what
> > application exactly?
>
> If a file storage system stores timestamps as "SI seconds since
> some epoch",

As I noted earlier, de facto and "de jure" (meaning POSIX.1:1996,
section 2.2.2.113), any real world Unix file system (and that's where
the term "seconds since the epoch" comes from in this context) uses a
timestamp that counts "non-leap seconds since the epoch".

http://www.cl.cam.ac.uk/~mgk25/volatile/posix-2-2-2-113.pdf

Markus

--
Markus Kuhn, Computer Lab, Univ of Cambridge, GB
http://www.cl.cam.ac.uk/~mgk25/ | __oo_O..O_oo__



GLONASS leap second problems just a legend?

2003-01-30 Thread Markus Kuhn
Steve Allen wrote on 2003-01-29 22:19 UTC:
> Aside from GLONASS, those who wish to abolish leap seconds
> have not concreately identified the systems which don't like leaps.

Even for GLONASS, the alleged system-inherent leap-second problems seem
extremely badly documented, and the entire story is likely to be just an
urban legend.

All I could find are a few second-hand non-technical reports/rumours
suggesting rather that certain early GLONASS receivers might have had
software bugs that caused a loss of tracking for a few minutes after a
leap second. Well, such bugs can/should get fixed. Or is there clear
evidence (a knowledgeable document with all the technical details) that
GLONASS actually *has* to take the broadcasting satellites offline
merely to do the leap second adjustment? Where? Sounds very odd and
unbelievable to me.

In fact GLONASS operational bulletins such as

051-970619
NOTICE ADVISORY TO GLONASS USERS (NAGU) 051-970619
SUBJ:UTC LEAP SECOND CORRECTION 1.07/0300 MT
1.CONDITION: ACCORDING TO IERS BULLETIN C-13 FROM 30.06 TO 01.07 AT  UTC
 THERE WILL BE LEAP SECOND CORRECTION OF ALL UTC TRANSFER
 FACILITIES INCLUDING GLONASS.
 TIME MARK SIGNALS WILL BE TRANSMITTED AS FOLLOWS:
 30.06.97/23 H 59 MIN 59 S (UTC)
 30.06.97/23 H 59 MIN 60 S (UTC)
 01.07.97/00 H 00 MIN 00 S (UTC)
2.POC:CSIC RSF AT +7-095-333-81-33

052-970619
NOTICE ADVISORY TO GLONASS USERS (NAGU) 052-970619
SUBJ:FORECAST OUTAGE FOR ALL GLONASS SPACECRAFT 01.07/0259-02.07/0259
1.CONDITION: ALL GLONASS SPACECRAFT ARE SCHEDULED TO BE UNUSABLE
 SINCE  01.07/0259 UNTIL 02.07/0259 MT (UTC+0300)DUE TO
 PLANNED GLONASS TIME CORRECTION (NOTE: THIS IS NOT
 CONNECTED WITH LEAP SECOND CORRECTION)
2.USERS ARE REMINDED TO DO NOT PLAN ANY OBSERVATIONS FOR THIS PERIOD
3.POC:CSIC RSF AT +7-095-333-81-33

on

  http://www.rssi.ru/SFCSIC/1997.htm

even claim explicitly that the announced satellite down-time for the
entire day 1997-06-30 23:59Z to 1997-07-01 23:59Z was not connected with
a leap second correction, even though it admittedly coincides with it.
Perhaps they just need to take the entire system offline from time to
time, and sometimes let these servicing periods coincide with leap
seconds? In fact there was no such announced downtime at the next leap
second at the end of 1998!

Summary: The leap second problems of GLONASS sound to me very
much like an urban legend or a popular misunderstanding.

More on GLONASS:

  http://www.rssi.ru/SFCSIC/english.html

Markus

--
Markus Kuhn, Computer Lab, Univ of Cambridge, GB
http://www.cl.cam.ac.uk/~mgk25/ | __oo_O..O_oo__



Re: What problems do leap seconds *really* create?

2003-01-30 Thread John Cowan
Markus Kuhn scripsit:

> Unix timestamps have always been meant to be an encoding of a
> best-effort approximation of UTC.

Unix is in fact older than UTC.

> They have always counted the non-leap
> seconds since 1970-01-01.

The Posix interpretation is only a few years old, and a break with Unix
history.  Before that, time_t ticked SI seconds since the epoch (i.e.
1970-01-01:00:00:00 GMT = 1970-01-01:00:00:10 TAI).  The time(2) man
page in the Sixth Edition (unchanged in the Seventh) of Research
Unix says:

.I Time
returns the time since 00:00:00 GMT, Jan. 1, 1970, measured
in seconds.

IOW, it is a count of elapsed time since a certain moment, measured in
SI seconds, and not an encoding of anything.

Even today, you can install the ADO (and probably GNU) packages in
either of two ways:  "posix", in which there are no leap seconds and
time_t's get the POSIX interpretation you reference; and "right", in
which there are leap seconds and time_t is a count of seconds.
Try setting your TZ to "right/" and see what you get.

--
You escaped them by the will-death  John Cowan
and the Way of the Black Wheel. [EMAIL PROTECTED]
I could not.  --Great-Souled Samhttp://www.ccil.org/~cowan



Re: Leap seconds in the European 50.0 Hz power grid

2003-01-30 Thread Markus Kuhn
Steve Allen wrote on 2003-01-29 20:53 UTC:
> On Wed 2003-01-29T15:05:59 -0500, John Cowan hath writ:
> > I was a little too clipped.  If you know all the leap seconds, you can
> > convert a Unix-style timestamp to UTC reliably; if you further know all
> > the timezone changes, you can convert UTC to LCT reliably.
>
> I remain confused about why this "isolated system" cares whether it
> keeps time as UTC or TAI.  How does its time get set?  How does its
> time stay locked to SI seconds?
>
> Are you supposing that the system is able to keep SI seconds because
> it has some sort of unshielded PLL which is tracking the carrier
> signal from something like the US Navy's high powered VERDIN VLF
> transmissions for submarines?  (With their 50 baud message that
> basically says "We're still here so don't launch" and if your clock
> stops ticking, nothing really matters much anymore.)

VERDIN phase tracking is perhaps a somewhat pathological case.

Here is a more realistic source of standard frequency that can easily be
tracked and is in practice tracked in lots of low-cost consumer
electronics:

Most of the European continent (excluding Britain and some East European
countries) runs a 50.0 Hz continent-wide phase-locked power grid known
as the UCPTE grid (Union for the Coordination of Transmission of
Electricity - the organization responsible for the reliable operation of
the interconnected electricity network in Europe). The UCPTE
specification says that the grid phase vectors have to rotate on
long-term average exactly 50 * 60 * 60 * 24 times per UTC day. That is,
you get on average 50 * 86400 oscillations out of each power socket
in Europe every day, even if you consider days that end in a leap second
and are actually 86401 SI seconds long. Near an inserted leap second,
they actually reduce the power grid frequency in a coordinated way
for a few minutes by up to 50 mHz in order to make sure that all the
many clocks that use this 50 Hz standard frequency as their time
reference remain in sync with UTC. You can observe this nicely with an
oscilloscope if you have a stable reference signal to trigger it
independently. Power-grid coupled clocks will briefly go 0.1% slower all
over Europe to resync with UTC after a leap second. Note that the power
frequency sometimes deviates significantly from 50 Hz, but the PLL
controllers contain an integrator, which eliminates any long-term error
relative to UTC.
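The arithmetic behind the 50 mHz figure is easy to check (my own back-of-envelope calculation, not taken from the UCPTE documents):

```python
# How long must the grid run 50 mHz slow so that clocks counting grid
# cycles fall back by the one inserted leap second? Working in integer
# millihertz keeps the arithmetic exact.

NOMINAL_MHZ = 50_000     # nominal grid frequency in millihertz (50 Hz)
REDUCTION_MHZ = 50       # maximum coordinated reduction quoted above

# A grid-cycle-counting clock falls behind by REDUCTION_MHZ/1000 cycles
# each SI second; one display second corresponds to NOMINAL_MHZ/1000
# cycles, so losing a full second takes:
seconds_needed = NOMINAL_MHZ / REDUCTION_MHZ   # 1000.0 SI seconds

relative_rate = REDUCTION_MHZ / NOMINAL_MHZ    # 0.001, i.e. the 0.1%
print(seconds_needed, relative_rate)
```

So at the full 50 mHz offset the one-second resynchronization takes on the order of 1000 s, about a quarter of an hour; a shallower offset spread over a longer period works out the same way.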

References:

  - 
http://europa.eu.int/comm/energy/en/elec_single_market/florence9/position_paper/ucte/policy1.pdf
Section S.6.1.

  - http://www.verbund.at/at/apg/stromtransport/TOR%20E.pdf
Section 3.1.1 (5)

Another ubiquitously available high-quality reference frequency in
Germany are the national TV broadcast sync signals, which are derived
from caesium clocks in the basements of the broadcasting houses (at
least ZDF does this). Their TV signals are not frequency adjusted to
follow UTC, they stay at 50.00 Hz exactly. However this is not a
problem for consumer electronics, because the teletext data in the
vertical bank interval labels each TV frame with an ASCII encoded
hh:mm:ss timestamp that tightly follows UTC. In practice, TV sets with
radio controlled clocks simply evaluate the teletext time stamps when
the receiver is switched on and run freely when it is off. Same for
radio receivers that evaluate RDS time signals. They are not phase
locking clocks to the TV signal, and even if they did, they could
learn about the leap second from the teletext data (with a small delay,
as teletext lacks a leap second announcement).

Markus

--
Markus Kuhn, Computer Lab, Univ of Cambridge, GB
http://www.cl.cam.ac.uk/~mgk25/ | __oo_O..O_oo__



Re: What problems do leap seconds *really* create?

2003-01-30 Thread Markus Kuhn
John Cowan wrote on 2003-01-29 20:05 UTC:
> I was a little too clipped.  If you know all the leap seconds, you can
> convert a Unix-style timestamp to UTC reliably;

Sorry, that's not correct. It's merely a common misunderstanding of the
definition of POSIX timestamps. There exists already a perfectly simple
algorithmic leap-second-table-free mapping between Unix-style timestamps
and UTC, specified formally in

  ISO/IEC 9945-1:1996, Section 2.2.2.113
  http://www.cl.cam.ac.uk/~mgk25/volatile/posix-2-2-2-113.pdf

Unix timestamps have always been meant to be an encoding of a
best-effort approximation of UTC. They have always counted the non-leap
seconds since 1970-01-01. The only minor problem is that the value
23:59:60 cannot be represented uniquely in the time_t encoding, but that
is in practice elegantly circumvented by changing the length of the Unix
second near a UTC leap second by less than a percent (UTC smoothing,
something which I suggest should be standardized formally for Unix-style
timestamps to improve interoperability of timestamps near leap seconds).
The older POSIX.1:1996 interpretation above could be quoted as implying
that time_t has to jump back during a leap second because the formula
provided leads to the same numeric value for 23:59:60 and 00:00:00 the
next day (unfortunately, that is still what the Linux NTP kernel support
does today). The POSIX.1:2001 revision softened the definition in order
to include the option of UTC smoothing into what it allows, making it
possible to use a more graceful leap second representation in time_t,
such as for example my UTS proposal.
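For concreteness, the leap-second-table-free formula from POSIX.1:1996 section 2.2.2.113 can be transcribed directly (a sketch; this edition of the formula ignores the century leap-year rule, so it is valid only for 1970-2099):

```python
# "Seconds Since the Epoch" per POSIX.1:1996, section 2.2.2.113.
# tm_year counts years since 1900, tm_yday days since January 1.
# Leap seconds are ignored by construction, so no table is needed.

def posix_seconds_since_epoch(tm_sec, tm_min, tm_hour, tm_yday, tm_year):
    return (tm_sec
            + tm_min * 60
            + tm_hour * 3600
            + tm_yday * 86400
            + (tm_year - 70) * 31536000
            + ((tm_year - 69) // 4) * 86400)

print(posix_seconds_since_epoch(0, 0, 0, 0, 70))    # 0 (the epoch itself)
print(posix_seconds_since_epoch(0, 0, 0, 0, 100))   # 946684800 (2000-01-01)
```

The non-uniqueness described above falls straight out of the formula: plugging in tm_sec = 60 for a leap second 23:59:60 gives the same value as 00:00:00 of the next day, which is exactly the ambiguity that UTC smoothing papers over.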

Markus

--
Markus Kuhn, Computer Lab, Univ of Cambridge, GB
http://www.cl.cam.ac.uk/~mgk25/ | __oo_O..O_oo__



Re: What problems do leap seconds *really* create?

2003-01-30 Thread Steve Allen
On Thu 2003-01-30T00:28:57 -0800, Ken Pizzini hath writ:
> Right, but in its way UT1 is "king" because that is the measure of
> earth-position time which is used in the definition of our current
> time standard, UTC.

I would go so far as to argue that UT1 is not time, but angle.
UT1 does not measure what we now understand to be time; it measures
the spin of a nonrigid body whose atmosphere, oceans, mantle, core,
sun, and moon are all doing strange things that impede uniformity.
Unfortunately we have a history which very strongly confuses the
distinction, not least because we still communicate UT1 with
units that make it look like time.  For technical communications it
might encourage better thinking if the switch were made to
use a vocabulary which always gives UT1 in degrees.
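The vocabulary switch suggested here is a fixed unit conversion; as a trivial sketch (mean relation only, ignoring the finer conventions such as the Earth Rotation Angle definition):

```python
# Express a UT1 reading as a rotation angle instead of h:m:s.
# The mean relation is 360 degrees per 24 h of UT1, i.e. 15 deg/hour,
# 15 arcmin per minute of time, 15 arcsec per second of time.

def ut1_to_degrees(hours, minutes=0, seconds=0.0):
    return 15.0 * (hours + minutes / 60.0 + seconds / 3600.0)

print(ut1_to_degrees(12))          # 180.0
print(ut1_to_degrees(0, 0, 1.0))   # one second of time = 1/240 degree
```

That last line is why a leap second is often described as a 15-arcsecond correction in Earth orientation.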

>Some appear
> to feel that this history is important to preserve in our civil time
> standard ("UT1 rules!"  "UTC ain't broke"); others appear to feel
> that it is irrelevant ("Just use TAI, dammit!").

Do not confuse things.  I do not think that there is anyone on this
list, or anywhere, who would disagree with the current use and
definitions of UT1 and TAI.  Only civil time and UTC are at issue.

--
Steve Allen  UCO/Lick Observatory   Santa Cruz, CA 95064
[EMAIL PROTECTED]  Voice: +1 831 459 3046 http://www.ucolick.org/~sla
PGP: 1024/E46978C5   F6 78 D1 10 62 94 8F 2E49 89 0E FE 26 B4 14 93



Re: What problems do leap seconds *really* create?

2003-01-30 Thread Ken Pizzini
On Wed, Jan 29, 2003 at 03:43:02PM -0700, Rob Seaman wrote:
> Ken Pizzini says:
> > I realize that the astronomical community has evolved to a consensus
> > that UT1 (approximated by UTC) is a highly useful way to mark time,
>
> Rather we've evolved a consensus that different problems require
> different systems of time - not surprising, since we invented most
> of them.

Right, but in its way UT1 is "king" because that is the measure of
earth-position time which is used in the definition of our current
time standard, UTC.

> > with the additional feature that it is usable as a civil time standard,
>
> It isn't just usable - it is preferable to many alternatives.

That is what I was saying: of the useful earth-position based ways
to mark time, UT1 was felt to be among the best, if not the best,
choice when considerations of civil time were thrown into the mix.
Especially if one is convinced that the prime meridian shall remain
special (even if the land masses currently under it might happen to
suffer from tectonic drift over time).


> > but there is so much of that evolution which is based on historical
> > accident rather than purely technical requirements
>
> "Historical accident" makes it sound like the practice of timekeeping
> was some afterthought to events.

You mean to tell me that our systems of timekeeping were inspired
solely by dispassionate rational thought about what good designs for
timekeeping would be, without any historical baggage?  That our current
practice of breaking a day into 24 hours is inspired by something
other than the history of how sub-day measurement was handled by
pre-mechanical-clock cultures, or that the further splitting of hours
into 60 "minute" (read that as "my-newt") parts and that still further
into 60 secondary parts has nothing to do with the vagaries of history?

Maybe it is just that my use of the term "historical accident" is not
a broadly used idiom: by it I mean that, especially in the context
of some variety of evolution, it is easy to think up turns in the
course of history, unrelated to any intrinsic aspect of an entity
under consideration, which would give that entity a significantly
different character than it currently has.  For example, the fact
that UTC's meridian passes through London, rather than through, say,
Paris or Washington, has to do with the politics of how GMT came
to be a global time-and-longitude standard, and not because of any
technically intrinsic merit of that meridian over the others.


>   The reality is that timekeeping has
> often been central to other, bloodier, battles - from Augustus Caesar
> appropriating an extra day from February into "his" month

If that is not a "historical accident" I don't know what is...
Our calendar, with its system of 31,28,31,30,31,30,31,31,30,31,30,31 day
months, and leap year day being added to the second month of that list,
has a full and rich history.  Some of that history relates to attempts
to improve the calendar dramatically (Julius Caesar's reforms come to
mind), some relates to purely political tinkering (such as Augustus'
grab), and some relates to trying to make incremental improvements
without major reworking (such as the reform of leap year rules
introduced under pope Gregory).

The reformationists of the French Revolution attempted to make a
"more rational" calendar as part of their overall scheme which has
given us the metric system, but (for various reasons) that aspect of
their new way of measuring failed.  One lesson that has been suggested
from that is that a good design for a timekeeping standard _must_
consider historical and cultural factors.  That is no doubt why
everyone I've seen speak out in this forum tries to stick with units
of SI seconds, with different varieties of "days", and bandies about
86400 as a magic number, rather than going off on quixotic suggestions
of things like decimal clocks or reform of how our calendar marks the
relation between days and years.  (I'm sure that there are several
of us who would _like_ to see such dramatic reforms, but they are
both well outside any reasonable interpretation of what this list's
charter is and so far out of the mainstream of our culture that such
reforms are bound to fall flat.)

>  to Harrison's
> chronometer that was instrumental to the building of a later empire.

Ah, and this ties in to one of the most contentious points that I've
seen discussed on this list: historically, in the days before crystal
oscillators and atomic clocks, good determination of longitude has
been inextricably tied to the good determination of time.  Some appear
to feel that this history is important to preserve in our civil time
standard ("UT1 rules!"  "UTC ain't broke"); others appear to feel
that it is irrelevant ("Just use TAI, dammit!").


> Is there nobody on this list who was present at the birth of UTC?

Alas, I was not.  (Well, in the sense of "was born" I "was present",
but not in the sense of "participated in or observed the process,
or was awar