Re: interoperability

2006-01-09 Thread Clive D.W. Feather
Rob Seaman said:
 The question of delivering wall
 clock time is a trivial elaboration on first delivering common
 international business time.  (I'm trying on different terminology
 than "civil time" until I hit one that sticks.)

I don't accept that the concept exists. The international business
community still works - as far as I can tell - on the equivalent of "Hello
Fred, what time is it there?".

 The event of migrating a time
 zone is a discontinuity just as with a leap second or leap hour.

So what? We go through such discontinuities twice a year in most years.
Some places more often. Read the TZ list archives.

 What matters is not when sunrise occurs, but rather that every day
 has one (and only one).

Something which isn't and hasn't been true in many places.

What time is sunrise in Tromso today?

What time was sunrise on 1994-12-31 in eastern Kiribati?

 If Denmark or Elbonia decides to use a timezone which is offset
 from stage one by 1h3m21s, then it still works,
 Again, what is it, precisely?

Life.

 (but people travelling abroad will probably vote differently in the
 next election)
 Exactly.  The pressures to maintain a common international vision of
 time will trump local variations.

That's not pressure to maintain a common international vision, but people
not wanting to fiddle with the minutes and seconds on their digital
watches.

 In a couple of hundred years, the Danish Parliament (or its
 successor in interest) will simply decide that from YYYY-MM-DD HH:00,
 the Danish Civil time will use offsets -3h and -2h (instead of
 presently -1h/-2h) and the transition will happen on the switch
 from summertime to wintertime by _not_ adjusting the clock.
 The only way this differs from the leap hour proposal is that you are
 assuming that different localities can (or would) carry these
 adjustments out separately.

 Let's see - how does this work?

Just the way that it does right now.

 Under the current standard, 3600 small steps
 would have bled away the pressure.  Under the ITU notion, a leap hour
 would be needed.  A leap hour means moving UTC backward one hour (to
 let TAI pull ahead).  As I've said before, under the daylight saving
 analogy this is only naively a fall back event, it would be better
 to explicitly add a 25th hour.  But let's continue through to the
 logical conclusion of implementing this via fall back events (or
 the equivalent time zone shifting).

Except that time zone shifting means you don't affect the UTC sequence.

 A fall back event means that the clock (local, standard,
 international, whatever clock you want) first traverses an hour - and
 then traverses it again.

Right.

At present, there's a meridian corresponding to UTC that starts at
Greenwich, drifts back and forth with secular changes in the earth's
rotation and, when it approaches Cutty Sark or the Dome suddenly jumps. The
proposal is simply to have this jump abolished, so that the UTC meridian
starts drifting around the earth.

 Um.  How does one redefine the length of the day without changing the
 length of the second?  Answer:  by changing the number of seconds in
 the day.  I won't belabor the difficulty of selling the idea of
 having different hours of different lengths.

You mean just like now?

--
Clive D.W. Feather  | Work:  [EMAIL PROTECTED]   | Tel:+44 20 8495 6138
Internet Expert | Home:  [EMAIL PROTECTED]  | Fax:+44 870 051 9937
Demon Internet  | WWW: http://www.davros.org | Mobile: +44 7973 377646
Thus plc||


Re: interoperability

2006-01-09 Thread Steve Allen
On Mon 2006-01-09T08:20:40 +0100, Poul-Henning Kamp hath writ:
 beginning (SI seconds are constant length).

Yes, SI seconds are constant length, but the ghost of my general
relativity teacher prompts me to assert that my SI seconds are not
equal to your SI seconds because we are in different reference
frames.

The rate at which TAI ticks has been modified several times to meet
improved notions of whose SI second it should really try to match.
The current notion is that of a coordinate time scale at a depth in
the geopotential field which is close to mean sea level.

Such a coordinate time scale cannot be extended very far from the
surface of the earth without requiring some fascinating corrections
to the rates measured by different observers.

Tom Van Baak can show you how measuring this is now child's play.

Why should my lab use TAI when the proper time experienced by my
real-time control processes runs at a different, and continually
varying, rate?

The answer is the same as for UT defined by Newcomb's expression used
from 1901 through 1983 and implemented via astronomy: it is the most
practical uniform time scale that we all can agree upon.

For current purposes with stationary clocks the varying terms in the
rate differences are immeasurable.  In the limit of very precise lab
timekeeping eventually the question arises as to whether TAI really is
the most suitable scale for some applications.

(This has nothing to do with leap seconds, but does raise the question
of the limits at which it becomes much more difficult to agree on time.)

--
Steve Allen [EMAIL PROTECTED]WGS-84 (GPS)
UCO/Lick ObservatoryNatural Sciences II, Room 165Lat  +36.99858
University of CaliforniaVoice: +1 831 459 3046   Lng -122.06014
Santa Cruz, CA 95064http://www.ucolick.org/~sla/ Hgt +250 m


Re: interoperability

2006-01-09 Thread Clive D.W. Feather
Rob Seaman said:
 I have heard no response to my discussion of techniques for achieving
 synchronization - of the difference between naive fall back hours
 and 25 hour days.  But how in practice is it envisaged that a scheme
 for migrating time zones versus TAI would work, precisely?

In the short term, by modifying the UTC-LTC function by adding a secular
term to the periodic one. Thus at present the function in the UK is:

dayofyear in [Last Sunday in Mar .. Last Sunday in Oct] ? 3600 : 0.

This would change to:

(dayofyear in [Last Sunday in Mar .. Last Sunday in Oct] ? 3600 : 0) +
(year < 2600 ? 0 : year < 3100 ? 3600 : year < 3500 ? 7200 : ...)

or whatever. Note that we already have similar levels of complexity in
dealing with the changing summer time dates, the British Standard Time
folly, BDST during the war, and so on.

Note also that the Olson tz code handles all of this just fine.

 Note, for
 instance, that nothing short of redefining the second can avoid the
 quadratic acceleration between the stage one and stage two clocks.
 Time zones (and the prime meridian?) would race more-and-more rapidly
 around the globe.

At some point, probably around the time that we're seeing an hourly shift
every year, people are going to have to divorce second from day, or at
least re-negotiate the terms of engagement.

--
Clive D.W. Feather  | Work:  [EMAIL PROTECTED]   | Tel:+44 20 8495 6138
Internet Expert | Home:  [EMAIL PROTECTED]  | Fax:+44 870 051 9937
Demon Internet  | WWW: http://www.davros.org | Mobile: +44 7973 377646
Thus plc||


Re: interoperability

2006-01-09 Thread Poul-Henning Kamp
In message [EMAIL PROTECTED], Steve Allen writes:
On Mon 2006-01-09T08:20:40 +0100, Poul-Henning Kamp hath writ:
 beginning (SI seconds are constant length).

Yes, SI seconds are constant length, but the ghost of my general
relativity teacher prompts me to assert that my SI seconds are not
equal to your SI seconds because we are in different reference
frames.

(This has nothing to do with leap seconds, [...]

You are absolutely right there, so why even bring it up?

--
Poul-Henning Kamp   | UNIX since Zilog Zeus 3.20
[EMAIL PROTECTED] | TCP/IP since RFC 956
FreeBSD committer   | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.


Re: interoperability

2006-01-09 Thread Rob Seaman

On Jan 9, 2006, at 12:06 AM, Poul-Henning Kamp wrote:


You yourself defined stage one as TAI with some constant offset;
you can't change the definition in the middle of the discussion.


I was attempting to describe your position.  In point of fact, I
agree with Tom Van Baak:


You cannot divide timekeeping, time dissemination, into neat stages.


What we can do, however, is layer our standards upon a coherent
vision of the requirements placed on timekeeping by the wide range of
activities engaged in by humanity.  Talking to other humans aside
from the 114 members of this list might be a good first step.


I've never been in favour of the leap-hour proposal as other than a
political instrument to be abandoned well before the clock strikes.


Just wanted to re-emphasize your position.  Considering that you and
I,  the polar extremes of this issue, both reject the notion of leap
hours, perhaps we can find something else to talk about?


Not adjusting the clock is less disruptive than doing so, no matter
which half of the year.


Won't repeat my arguments a third time.


They have 600 years to find a solution and an implementation date
for it.


Who is this "they" you're talking about?  We're discussing changing a
standard that will be in effect now, then, and all times in between.
Our descendants won't be appreciably smarter than we are, and they
won't have access to insights regarding fundamental public
timekeeping issues that we don't have.  If we cannot posit a solution
more creative than "forget the whole thing" (which I continue to
assert is not an option, outside of extremely dark post-apocalyptic
science fiction tales), then neither will they.

Rob


Re: interoperability

2006-01-09 Thread Rob Seaman

On Jan 9, 2006, at 12:23 AM, John Cowan wrote:


This is like the "day is light and night is dark" statement: there
is, at any given location, one and only one sunrise per (solar)
day, no matter what clocks say.


Communication prospers when people's clear meaning is not subjugated
to petty grammarians.

We are now - and have been - discussing timekeeping changes that call
into question the definition of a day.  Those of us who support
solar time are fundamentally asserting the primacy of the
standard day over the standard second (for civil timekeeping
purposes).  Those of us who consider solar time to be a curious
anachronism assert the SI second over the concept of a day (for
civil timekeeping purposes).


As I've pointed out before, future times in legal documents are
defined as LCT for a particular place, since the future mapping
between LCT and any other time scale is not known.


At the risk of igniting a new round of stage two nonsense, consider
the implications of your statement.  Currently LCT (as you appear to
mean it) is standard time.  Daylight saving (under whatever name) is
merely an overlay on standard time.  Standard time has no jumps
(except for leap seconds).

Under your suggestion, LCT would include the jumps for daylight
saving time (if locally used) as well as the jumps to correct for the
cumulative effect of tidal slowing.  As I hope I have established,
these are fall back discontinuities that would result in the same
hour of LCT occurring twice.  Is this not perceived to be a problem?

Rob


Re: interoperability

2006-01-09 Thread Rob Seaman

On Jan 9, 2006, at 1:01 AM, Clive D.W. Feather wrote:


We go through such discontinuities twice a year in most years.


Only the uninteresting daylight saving jumps.  UTC remains without
discontinuities above the level of a leap second.  If UTC weren't
equivalent to what I call civil time, the ITU wouldn't be making a
fuss to change it.


Except that time zone shifting means you don't affect the UTC
sequence.


Only because you would redefine UTC to be equivalent to TAI.


The proposal is simply to have this jump abolished, so that the UTC
meridian starts drifting around the earth.


Glad to see somebody admit that this is one of the issues in play.
Perhaps we might now bring the cartographic community inside the
firewall and clue them into what is being proposed?  Note again that
the implications of this are not somehow to be embargoed for 600
years, but rather would apply immediately and at all times between.

Rob


Re: interoperability

2006-01-09 Thread Rob Seaman

On Jan 9, 2006, at 1:22 AM, Clive D.W. Feather wrote:


At some point, probably around the time that we're seeing an hourly
shift every year, people are going to have to divorce second from
day, or at least re-negotiate the terms of engagement.


By what magic do we believe the issues involved will become more
tractable at some point in the future?

How precisely does one divorce the definition of the day from that of
the second?  What is a clock if not a device to slice days into
seconds?  The fundamental problem is that the second is defined
against one underlying concept of time and the day against another.
As such, there are only three options:

   1) redefine the day
   2) redefine the second
   3) occasionally reset the clock

The only one of these that doesn't beg for a truly vast amount of use
case and requirements analysis is #3, the status quo.  I suspect most
of us would be happy to pursue the research needed by either or both
of the first two options.  How much more interesting than letting our
pasty complected cave dwelling descendants have all the fun!

Rob


Re: interoperability

2006-01-09 Thread John Cowan
Rob Seaman scripsit:

 This is like the "day is light and night is dark" statement: there
 is, at any given location, one and only one sunrise per (solar)
 day, no matter what clocks say.

 Communication prospers when people's clear meaning is not subjugated
 to petty grammarians.

My point was that your rhetorical flourishes have run away with you
on more than one occasion.

 We are now - and have been - discussing timekeeping changes that call
 into question the definition of a day.  Those of us who support
 solar time are fundamentally asserting the primacy of the
 standard day over the standard second (for civil timekeeping
 purposes).  Those of us who consider solar time to be a curious
 anachronism assert the SI second over the concept of a day (for
 civil timekeeping purposes).

I agree with this assessment, more or less.

 As I've pointed out before, future times in legal documents are
 defined as LCT for a particular place, since the future mapping
 between LCT and any other time scale is not known.

 At the risk of igniting a new round of stage two nonsense, consider
 the implications of your statement.  Currently LCT (as you appear to
 mean it) is standard time.  Daylight saving (under whatever name) is
 merely an overlay on standard time.  Standard time has no jumps
 (except for leap seconds).

 Under your suggestion, LCT would include the jumps for daylight
 saving time (if locally used) as well as the jumps to correct for the
 cumulative effect of tidal slowing.  As I hope I have established,
 these are fall back discontinuities that would result in the same
 hour of LCT occurring twice.  Is this not perceived to be a problem?

Perhaps the problem here *is* merely semantic.  By LCT I mean "legal
time", the time that de jure or de facto is observed in any given place
(New York time in New York, Podunk time in Podunk, and Squeedunk time
in Squeedunk).  That includes all periodic or secular changes.
And although periodic changes are far more common, secular changes
for reasons of public convenience are *far* from unknown.

I will try to say "legal time" from now on, though there are parts of
the world (Antarctica, the oceans) where there is no legal time
strictly speaking, and de facto time rules.

It *is* a problem that some instants of (TAI/UTC) time have the same
LCT labels in certain time zones.  But it's a problem that we already
deal with once a year.  TV stations, for example, normally broadcast
the same program twice in a row on Leapback Sunday, at least in the U.S.

--
John Cowan  [EMAIL PROTECTED]  www.ccil.org/~cowan  www.reutershealth.com
The penguin geeks is happy / As under the waves they lark
The closed-source geeks ain't happy / They sad cause they in the dark
But geeks in the dark is lucky / They in for a worser treat
One day when the Borg go belly-up / Guess who wind up on the street.


interoperability

2006-01-08 Thread Rob Seaman

On Jan 8, 2006, at 9:09 AM, Poul-Henning Kamp wrote:


Doing so would once and for all have to divorce earth orientation
from that unified time scale, leaving it to governments to align
civil time with daylight as they see fit (just like today).


Without further debating the meaning of civil time, consider the
implications of this two stage system.  The first stage conveys TAI
or something related to it by a constant offset.  The second stage at
any location (correct me if I misunderstand you) would be a secondary
clock disseminated at the direction of the local authorities.
Governments and technical users would subscribe to the first stage
clock.  Businesses and civilians would subscribe to the second stage
clock(s).  Correct so far?

For the sake of argument, let's discount the risks associated with
confusing one stage's clock with the other.  One imagines, however,
that there won't be fewer safety critical, time dependent systems in
the future.  We might, in fact, suspect that every party to this
conversation would both admit this and use it to argue for their own
position :-)

Those risks, however, represent only one issue falling under the
umbrella of interoperability.  It is one thing to say that any random
local government can choose their own clock statutes.  This is
certainly true, but in practice the future international community
will work together to reach joint decisions on evolving common clock
practices (as you say, just like today).

I won't belabor the many worldwide systems that must interoperate for
the benefit of all.  But these systems must interoperate not only
between themselves, but with natural phenomena.  Forgive me (or
don't), but I am skeptical that phenomena of interest in the future
will not continue to include the rising and setting of the sun.  (And
isn't claiming otherwise equivalent to saying that stage two is
unnecessary?)

The question is:  how precisely does this differ from the situation
now or in the past?  Whether by fiat or not, some common worldwide
stage two clock must exist.  And some mechanism must exist for
synchronizing (to some level of tolerance that we can continue to
debate) that clock to diurnal cycles.  It is this synchronization
that is ultimately of interest to us, not leap seconds, per se.

I have heard no response to my discussion of techniques for achieving
synchronization - of the difference between naive fall back hours
and 25 hour days.  But how in practice is it envisaged that a scheme
for migrating time zones versus TAI would work, precisely?  Note, for
instance, that nothing short of redefining the second can avoid the
quadratic acceleration between the stage one and stage two clocks.
Time zones (and the prime meridian?) would race more-and-more rapidly
around the globe.

Perhaps I've misunderstood, but this line of reasoning doesn't appear
to resolve anything.

Rob Seaman
National Optical Astronomy Observatory


Re: interoperability

2006-01-08 Thread Poul-Henning Kamp
In message [EMAIL PROTECTED], Rob Seaman writes:
On Jan 8, 2006, at 9:09 AM, Poul-Henning Kamp wrote:

 Doing so would once and for all have to divorce earth orientation
 from that unified time scale, leaving it to governments to align
 civil time with daylight as they see fit (just like today).

Without further debating the meaning of civil time, consider the
implications of this two stage system.  The first stage conveys TAI
or something related to it by a constant offset.

Yes, too bad about the offsets (GPS etc.), but as long as they don't change
at short notice, they can be dealt with.

The second stage at
any location (correct me if I misunderstand you) would be a secondary
clock disseminated at the direction of the local authorities.

Yes, just like now.

The DCF77 transmitter, for instance, sends out German legal time,
which means that if you want UTC from it, you need to know the UTC
offset for summer/winter in Germany.

Governments and technical users would subscribe to the first stage
clock.  Businesses and civilians would subscribe to the second stage
clock(s).  Correct so far?

Almost.

What you overlook here is that computers tend to transcend governmental
boundaries.

Sensibly designed operating systems keep time in the form of the
first-stage clock and, at the representation layer, apply the
appropriate conversions, drawing on all the world's governmental
decisions about getting from the first stage to the second.

Badly designed operating systems keep time in local time which makes
interchange of information a nightmare across timezones.

Windows has got it right now, I believe, but it used to be that a
file created and transmitted from Denmark at the end of the business
day would be older than a file created at the start of the business day
in California, despite a strict ordering of the events.

For the sake of argument, let's discount the risks associated with
confusing one stage's clock with the other.

That's actually the good thing about the constant offset: it should
make it much easier to see if timestamps mix things that shouldn't be mixed.

I won't belabor the many worldwide systems that must interoperate for
the benefit of all.  But these systems must interoperate not only
between themselves, but with natural phenomena.

Sure, and you can timestamp them on either timescale, because there
is a 1-to-1 translation between the two timescales [1].

You mention sunrise and sunset.

Since the introduction of timezones, one of the things which was
given up was the concept that sunrise/sunset happened at the same
numerical time at any given latitude.

Denmark spans only a few hundred kilometers from east to west (not
counting Greenland this time), yet sunrise and sunset vary by about
30 minutes from one side to the other.

Most people get the sunrise/sunset numbers out of the Almanac from
the Copenhagen University Observatory [2], which lists sunrise/sunset
times for the observatory in Copenhagen and prints a table of
approximate geographical adjustment factors.

So already today, sunrise and sunset can only be determined using
auxiliary tables of correction factors, tables which could trivially
absorb the DUT correction in addition to the longitude corrections.

The question is:  how precisely does this differ from the situation
now or in the past?  Whether by fiat or not, some common worldwide
stage two clock must exist.

BZZZT wrong.

The definition we started out with is:

The second stage at any location (correct me if I misunderstand
you) would be a secondary clock disseminated at the direction
of the local authorities.

Conversion from stage two to stage one (and back) is perfect, so
if I measure a supernova in Denmark on Danish Civil Time, I can
mail you my observations and you can convert it first to stage 1
and then to your local stage 2 to compare with your own observation.
Or more likely, convert your own stage 2 to stage 1 and compare
in the scientific time domain.

If Denmark or Elbonia decides to use a timezone which is offset from
stage one by 1h3m21s, then it still works, (but people travelling
abroad will probably vote differently in the next election)

I have heard no response to my discussion of techniques for achieving
synchronization - of the difference between naive fall back hours
and 25 hour days.  But how in practice is it envisaged that a scheme
for migrating time zones versus TAI would work, precisely?

The same way all changes in timezone seem to be carried out: by
_not_ adjusting the clock when going to summer or winter time.

In a couple of hundred years, the Danish Parliament (or its successor
in interest) will simply decide that from YYYY-MM-DD HH:00, Danish
civil time will use offsets -3h and -2h (instead of presently
-1h/-2h), and the transition will happen on the switch from summertime
to wintertime by _not_ adjusting the clock.

That's been done many times throughout the world already.

If you look in NPL's description of the Rugby timegrams:

Re: interoperability

2006-01-08 Thread Tom Van Baak
 Without further debating the meaning of civil time, consider the
 implications of this two stage system.  The first stage conveys TAI
 or something related to it by a constant offset.  The second stage at
 any location (correct me if I misunderstand you) would be a secondary
 clock disseminated at the direction of the local authorities.
 Governments and technical users would subscribe to the first stage
 clock.  Businesses and civilians would subscribe to the second stage
 clock(s).  Correct so far?

I think this was a fair description of the timekeeping
world in the 1960s or even 1970s.

But in the last 10 or 20 years, with the explosion in
consumers of broadcast time and frequency services
such as WWVB, DCF, GPS, and NTP, vast numbers
of applications have direct access to the first stage,
which effectively removes the power of the middlemen:
the government, the local authorities.

What was your human technical user of 1970 is now
the infrastructure of the cellular phone system or the
hardcoded algorithms of an operating system or home
appliance. The technical users of the 60s have coded
themselves into products of the 90s.

You cannot divide timekeeping, time dissemination,
into neat stages. In the 1960s if ten labs were told
to offset their phase or frequency it affected only a
handful of people or systems. Today when IERS
announces a leap second, millions of machines,
systems, and people are affected. Thankfully, most
of them handle it OK.

/tvb


Re: interoperability

2006-01-08 Thread Rob Seaman

On Jan 8, 2006, at 4:04 PM, Tom Van Baak wrote:


You cannot divide timekeeping, time dissemination, into neat stages.


Again.  My point is strengthened.  This being the case, a requirement
on one flavor of time transfers to others.  We will not solve the
problem of creeping complexity and interface violations by attempting
to legislate the physical world out of the equation.  Rather, it is
the common baseline of mean solar time that will save us from our own
follies.  Whether it is a real number or not, it has the benefit of
correspondence (now, and day after day, millennium after millennium) of
mattering to humanity.  I don't say it matters in critical detail for
every purpose under the sun; rather, it matters in broad strokes for
many a purpose.

I've got nothing against TAI and other flavors of interval time; they
simply do not match the requirements for a common human-oriented
baseline.  They are preferred for some technical purposes.  They are
most definitely not preferred in broad strokes over long periods of
time to the bulk of our customers.

The customer is always right.

Rob


Re: interoperability

2006-01-08 Thread Daniel R. Tobias
On 8 Jan 2006 at 15:04, Tom Van Baak wrote:

 You cannot divide timekeeping, time dissemination,
 into neat stages. In the 1960s if ten labs were told
 to offset their phase or frequency it affected only a
 handful of people or systems. Today when IERS
 announces a leap second, millions of machines,
 systems, and people are affected. Thankfully, most
 of them handle it OK.

Although, even now, the majority of consumer and business equipment
is not directly affected in any noticeable way; such machines usually
run on a local clock considerably less accurate than an atomic clock,
periodically re-synced (perhaps manually, perhaps automatically) to
an external time standard.  At each such re-syncing, the time may
need to be adjusted by a few seconds, or even a few minutes, due to
inaccuracies in the local timepiece, so any leap second that may
have occurred since the last syncing will merely result in a 1-second
difference in the magnitude of this adjustment, not particularly
noticeable to the end users.  If some application (e.g., a database)
requires a timescale without discontinuities, the application might
need to be shut down for a few seconds to perform the time adjustment
(whether or not there is a leap second in the mix) in order to
prevent data corruption at the moment of the change.

--
== Dan ==
Dan's Mail Format Site: http://mailformat.dan.info/
Dan's Web Tips: http://webtips.dan.info/
Dan's Domain Site: http://domains.dan.info/


Re: interoperability

2006-01-08 Thread Tom Van Baak
  You cannot divide timekeeping, time dissemination,
  into neat stages. In the 1960s if ten labs were told
  to offset their phase or frequency it affected only a
  handful of people or systems. Today when IERS
  announces a leap second, millions of machines,
  systems, and people are affected. Thankfully, most
  of them handle it OK.

 Although, even now, the majority of consumer and business equipment
 is not directly affected in any noticeable way; such machines usually
 run on a local clock considerably less accurate than an atomic clock,
 periodically re-synced (perhaps manually, perhaps automatically) to
 an external time standard.  At each such re-syncing, the time may
 need to be adjusted by a few seconds, or even a few minutes, due to
 inaccuracies in the local timepiece, so any leap second that may
 have occurred since the last syncing will merely result in a 1-second
 difference in the magnitude of this adjustment, not particularly

Correct. This works for timepieces which are not
accurate to the second. And I believe this is why UTC
and leap seconds are still today the most practical and
accepted way to reconcile the unavoidable difference
between astronomical and atomic timescales.

The danger, though, is that in the 60s maybe ten systems
were affected by leap seconds. In the 80s maybe a
thousand. Today, the number of systems affected (or is
it infected?) with leap second awareness is in the millions.

I worry about this trend in the decades to come. I am
a fan of leap seconds as a weird and curious nuisance
but am not sure I like the idea that eventually my car,
traffic lights, airlines, television, and my thermostat will
have to be reliably tied to the IERS in order to function
properly.

Don't forget the quartz wristwatch is only 40 years old.
What if cesium wristwatches show up 10 years from
now? What if some killer app 40 years hence requires
100 ms or 1 ms time accuracy? Do we still want UTC
leap seconds when it will infect ten billion devices?

This is not an argument for change right now. But no
matter how you look at it the current scheme does not
scale well into the future; either a technological future
(way too many devices affected by unscheduled time
steps) or an astronomical future (way too many leap
seconds a year).

 noticeable to the end users.  If some application (e.g., a database)
 requires a timescale without discontinuities, the application might
 need to be shut down for a few seconds to perform the time adjustment
 (whether or not there is a leap second in the mix) in order to
 prevent data corruption at the moment of the change.

I would guess your total shutdown solution gets less
popular as time goes on. That's one reason why CDMA
cell phones, most operating systems, and GPS use a
TAI-like continuous timescale instead of UTC for their
underlying timescale.

/tvb


Re: interoperability

2006-01-08 Thread Rob Seaman

On Jan 8, 2006, at 6:41 PM, Tom Van Baak wrote:


am not sure I like the idea that eventually my car, traffic lights,
airlines, television, and my thermostat will have to be reliably
tied to the IERS in order to function properly.


This is a general issue with the increasingly tight coupling between
any number of networked systems.  Certainly not unique to
timekeeping.  On the other hand, rarely should any system (certainly
not a safety critical application) depend on a clock that must remain
tightly slaved to *any* external signal.  Consider what evolutionary
path technology would have to take such that significant numbers of
traffic lights and thermostats would ever require direct contact with
the IERS.  It's easy to speculate about pathological engineering
practices - A requires access to B, immediately and always, perfect
and inviolable.  But real designs result from real requirements.


What if some killer app 40 years hence requires 100 ms or 1 ms time
accuracy?


Don't think accuracy is the word you want.  The short answer to
your question is that all the time wonks would celebrate their
newfound employability.


Do we still want UTC leap seconds when it will infect ten billion
devices?


Perhaps you meant affect?  :-)

What we want and what we need are two different things.  And as is
currently true, those devices aren't required to use UTC unless they
need UTC.


the current scheme does not scale well into the future;


No, it does not - but there are two caveats:

1) the current scheme (also known as an international standard) is
good for several hundred years

2) no other scheme scales better


either a technological future (way too many devices affected by
unscheduled time steps) or an astronomical future (way too many
leap seconds a year).


As I pointed out close to five years ago, the ultimate long term
remediation will likely involve redefining the length of the second:

   http://iraf.noao.edu/~seaman/leap

Nothing over the years of intervening discussions has given me cause
to change my opinion.

Rob


Re: interoperability

2006-01-08 Thread John Cowan
Rob Seaman scripsit:

 The question is:  how precisely does this differ from the situation
 now or in the past?  Whether by fiat or not, some common worldwide
 stage two clock must exist.

Again, no, it doesn't need to exist.

We need a uniform time scale like TAI.  And we need local civil time
for all the 400-odd jurisdictions in the world today.  If other people
need other timescales (and they do), there's no reason that should
affect the two requirements above.

 But how in practice is it envisaged that a scheme
 for migrating time zones versus TAI would work, precisely?

Straightforwardly.  Each locality decides when and how to adjust both
its offset from TAI and its seasonal transition function (if any),
just as it does today.  What we abandon is a universal time tightly
synchronized to Earth rotation in favor of a universal time
independent of earth rotation plus 400+ local civil times roughly
synchronized to Earth rotation containing various glitches.

--
We pledge allegiance to the penguin John Cowan
and to the intellectual property regime [EMAIL PROTECTED]
for which he stands, one world underhttp://www.ccil.org/~cowan
Linux, with free music and open source  http://www.reutershealth.com
software for all.   --Julian Dibbell on Brazil, edited


Re: interoperability

2006-01-08 Thread Poul-Henning Kamp
In message [EMAIL PROTECTED], Rob Seaman writes:

 Sensibly designed operating systems keep time in the form of the
 first stage clock,

Perhaps.  We have no examples of this.  Stage one would be TAI.  As
we have just been reminded, TAI is not ready for prime time.

Stop.

You yourself defined stage one as TAI with some constant offset;
you can't change the definition in the middle of the discussion.

Stage one is something like GPS time or UTC with no further leap
seconds.

Today stage one is UTC with leapseconds and all POSIX systems use
that but fake the leapseconds.

 Badly designed operating systems keep time in local time which
 makes interchange of information a nightmare across timezones.

You are arguing apples and oranges.  These operating systems, in
effect, use stage three clocks.

No, you are confused.

Perhaps I miss your meaning here, too.  The event of migrating a time
zone is a discontinuity just as with a leap second or leap hour.

The discontinuity is not in the stage one timescale, but only in the
governmental offset which defines stage two relative to stage one.
We already have two of those discontinuities a year in most places, and
people and computers can live with them.

People can live with them because they are big enough that you
don't forget about them (for long).  Computers can live with
them because they use the stage-one timescale for representation.

 Denmark spans only a few hundred kilometers from east to west (not
 counting Greenland this time), yet sunrise and sunset vary by about
 30 minutes from one side to the other.

This is true.  It is irrelevant to the underlying international
clock.  These are simply constant (if position dependent) offsets.
Big wup.  I think this issue is confusing the discussion.

No, they are actually very relevant, because they show that you can't
use a timescale as a vector component to locate extraterrestrial
objects without taking your longitude into account.

daylight saving time is irrelevant.

No.

What matters is not when sunrise occurs, but rather that every day
has one (and only one).

DST is very relevant, as it is a much more feasible mechanism
for holding the sun high in the sky on the civil timescale.

 Conversion from stage two to stage one (and back) is perfect,

Don't believe a detailed enough proposal is on the table to either
define the meaning of perfect in this context, or determine if the
notion being discussed meets the requirements for being so regarded.

I don't believe in science?

For any timestamp on the civil timescale at any spot on Earth, there
exists a mathematical formula which will convert that timestamp
to UTC and vice versa.

 If Denmark or Elbonia decides to use a timezone which is offset
 from stage one by 1h3m21s, then it still works,

Again, what is it, precisely?

Your own proposal.


stage one is atomic time (e.g., TAI)
stage two is international civil time (e.g., UTC)
stage three is local legal time (e.g., Mountain Standard Time)

No.

Now you try to change definitions in the middle of the discussion again.

In your first email you defined it thusly:

   Without further debating the meaning of civil time, consider the
   implications of this two stage system.  The first stage conveys TAI
   or something related to it by a constant offset.  The second stage at
   any location (correct me if I misunderstand you) would be a secondary
   clock disseminated at the direction of the local authorities.

In other words:

(stage_zero is TAI)
stage_one is TAI + constant
stage_two is stage_one + governmental adjustment.

 In a couple of hundred years, the Danish Parliament (or its
 successor in interest) will simply decide that from YYYY-MM-DD HH:00,
 the Danish Civil time will use offsets -3h and -2h (instead of
 presently -1h/-2h) and the transition will happen on the switch
 from summertime to wintertime by _not_ adjusting the clock.

The only way this differs from the leap hour proposal is that you are
assuming that different localities can (or would) carry these
adjustments out separately.

I've never been in favour of the leap-hour proposal as other than
a political instrument to be abandoned well before the clock strikes.

And yes, I think you are likely to see far more governments fiddle
with their respective civil time than scientists fiddle with UTC
over the next 500 years, so I'm comfortable leaving it to them.

A fall back event means that the clock (local, standard,
international, whatever clock you want) first traverses an hour - and
then traverses it again.

No, that's what happens every year when we switch from summer time to
winter time.

When the need arises to reset the civil timescale, that event does _not_
happen.

This doesn't work because we're on the
wrong side of the pendulum's arc.  The point being that you don't
need to *not* adjust the clock in the Autumn - you need to not adjust
the clock in the Spring.

Same argument:  Not adjusting the clock is 

Re: interoperability

2006-01-08 Thread John Cowan
Rob Seaman scripsit:

 Sure, and you can timestamp them on either timescale, because there
 is a 1 to 1 translation between the two timescales [1].

 Perhaps I miss your meaning here, too.  The event of migrating a time
 zone is a discontinuity just as with a leap second or leap hour.

Sure.  But discontinuities in LCTs are something we already know how to handle.

 This is true.  It is irrelevant to the underlying international
 clock.

PHK and I are denying any need for an international clock that tracks
Earth rotation.

 What matters is not when sunrise occurs, but rather that every day
 has one (and only one).

This is like the "day is light and night is dark" statement: there is,
at any given location, one and only one sunrise per (solar) day,
no matter what clocks say.

 Exactly.  The pressures to maintain a common international vision of
 time will trump local variations.  It is the resulting common
 international time clock that you won't let me refer to as "civil
 time".  All requirements placed on UTC flow backwards from here.  You
 can't just edit UTC (or GMT) out of the debate.

What common international vision of time?  There is no common international
LCT.

stage one is atomic time (e.g., TAI)
stage two is international civil time (e.g., UTC)
stage three is local legal time (e.g., Mountain Standard Time)

What we are looking for is to redefine stage three directly in terms of
stage one without regard to a factitious stage two.

 In a couple of hundred years, the Danish Parliament (or its
 successor in interest) will simply decide that from YYYY-MM-DD HH:00,
 the Danish Civil time will use offsets -3h and -2h (instead of
 presently -1h/-2h) and the transition will happen on the switch
 from summertime to wintertime by _not_ adjusting the clock.

 The only way this differs from the leap hour proposal is that you are
 assuming that different localities can (or would) carry these
 adjustments out separately.

Exactly!  That is what the principle of subsidiarity demands, and it is
a situation we already know how to handle.

 A fall back event means that the clock (local, standard,
 international, whatever clock you want) first traverses an hour - and
 then traverses it again.  Under the current three stage system it is
 only the most local stage three clocks that are affected.  You are,
 in effect, promoting this discontinuity to stage two - to the
 worldwide business timescale.  More to the point, you have said that
 stage one can be mapped back-and-forth to stage two.  But we've just
 shown that this is no longer a one-to-one mapping since the hour is
 traversed twice, corresponding to two hours of TAI duration.

You've redefined stage two in the course of this discussion.  Before it
meant LCT, now it means UTC.  But be that as it may.

Since we (PHK and I) are in favor of abolishing stage two, we are not
promoting the discontinuity from stage three to stage two.  Rather, we are
interested in allowing the various local authorities to introduce changes
into their stage three clocks *as they decide* to deal with any perceived
problems.

The true leap hour folks, if any, are actually doing what you say we are
doing: creating a large discontinuity in stage two.  The fake leap hour
folks, if any, are actually doing what we want, but are cynically saying
there will be a leap hour in stage two while not expecting such a thing
to ever happen.

 Ah!  But you've suggested that the other half of the annual daylight
 saving pendulum be used.  This doesn't work because we're on the
 wrong side of the pendulum's arc.  The point being that you don't
 need to *not* adjust the clock in the Autumn - you need to not adjust
 the clock in the Spring.  It is the springtime gap in the mapping
 (also not a very desirable feature for a time scale) that is omitted
 during one of these events - not the harvest-time doubly traversed hour.

Fair enough.

 (We'll omit discussion of the fact that not all localities observe
 daylight saving time in the first place.)

By all means.  (This is the rhetorical figure of *praeteritio*.)

 This is the same point I was trying to make about the 25 hour day.
 No historian or lawyer is going to look favorably on a situation that
 results in ambiguous timestamps.  Perhaps, you say, such timestamps
 should all be kept in TAI.  But in that case, we are back to the
 original question of why a stage two clock is needed at all.  By
 asserting stage two is needed, all the rest logically follows.

And we assert that stage two is *not* needed.  In any case, most of the
world's population deals with ambiguous timestamps every year.

As I've pointed out before, future times in legal documents are defined
as LCT for a particular place, since the future mapping between LCT and any
other time scale is not known.  This turns out not to be a big problem,
except for the makers of calendar programs.

--
John Cowan  http://www.ccil.org/~cowan  [EMAIL PROTECTED]
Be yourself.  

Re: interoperability

2006-01-08 Thread John Cowan
Poul-Henning Kamp scripsit:

 Windows has got it right now, I believe, but it used to be that a
 file created and transmitted from Denmark at the end of the business
 day would be older than a file created at the start of the business day
 in California, despite a strict ordering of the events.

It's still true in the sense that the hardware clock is assumed to run
in LCT on Windows, and therefore discovering UTC depends on a correctly
set TZ variable.  It's false in the sense that Windows now supports TZ
correctly.

 Sure, and you can timestamp them on either timescale, because there
 is a 1 to 1 translation between the two timescales [1].

I think it's confusing to call it 1-to-1, except in the sense that
LCT seconds are the same length as UTC/TAI seconds.  There are many
LCT timestamps that correspond to more than one UTC timestamp.
This can be kludged around by adding a bit (the tm_isdst field in a struct tm)
to say whether an LCT timestamp is the first or the second instance.

 The scheme you propose is eminently workable, and more or less exactly
 what we advocate.  I'm happy that you now see the merits of it.

Nope, he still doesn't.

--
On the Semantic Web, it's too hard to prove John Cowan[EMAIL PROTECTED]
you're not a dog.  --Bill de hOra   http://www.ccil.org/~cowan


Re: interoperability

2006-01-08 Thread Poul-Henning Kamp
In message [EMAIL PROTECTED], Rob Seaman writes:


As I pointed out close to five years ago, the ultimate long term
remediation will likely involve redefining the length of the second:

Rob,

I think this shows how little you understand of the entire thing.

Several SI units are defined relative to the second these days, and
therefore everybody involved in metrology has had nothing but
contempt for the notion of changing the length of the second.

To cut this part of the topic out in cardboard for you:

1. The Earth's rotation and, to a lesser degree, its orbital motion
   are lousy timekeeping devices, many orders of magnitude worse
   than the best atomic frequency normals.

2. In metrology you use the best available method to implement a
   fundamental unit.


But there is something else which bugs me.

Throughout all of these interminable discussions it has become
clear to me that you argue backwards from the end ("there must
be a UTC with leapseconds") rather than forward from the
beginning ("SI seconds are constant length").

In our most recent little exchange, you started out proposing a two
(or three) timescale solution without leap seconds, and then when
I showed that it worked out just the way we wanted, you started
to redefine the timescales so that one of them had to be UTC with
leapseconds.

You also keep harping on how day and night will switch places
without leapseconds, while at the same time dismissing the
governmentally defined local timezones as irrelevant, despite the
fact that they do the heavy lifting (four orders of magnitude more
than leapseconds) of holding the sun high in the sky at noon.

In other words, you are not arguing in good faith and are behaving
more like a religious zealot than anything else.

That is deeply unserious behaviour for a scientist, Rob.

Poul-Henning

--
Poul-Henning Kamp   | UNIX since Zilog Zeus 3.20
[EMAIL PROTECTED] | TCP/IP since RFC 956
FreeBSD committer   | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.


Re: interoperability

2006-01-08 Thread Rob Seaman

On Jan 9, 2006, at 12:03 AM, John Cowan wrote:


Each locality decides when and how to adjust both its offset from
TAI and its seasonal transition function (if any), just as it does
today.


Not just as today, see intervening messages.


What we abandon is a universal time tightly synchronized to Earth
rotation in favor of a universal time independent of earth rotation
plus 400+ local civil times


Perhaps some neutral party would like to officiate?  Three questions:

   1) Could this ever possibly work?  (Please point out where my
earlier dissection fails.)

   2) For the sake of argument, imagine it could work.  Would it be an
improvement?

   3) It suffers the same quadratic meltdown.  Why change?


Re: interoperability

2006-01-08 Thread M. Warner Losh
In message: [EMAIL PROTECTED]
John Cowan [EMAIL PROTECTED] writes:
:  But how in practice is it envisaged that a scheme
:  for migrating time zones versus TAI would work, precisely?
:
: Straightforwardly.  Each locality decides when and how to adjust both
: its offset from TAI and its seasonal transition function (if any),
: just as it does today.  What we abandon is a universal time tightly
: synchronized to Earth rotation in favor of a universal time
: independent of earth rotation plus 400+ local civil times roughly
: synchronized to Earth rotation containing various glitches.

No matter what we do with leapseconds, there are still all those time
zones.

The problem with stopping leap seconds altogether is that the legal
definitions of time, although quite varied, are all about the same as
UTC as it exists today.  They are close enough that most countries
have adopted UTC bureaucratically rather than legislatively.  The
official time for the US, as published by the folks at NIST, is UTC.
The US law says "mean solar time", as determined by the Secretary of
Commerce, who has delegated it to the Time and Frequency division of
NIST, who in turn use UTC.  NIST could easily use a different schedule
for leap second insertion (it could have inserted the leap second in
civilian time at the end of any day it wanted to and still maintained
the mean solar time legal requirement).  However, since UTC is a
recognized, international standard, the US went along and did its leap
second according to that standard.  This is an explicit choice that
someone, somewhere had to make, even though it is arguably the best
choice to make (wouldn't want to be the odd man out in civil time,
think of the impact on business).

The combination of UTC approximating the legal time in so many nations,
as well as the need for international consensus among lots of parties
with divergent views for any changes to the current system, is why
we'll likely not see significant changes any time soon.  The best we
can hope for is that something will be done to change the leap seconds'
unpredictable nature, given that we have good forecasting tools at our
disposal.

Warner


Re: interoperability

2006-01-08 Thread Peter Bunclark
On Sun, 8 Jan 2006, Tom Van Baak wrote:

 between astronomical and atomic timescales.

Could we rephrase that as "between geophysical and atomic timescales"?
Astronomers measure it and have to compensate for it, not cause it.

Reminds me bitterly of the widely reported loss of Mars Climate Orbiter
being due to a confusion of metric and *english* units, like it was our
fault.

Pete.