predicting leap seconds (was Re: [LEAPSECS] Where the responsibility lies)

2006-01-07 Thread Neal McBurnett
On Wed, Jan 04, 2006 at 07:36:17AM +0100, Poul-Henning Kamp wrote:
> In message <[EMAIL PROTECTED]>, Neal McBurnett writes:
> >On Tue, Jan 03, 2006 at 08:32:08PM +0100, Poul-Henning Kamp wrote:
> >> If we can increase the tolerance to 10sec, IERS can give us the
> >> leapseconds with 20 years notice and only the minority of computers
> >> that survive longer than that would need to update the factory
> >> installed table of leapseconds.
> >
> >Do you have any evidence for this assertion?
>
> It is an educated guess.
>
> The IERS have already indicated that they believe they could do prediction
> under the 0.9 second tolerance with a two or three year horizon.

The Torino Colloquium had some discussion of this.

 Proceedings of the Colloquium on the UTC Timescale held by
 ITU-R SRG 7A
 http://www.ien.it/luc/cesio/itu/ITU.shtml

 Prediction of Universal Time and LOD Variation
 D. Gambis and C. Bizouard, (IERS)
 http://www.ien.it/luc/cesio/itu/gambis.pdf

After a bunch of nice graphs (not all of which were easy to
interpret), I found the periodogram (essentially a discrete Fourier
transform of the input data) interesting.  The way I read it (expert advice
welcomed), the broad peaks at 26 years (0.6 ms/d) and 52 years (0.3
ms/d) suggest that the most common pattern is a gradual cycle a few
decades long of lengthening and shortening of the day, presumably
driven by movements in the earth's mantle and core.
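For anyone who wants to reproduce the idea, a periodogram is just the
squared magnitude of the DFT of the series at each frequency.  A
minimal sketch on synthetic data (standard library only; this is a
generic illustration, not the IERS data or their implementation):

```python
# Naive periodogram (squared-magnitude DFT) sketch, to illustrate the
# kind of analysis described above.  Synthetic data, standard library
# only -- not the IERS series or code.
import math

def periodogram(x):
    n = len(x)
    power = []
    for k in range(n // 2 + 1):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power.append((re * re + im * im) / n)
    return power

# A pure 26-sample cycle observed for 52 samples peaks at frequency
# index k = 52 / 26 = 2, analogous to the broad 26-year peak above.
data = [math.sin(2 * math.pi * t / 26) for t in range(52)]
p = periodogram(data)
peak = max(range(len(p)), key=p.__getitem__)
```

On real LOD data the peaks are broad rather than sharp lines, since
the underlying cycles are only quasi-periodic.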

Page 14 of the pdf has a table:

 Skill of the UT1 prediction statistics over 1963-2003

Horizon    Prediction accuracy in ms
3 years    308
2 years    163
1 year      68
180 days    36
90 days     21
30 days      7
10 days      3

Perhaps these are worst cases?  It would be nice to have confidence
intervals.
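A quick way to read that table (my own back-of-envelope check, not a
claim from the paper): the quoted accuracies grow roughly linearly
with the horizon, at about 0.2-0.3 ms per day of look-ahead.

```python
# Sanity check on the UT1 prediction-skill table above: divide each
# tabulated accuracy by its horizon in days.  The ratios all land in
# a narrow band, i.e. accuracy degrades roughly linearly with horizon.
horizons_days = [10, 30, 90, 180, 365, 730, 1095]
accuracy_ms = [3, 7, 21, 36, 68, 163, 308]
rates = [a / h for a, h in zip(accuracy_ms, horizons_days)]
# each rate falls roughly in the 0.2-0.3 ms/day range
```

Extrapolating such a rate far beyond the tabulated horizons would, of
course, need the confidence intervals I'm asking about.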

They presented these conclusions:

 Possibility to predict UT1 with a 1s accuracy at least over 4 years
 using a simple method: seasonal, bias and drift.

 New prediction methods are under investigation (Singular Spectrum
 Analysis, neural network,..)

 Possibility to use Core Angular Momentum prediction for decadal
 modeling

Steve Allen wrote:
> http://www.ucolick.org/~sla/leapsecs/McCarthy.html
>
> This deserves discussion and analysis and explanation.

I wrote Dennis McCarthy about that, and he said he'd look up the
details and get back to me next week.  But he did remind me of this,
which I remember seeing in data they published via ftp years ago:

> Regarding the accuracy of these long-term predictions, the IERS
> Rapid Service and Prediction Center located at the U. S. Naval
> Observatory does make predictions of Delta-T in the IERS Annual
> Report.  The algorithm for those predictions was determined
> empirically by testing a wide range of possibilities.  It is
> essentially an auto-regressive model using the past ten years of
> data.  The accuracy based on comparison of observations with what
> would have been predicted using that model is shown in the table
> below.  Note that the accuracy estimates are 1-sigma estimates and
> that excursions of 2-sigma (or more) may not be unexpected.
>
> Year in the future   Accuracy (1-sigma, seconds)
> 1                    0.04
> 2                    0.08
> 3                    0.3
> 4                    0.8
> 5                    1.
> 6                    2.
> 7                    2.
> 8                    3.
> 9                    4.
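McCarthy's note describes the USNO predictions as "essentially an
auto-regressive model using the past ten years of data".  As a hedged
illustration only (the actual model's order, fitting method, and data
are not given here), a minimal AR(1) fit and forecast looks like:

```python
# Toy AR(1) illustration of auto-regressive prediction.  This is NOT
# the USNO algorithm; it only shows the general shape of the approach,
# on made-up data.
def fit_ar1(series):
    """Least-squares estimate of phi in x[t] ~ phi * x[t-1]."""
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(x * x for x in series[:-1])
    return num / den

def forecast(last, phi, steps):
    """Iterate the fitted model forward.  Errors compound each step,
    which is why the tabulated accuracy degrades with horizon."""
    out = []
    for _ in range(steps):
        last = phi * last
        out.append(last)
    return out

history = [1.0, 0.9, 0.83, 0.74, 0.67, 0.61]   # synthetic series
phi = fit_ar1(history)
preds = forecast(history[-1], phi, 3)
```

The compounding of step-ahead errors is consistent with the roughly
1-sigma doubling per year visible in the table above.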

The http://www.iers.org/ site eventually points to

 http://141.74.1.36/MainDisp.csl?pid=47-25786

but the links from there to the annual reports seem broken right now.

I still haven't seen any good data on predictions for periods of
longer than 9 years.

Neal McBurnett http://bcn.boulder.co.us/~neal/


Re: Where the responsibility lies

2006-01-03 Thread Steve Allen
On Wed 2006-01-04T07:36:17 +0100, Poul-Henning Kamp hath writ:
> In message <[EMAIL PROTECTED]>, Neal McBurnett writes:
> >Do you have any evidence for this assertion?
>
> It is an educated guess.
>
> The IERS have already indicated that they believe they could do prediction
> under the 0.9 second tolerance with a two or three year horizon.
>
> >Anyone have a prediction algorithm in mind, and a result of running it
> >on the last several decades or centuries of data?
>
> Makes a great subject for science, doesn't it ?

Yes, it does, but the scientist who did the calculation has only
barely explained the meaning of the work.

http://www.ucolick.org/~sla/leapsecs/McCarthy.html

This deserves discussion and analysis and explanation.

--
Steve Allen <[EMAIL PROTECTED]>                               WGS-84 (GPS)
UCO/Lick Observatory        Natural Sciences II, Room 165    Lat  +36.99858
University of California    Voice: +1 831 459 3046           Lng -122.06014
Santa Cruz, CA 95064        http://www.ucolick.org/~sla/     Hgt +250 m


Re: Where the responsibility lies

2006-01-03 Thread Poul-Henning Kamp
In message <[EMAIL PROTECTED]>, Neal McBurnett writes:
>On Tue, Jan 03, 2006 at 08:32:08PM +0100, Poul-Henning Kamp wrote:
>> If we can increase the tolerance to 10sec, IERS can give us the
>> leapseconds with 20 years notice and only the minority of computers
>> that survive longer than that would need to update the factory
>> installed table of leapseconds.
>
>Do you have any evidence for this assertion?

It is an educated guess.

The IERS have already indicated that they believe they could do prediction
under the 0.9 second tolerance with a two or three year horizon.

>Anyone have a prediction algorithm in mind, and a result of running it
>on the last several decades or centuries of data?

Makes a great subject for science, doesn't it ?

Poul-Henning

--
Poul-Henning Kamp   | UNIX since Zilog Zeus 3.20
[EMAIL PROTECTED] | TCP/IP since RFC 956
FreeBSD committer   | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.


Re: Longer leap second notice, was: Where the responsibility lies

2006-01-03 Thread Warner Losh
> I continue to find the focus on general purpose computing
> infrastructure to be unpersuasive.  If we can convince hardware and
> software vendors to pay enough attention to timing requirements to
> implement such a strategy, we can convince them to implement a more
> complete time handling infrastructure.  This seems like the real goal
> - one worthy of a concerted effort.  Instead of trying to escape from
> the entanglements of this particular system requirement, why don't we
> focus on satisfying it in a forthright fashion?

As someone who has fought the battles, I can tell you that a simple
table is 10x or 100x easier to implement than dealing with parsing the
data from N streams.  Sure, it limits the lifetime of the device, but
a 20 year limit is very reasonable.

I had one system that worked over the leap second correctly, even
though the code to parse the data from this specific brand of GPS
receiver hadn't been written yet.  It worked because it knew about the
leap second in a table that we'd included on our flash as a fallback
when we didn't know anything else.  If we could have a table for the
next 20 years, there'd be no need to even write the code to get from
the GPS stream :-).
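The "factory installed table" fallback Warner describes is easy to
picture.  A minimal sketch, with an illustrative two-entry table (a
real device would ship the full published list of offsets):

```python
# Sketch of a factory-installed leap second table used as a fallback
# when no live source (GPS stream, etc.) is available.  Entries are
# illustrative, not a complete table: TAI-UTC became 32 s after the
# 1998-12-31 leap second and 33 s after the 2005-12-31 one discussed
# in this thread.
LEAP_TABLE = [
    ("1999-01-01", 32),
    ("2006-01-01", 33),
]

def tai_minus_utc(date_iso):
    """Return the last tabulated offset effective on or before the
    given ISO date, or None if the date predates the table."""
    offset = None
    for effective, off in LEAP_TABLE:
        if effective <= date_iso:      # ISO dates compare lexically
            offset = off
    return offset
```

A 20-year prescheduled horizon would simply mean shipping more rows,
with no parser for N receiver formats needed at all.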

I know you aren't persuaded by such arguments.  I find your
dismissive attitude towards software professionals who have
implemented a complete leap second handling infrastructure, with
pluggable sources for leap seconds, rather annoying :-(

Warner


Re: Where the responsibility lies

2006-01-03 Thread Neal McBurnett
On Tue, Jan 03, 2006 at 08:32:08PM +0100, Poul-Henning Kamp wrote:
> If we can increase the tolerance to 10sec, IERS can give us the
> leapseconds with 20 years notice and only the minority of computers
> that survive longer than that would need to update the factory
> installed table of leapseconds.

Do you have any evidence for this assertion?  It seems to me that if
IERS had presented a table in 1980, based on the conventional wisdom
that the earth is continuing to slow down over time, we'd have been
at the edge of that 10-second window in 2000.  And who knows how far
off a 1995 prediction would be in 2015, or what decadal fluctuations
have in store for us in the future.

 http://hpiers.obspm.fr/eop-pc/earthor/utc/leapsecond.html

Anyone have a prediction algorithm in mind, and a result of running it
on the last several decades or centuries of data?

Neal McBurnett http://bcn.boulder.co.us/~neal/


Re: Longer leap second notice, was: Where the responsibility lies

2006-01-03 Thread Rob Seaman

On Jan 3, 2006, at 4:22 PM, Poul-Henning Kamp wrote:


> In message <[EMAIL PROTECTED]>, Ed Davies writes:
> >Poul-Henning Kamp wrote:
> >>> If we can increase the tolerance to 10sec, IERS can give us the
> >>> leapseconds with 20 years notice and only the minority of computers
> >>> that survive longer than that would need to update the factory
> >>> installed table of leapseconds.
> >
> >PHK can reply for himself here but, for the record, I think RS's
> >reading of what he said is different from mine.  My assumption is
> >that PHK is discussing the idea that leaps should be scheduled many
> >years in advance.  They should continue to be single second leaps -
> >just many more would be in the schedule pipeline at any given
> >point.
> >
> >Obviously, the leap seconds would be scheduled on the best available
> >estimates but as we don't know the future rotation of the Earth this
> >would necessarily increase the tolerance.  In theory DUT1 would be
> >unbounded (as it sort of is already) but PHK is assuming that there'd
> >be some practical likely upper bound such as 10 seconds.
> >
> >Am I right in this reading?
>
> yes.


I'm willing to entertain any suggestion that preserves mean solar
time as the basis of civil time.  One could view this notion as a
specific scheduling algorithm for leap seconds.  My own ancient
proposal (http://iraf.noao.edu/~seaman/leap) was for a tweak to the
current algorithm that would minimize the excursions between UTC and
UT1.  This suggestion is more than a tweak, of course, since it would
require increasing the 0.9s limit.  One could imagine variations,
however, with sliding predictive windows to balance the maximum
excursion against the look ahead time.  One is skeptical of any
advantage to be realized over the current simple leap second policy.

I continue to find the focus on general purpose computing
infrastructure to be unpersuasive.  If we can convince hardware and
software vendors to pay enough attention to timing requirements to
implement such a strategy, we can convince them to implement a more
complete time handling infrastructure.  This seems like the real goal
- one worthy of a concerted effort.  Instead of trying to escape from
the entanglements of this particular system requirement, why don't we
focus on satisfying it in a forthright fashion?

There is also the - slight - issue that we aren't only worried about
"computers".  There is a heck of a lot of interesting infrastructure
that should be included in the decision making envelope.

In general, the strategy you describe could also be addressed as an
elaboration on the waveform we are attempting to model with our
clocks.  Not a constant cadence like tick-tick-tick-tick, that is,
but tick-tick-tock-tick.  I do think there might be some interesting
hay to be made by generalizing our definition of a clock to include
quasi-periodic phenomena more complicated than a once-per-second
delta function.  Would give us some reason to explore the Fourier
domain if nothing else.

Rob Seaman
National Optical Astronomy Observatory


Re: Longer leap second notice, was: Where the responsibility lies

2006-01-03 Thread Poul-Henning Kamp
In message <[EMAIL PROTECTED]>, Ed Davies writes:
>Poul-Henning Kamp wrote:
>>> If we can increase the tolerance to 10sec, IERS can give us the
>>> leapseconds with 20 years notice and only the minority of computers
>>> that survive longer than that would need to update the factory
>>> installed table of leapseconds.
>
>PHK can reply for himself here but, for the record, I think RS's
>reading of what he said is different from mine.  My assumption is
>that PHK is discussing the idea that leaps should be scheduled many
>years in advance.  They should continue to be single second leaps -
>just many more would be in the schedule pipeline at any given
>point.
>
>Obviously, the leap seconds would be scheduled on the best available
>estimates but as we don't know the future rotation of the Earth this
>would necessarily increase the tolerance.  In theory DUT1 would be
>unbounded (as it sort of is already) but PHK is assuming that there'd
>be some practical likely upper bound such as 10 seconds.
>
>Am I right in this reading?

yes.

--
Poul-Henning Kamp   | UNIX since Zilog Zeus 3.20
[EMAIL PROTECTED] | TCP/IP since RFC 956
FreeBSD committer   | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.


Re: Longer leap second notice, was: Where the responsibility lies

2006-01-03 Thread Ed Davies

Poul-Henning Kamp wrote:
> If we can increase the tolerance to 10sec, IERS can give us the
> leapseconds with 20 years notice and only the minority of computers
> that survive longer than that would need to update the factory
> installed table of leapseconds.


Rob Seaman replied:
> No.  Rather all computers that exist during such an event are
> obligated to deal with it.  The number of deployed systems follows
> some increasing trend similar to Moore's law.  By delaying the
> adjustments, you guarantee that more systems will be affected when
> they do occur.  And, unless you can guarantee that a particular
> deployed system (and systems derived through various upgrade
> pathways) will be retired prior to the adopted horizon, prudent
> policy would require remediation in any event.
>
> Would like to see a proposed architecture a little more detailed than
> a "factory installed table".


PHK can reply for himself here but, for the record, I think RS's
reading of what he said is different from mine.  My assumption is
that PHK is discussing the idea that leaps should be scheduled many
years in advance.  They should continue to be single second leaps -
just many more would be in the schedule pipeline at any given
point.

Obviously, the leap seconds would be scheduled on the best available
estimates but as we don't know the future rotation of the Earth this
would necessarily increase the tolerance.  In theory DUT1 would be
unbounded (as it sort of is already) but PHK is assuming that there'd
be some practical likely upper bound such as 10 seconds.

Am I right in this reading?

Ed.


Re: Where the responsibility lies

2006-01-03 Thread Rob Seaman

All right - I guess we can go another round or two while waiting -
perhaps indefinitely - for reports of leap second related
catastrophes to filter in.

First, an apology for posting my previous reply publicly.  It escaped
my notice that I was replying to a private message.

On Jan 3, 2006, at 12:32 PM, Poul-Henning Kamp wrote:


> If you already have to cope with DUT1 anyway, how much difference
> can it possibly make if the definition says
>  |DUT1| < .9
> or
>  |DUT1| < 10sec
> or
>  |DUT1| < 1 hour


The amplitude of the effect is 3600 times as great.  The interval
between each event is much longer (though quadratic, not proportionate,
of course).  The larger amplitude implies a larger impact on human
activities.  The longer interval implies that future
instrumentalities will be less prepared to deal with the impact.


> If we can increase the tolerance to 10sec, IERS can give us the
> leapseconds with 20 years notice and only the minority of computers
> that survive longer than that would need to update the factory
> installed table of leapseconds.


No.  Rather all computers that exist during such an event are
obligated to deal with it.  The number of deployed systems follows
some increasing trend similar to Moore's law.  By delaying the
adjustments, you guarantee that more systems will be affected when
they do occur.  And, unless you can guarantee that a particular
deployed system (and systems derived through various upgrade
pathways) will be retired prior to the adopted horizon, prudent
policy would require remediation in any event.

Would like to see a proposed architecture a little more detailed than
a "factory installed table".


> As far as I know, less than 1% of people on this planet actually
> have the sun straight south at 12:00 local time today and a quite
> sizeable minority (a lot of China) lives perfectly happily with the
> sun being further than 15 degrees from south at local 12:00.


Once again confusing secular and periodic effects.  Leap seconds are
the result of a monotonic trend.  Time zones and daylight saving
repeat seasonally or represent geographic constants.


> It follows from this that a proposal for a 1-hour tolerance on
> DUT1 is perfectly feasible without odd things happening to the
> cows' milk etc.


Got me.  Might it not be prudent to ask a farmer?  We interviewed a
guy last year who networks dairy cattle via instrumented capsules
that monitor pH and communicate wirelessly from cow to cow back to
the barn.  Farmers are likely among the more technologically literate
community for temporal data products.  Why not benefit from their
expertise?


> I think the leap hours are a political tool to make the proposal go
> through without committing anybody to anything for the next couple
> of hundred years.


Of course.  We differ on whether it is:  A) ethical, or B) viable.


> There are three orders of magnitude difference between a leap second
> and a leap hour, and consequently the need for leap hours will grow
> less rapidly than the need for leapseconds.


Bzzz!  Nice try, but an incorrect answer.  Thanks for playing our
home game.

I've appended Steve Allen's excellent plot of the long term behavior
of length of day.  LOD grows linearly over scales pertinent to this
discussion (discounting wiggles due to interesting geophysics).  This
is due, of course, to angular momentum transfer from the Earth's
rotation to the Moon's orbit.  Mitigating factors include the rebound
of the continents since the last ice age.

Leap seconds accumulate as the integral under this curve (thus
quadratically).  The last hour's worth took 1200 years.  (Hence the
600 year estimate for issuing the first leap hour to limit excursions
to +/- 0.5 hours.)  The hour before that, only four centuries.  It
does not matter whether these are binned as leap seconds or leap
hours, they will arrive when they arrive.  That being the case, one
argues that bleeding off the accumulation in smaller, more frequent
doses is a better choice.
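The integral argument can be made concrete.  A sketch assuming a
purely linear excess-LOD trend of 1.7 ms/day per century (a commonly
quoted tidal-deceleration figure; the real curve has the decadal
wiggles noted above, so treat this as an order-of-magnitude model):

```python
# Sketch of the quadratic accumulation argued above.  Assumption:
# excess length of day grows linearly at ~1.7 ms/day per century,
# starting from zero excess today.  Integrating a linear excess rate
# gives an accumulated offset quadratic in elapsed time.
def accumulated_offset_s(years, ms_per_day_per_century=1.7):
    days = years * 365.25
    # mean excess LOD over the interval (half the final excess)
    avg_excess_ms = ms_per_day_per_century * (years / 100.0) / 2.0
    return avg_excess_ms * days / 1000.0
```

Doubling the horizon quadruples the accumulated offset, and an hour
(3600 s) accumulates on a roughly thousand-year timescale under this
assumption, broadly consistent with the figures quoted above.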

Rob Seaman
National Optical Astronomy Observatory
---


ancient.pdf
Description: Adobe PDF document


Re: Where the responsibility lies

2006-01-03 Thread Poul-Henning Kamp
In message <[EMAIL PROTECTED]>, Rob Seaman writes:
>John Hawkinson replies:
>
>I think PHK has demonstrated the ability (and willingness :-) to hold
>up his own end of an argument.  Should we ever find ourselves at the
>same conference, I'll buy him a beer in anticipation of a rousing
>discussion.

I'll be happy to bring booze as well :-)

>There are several issues confounded here.  First, an untested
>assertion that eliminating leap seconds will simplify time handling.
>DUT1 looms large in astronomical software and one would have to be
>convinced that this is not an issue with other disciplines.

But it's exactly the fact that DUT1 already exists that bugs me.

If you already have to cope with DUT1 anyway, how much difference
can it possibly make if the definition says
 |DUT1| < .9
or
 |DUT1| < 10sec
or
 |DUT1| < 1 hour

If we can increase the tolerance to 10sec, IERS can give us the
leapseconds with 20 years notice and only the minority of computers
that survive longer than that would need to update the factory
installed table of leapseconds.

>Third, leap seconds are a mechanism to realize mean solar time in
>practice.

As would leap-hours (or jumping timezones) be.  It's only a matter of
the tolerance we accept on DUT1.

As far as I know, less than 1% of people on this planet actually
have the sun straight south at 12:00 local time today and a quite
sizeable minority (a lot of China) lives perfectly happily with the
sun being further than 15 degrees from south at local 12:00.

It follows from this that a proposal for a 1-hour tolerance on
DUT1 is perfectly feasible without odd things happening to the
cows' milk etc.

>The acknowledgment of a contingent need for
>leap hours shows that the authors of the ITU proposal understand this.

I think the leap hours are a political tool to make the proposal go
through without committing anybody to anything for the next couple
of hundred years.

>Fourth, the need for leap seconds is growing quadratically as the
>Earth continues to slow.  We have no business making ad hoc policies
>based on the rarity of events that are becoming more frequent.  The
>need for "leap hours" will grow just as rapidly - and much more
>dramatically.  A solution that ignores real world constraints is no
>solution.

Uhm, no.

There are three orders of magnitude difference between a leap second
and a leap hour, and consequently the need for leap hours will grow
less rapidly than the need for leapseconds.

>In short, leap hours are - well - dumb.  A proposal that relies on
>their use, naive.

Leap hours versus leap seconds is only a matter of magnitude and
frequency, and consequently both are equally naïve.

Poul-Henning

--
Poul-Henning Kamp   | UNIX since Zilog Zeus 3.20
[EMAIL PROTECTED] | TCP/IP since RFC 956
FreeBSD committer   | BSD since 4.3-tahoe
Never attribute to malice what can adequately be explained by incompetence.


Re: Where the responsibility lies

2006-01-03 Thread Rob Seaman

John Hawkinson replies:


> > Time handling bugs typically appear in the interfaces between
> > systems that make contradictory assumptions.
>
> I think phk's point ("text book example") was that these problems
> were more likely to have been detected in a world where everyone's
> default timescale (UTC) was not subject to leap seconds -- rare
> events that may not come up in the course of normal testing unless
> special care is given.


I think PHK has demonstrated the ability (and willingness :-) to hold
up his own end of an argument.  Should we ever find ourselves at the
same conference, I'll buy him a beer in anticipation of a rousing
discussion.

There are several issues confounded here.  First, an untested
assertion that eliminating leap seconds will simplify time handling.
DUT1 looms large in astronomical software and one would have to be
convinced that this is not an issue with other disciplines.

Second, that UTC is indeed "everyone's default".  It is (rather
loosely) the current civil time standard, but I fail to see why this
makes it a default choice for a precision timing application.  "Civil
measurement" in the USA still revolves around twelve inches to a
foot, but SI units are the default for American scientists as is true
elsewhere in the world.  I won't belabor my assertion that mean solar
time is the obvious civil time standard.  But why do we expect that
an X-ray spacecraft would of necessity adopt such a standard?

Third, leap seconds are a mechanism to realize mean solar time in
practice.  The underlying issues are those of solar time, not how
this is achieved.  One can assert that leap seconds should cease, but
the fundamental civil timing requirements will remain in force.  Some
other mechanism for synchronizing our clocks to our home planet must
needs be identified.  The acknowledgment of a contingent need for
leap hours shows that the authors of the ITU proposal understand this.

Fourth, the need for leap seconds is growing quadratically as the
Earth continues to slow.  We have no business making ad hoc policies
based on the rarity of events that are becoming more frequent.  The
need for "leap hours" will grow just as rapidly - and much more
dramatically.  A solution that ignores real world constraints is no
solution.

Fifth, normal testing should involve special care - or what is its
value?

In short, leap hours are - well - dumb.  A proposal that relies on
their use, naive.

Rob Seaman
National Optical Astronomy Observatory


Where the responsibility lies (was Re: [LEAPSECS] text book example...)

2006-01-03 Thread Rob Seaman

On Jan 1, 2006, at 3:29 AM, Poul-Henning Kamp wrote:


> http://lheawww.gsfc.nasa.gov/users/ebisawa/ASCAATTITUDE/


This describes a system for "attitude determination", i.e., for
pointing the ASCA X-ray telescope at celestial objects.  It appears
there were several bugs in time handling.  They get full marks for
diagnosing, documenting, and publicizing the resulting issues.  These
include issues of data characterization as well as of spacecraft
operations.

As someone already pointed out, the fundamental issue was the
adoption of UTC as a standard when TAI, for instance, might have been
more appropriate.  UTC conveys Earth orientation information - one
might wonder why it would be the obvious choice for a platform not
located on Earth.  The blind adoption of any standard is a bad
thing.  Adopting American standards instead of metric, or worse yet,
adopting both, can be fatal to a project.  Give them an inch and
they'll take a metre.

Posit for the moment that this project had completely avoided the use
of UTC.  Since the mission required maintaining attitude with respect
to the celestial sphere and solar system objects, requirements for
handling DUT1 (or the equivalent) would have remained.  Having
implemented an astrometric software package (others here are much
more experienced), I can assure you that such software is no less
subject to the creation of obscure bugs than any other system.  The
bugs that were realized as reflections of poor leap second handling
might well have instead appeared in other ways.  UTC is a handy tool
for conveying both UT1 and TAI in one concise package.  Depending on
the project, various assumptions and approximations may be
appropriate.  It sounds like inappropriate assumptions were made.  Oh
well.

Time handling bugs typically appear in the interfaces between systems
that make contradictory assumptions.  The quoted page's description
of two bugs that originally cancelled each other out, then reasserted
themselves after a maintenance update, is classic for such things.
Fixing one bug reveals others that were previously masked.

Bugs don't result from standards, they are the responsibility of
engineers.  The existence of a bug certainly isn't an argument for
ignoring underlying facts of nature.

The ASCA mission ended in 2000 as the result of a geomagnetic storm
(another pesky fact of nature keyed to Earth orientation) and the
spacecraft burned up on reentry in 2001.  (Earth orientation is
certainly a matter of public interest as to where de-orbiting
spacecraft might impact.)  However, we have yet to hear of any
significant issues encountered during the 2005 leap second by current
(appropriate or inappropriate) users of UTC.

Rob Seaman
National Optical Astronomy Observatory