@JimBiardCics wrote:
>  I'm not sure how to reframe my argument. I am specifically excluding 
> modelers from the topic. It is only relevant for real-world time values.

I think the way to re-frame it is to start from use-cases -- and, indeed, the 
docs should reflect this as well when we've settled on something. So I'll try:

1) Modelers writing output from their models, which generally track time (in 
the model) as a time delta since a reference timestamp, can use the current 
"gregorian" calendar, which has always been ambiguous with regard to 
leap seconds -- but it works perfectly well for this use case.

so modelers need nothing new.
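For concreteness, the numeric encode/decode cycle for this case can be sketched as follows. This is plain Python with naive datetimes; the "hours since 2000-01-01" unit and the helper names are my own choices for illustration, not anything CF prescribes:

```python
from datetime import datetime, timedelta

# CF-style time encoding: numeric offsets from a reference epoch,
# e.g. units = "hours since 2000-01-01 00:00:00".
# For model output this round-trips exactly -- leap seconds never enter.
EPOCH = datetime(2000, 1, 1)

def encode(dt, epoch=EPOCH):
    """Datetime -> hours since the epoch (naive, no leap seconds)."""
    return (dt - epoch).total_seconds() / 3600.0

def decode(hours, epoch=EPOCH):
    """Hours since the epoch -> datetime (naive, no leap seconds)."""
    return epoch + timedelta(hours=hours)

t = datetime(2000, 1, 2, 12)
assert encode(t) == 36.0        # 1.5 days after the epoch
assert decode(encode(t)) == t   # exact round trip
```

Because the model's clock and the encoding share the same (leap-second-free) notion of time, this round trip is exact by construction.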

2) Folks collecting data with an instrument (or some such) that start with a 
bunch of "correct" UTC datetimestamps (with leap seconds applied) need a way to 
encode the data so that it can be accurately described and recovered. Possible 
solutions:

a) Have a proper UTC Gregorian calendar -- to be used correctly:
   - the datetimestamp is correct UTC, leap seconds and all
   - the timedeltas have been computed with leap seconds included
   - when decoding, leap seconds need to be taken into account in order to get 
accurate datetimestamps back.

b) Have a calendar where:
   - the datetimestamp is correct UTC
   - the timedeltas have been computed from datetimestamps without leap seconds
   - when decoding, leap seconds should NOT be applied, in order to recover the 
original, correct datetimestamps.

c) Have a way to directly store datetimestamps in CF (probably ISO 8601 strings)
   - this would add a new way to encode time to CF, which we are all resistant 
to.
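To make the three options concrete, here is a sketch of how the same UTC instant would be encoded under each. The epoch is hand-picked and the leap-second adjustment is hardcoded purely for illustration; real (a)-style code needs a full leap-second table:

```python
from datetime import datetime, timezone

epoch = datetime(2016, 12, 31, tzinfo=timezone.utc)
stamp = datetime(2017, 1, 1, 0, 0, 30, tzinfo=timezone.utc)

# (b) the common workflow: timedelta from a non-leap-second-aware library.
naive_seconds = (stamp - epoch).total_seconds()      # 86430.0

# (a) a proper UTC calendar would count the leap second inserted at
# 2016-12-31T23:59:60, so the stored timedelta is one second longer.
# (Hardcoded here for illustration; real code needs a leap-second table.)
utc_seconds = naive_seconds + 1                      # 86431.0

# (c) store the datetimestamp directly as an ISO 8601 string.
iso = stamp.isoformat().replace("+00:00", "Z")       # "2017-01-01T00:00:30Z"
```

Note that (a) and (b) store different numbers for the same instant, which is exactly why the calendar must say which convention was used.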

a) is the "right" way to do it, so I think we should add this to CF. However, 
the reality is that most folks (producers or consumers) don't have easy access 
to leap-second-aware tools. So, at least for the moment, hardly anyone can 
actually use a proper UTC calendar.

c) is another "right" way to do it, so I'm advocating for that. However, 
other folks may not agree, so it may never happen. And (c) is easy to do with 
commonly available tools. However, there are a lot of existing datasets and 
workflows that already do (b), so it would be nice to provide a way for those 
folks to clearly describe their data so it can be used correctly without 
needing to change their workflows.

So we need (b).

Downsides to (b):
 - the pedant in me thinks it's "just wrong", and we really shouldn't codify 
such things in a standard.
 - the pragmatist in me notes that converting between datetimestamps that 
include leap seconds and timedeltas computed with a non-leap-second-aware 
library works MOST of the time -- for a very high value of MOST -- but it does 
not work correctly EVERY time. So codifying this workflow invites hidden 
(exceedingly rare) bugs and confusion.
 - the timedeltas actually stored are not "metrical" -- that is, they don't 
accurately reflect the amount of elapsed time (being off by some number of leap 
seconds)
 - folks that really care about sub-second-level accuracy should be using TAI 
or GPS time anyway
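One concrete instance of the "exceedingly rare" failure: an observation taken during the leap second itself. A non-leap-second-aware library (Python's `datetime`, used here as a stand-in for such tools) has no representation for second 60 at all, so workflow (b) cannot encode or recover that instant:

```python
from datetime import datetime, timezone

# 2016-12-31T23:59:60Z is a real UTC instant (the inserted leap second),
# but naive libraries only allow seconds 0..59.
try:
    datetime(2016, 12, 31, 23, 59, 60, tzinfo=timezone.utc)
except ValueError as err:
    print("cannot represent the leap second:", err)
```

A timestamp collected during that second has to be fudged (clamped or smeared) before it can even enter a (b)-style workflow.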

But "practicality beats purity" -- folks are doing this now, folks will 
continue to do it in the future, and it does work almost all the time.

Final plea -- maybe it's because I'm a pedant, but I still don't "like" this 
approach -- I think we should add a way to directly encode datetimestamps, and 
then recommend that it be used for this use case.

 @JimBiardCics: does this correctly capture your proposal?

@martinjuckes: does this clarify it for you?  

https://github.com/cf-convention/cf-conventions/issues/148#issuecomment-436722293