Nan,

GPS time didn't exist, as such, before 1980-01-06 (year-month-day). It is a continuous count of seconds from the reference date and time of 1980-01-06 00:00:00 UTC, using the Gregorian calendar. UTC started using leap seconds in 1972; before that, UTC was adjusted by small, arbitrary amounts on much shorter time scales than the current "a couple of times every three years or so" scale. To refer to times prior to 1980-01-06 with the GPS time system, you would be using a "proleptic" version, and leap seconds would still matter.
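The continuous count can be sketched numerically. This is my illustration, not part of the thread: the leap-second table below is deliberately partial, and a real implementation would need the full, up-to-date IERS list.

```python
# Sketch: GPS time counts SI seconds continuously from the GPS epoch,
# 1980-01-06 00:00:00 UTC, while UTC inserts leap seconds. The offset
# GPS - UTC therefore grows by one second per leap second inserted
# after the GPS epoch.
from datetime import datetime, timezone

GPS_EPOCH = datetime(1980, 1, 6, tzinfo=timezone.utc)

# Leap seconds inserted after the GPS epoch (dates each new offset took
# effect). Partial table, for illustration only.
LEAP_DATES = [
    datetime(1981, 7, 1, tzinfo=timezone.utc),
    datetime(1982, 7, 1, tzinfo=timezone.utc),
    datetime(1983, 7, 1, tzinfo=timezone.utc),
    datetime(1985, 7, 1, tzinfo=timezone.utc),
    datetime(1988, 1, 1, tzinfo=timezone.utc),
]

def gps_minus_utc(t: datetime) -> int:
    """Seconds by which GPS time leads UTC at instant t (t >= GPS epoch)."""
    return sum(1 for d in LEAP_DATES if t >= d)

# Between the GPS epoch and 1981-07-01 the offset is zero.
print(gps_minus_utc(datetime(1981, 6, 30, tzinfo=timezone.utc)))  # 0
print(gps_minus_utc(datetime(1983, 8, 1, tzinfo=timezone.utc)))   # 3
```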

I like the idea of recording the time resolution or uncertainty. I think this could help signal to data consumers that you took the trouble with leap seconds, etc., needed to achieve a certain level of accuracy and precision.

Grace and peace,

Jim

On 6/29/15 11:56 AM, Nan Galbraith wrote:
Hi Tim -

I don't think there's any difference between GPS and UTC times
before 1981-07-01, when the first leap second after the GPS epoch
took effect (if that's the right term), so, if your reference time
is before that date, you can use the GPS calendar attribute and not
worry about it being used to muck with your reference date.

I've argued that 'instrument and measurement datasets' generally have
time values that can't be accurate enough to care about leap seconds,
and you've expressed the opposite view. To be honest, my instrument
clocks are mysterious, at best, and I often have to override them. Then,
to top things off, I use Matlab to generate elapsed time for CF, and have
no clue whether that software knows anything about leap seconds.

Maybe an uncertainty attribute should be a best practice for time variables;
OceanSITES recommends doing that, but I'm not sure how often it's included - or
noticed by data users.

Cheers - Nan

On 6/29/15 6:16 AM, Timothy Patterson wrote:
I understand that for climate or forecast data 30+ seconds of inaccuracy may not be significant, but even though the C and F in "CF Conventions" stand for "Climate and Forecast", the conventions are increasingly being adopted for instrument and measurement datasets. In these cases, time accuracy at small time scales becomes more important, which is why seeing a proposed convention that allows the time to be written ambiguously (so that there may or may not be discontinuities or offsets) is rather disconcerting. It's like telling a climate scientist that the netCDF encoding of his temperature data may or may not have introduced an inaccuracy of a few kelvin in some readings.

Under the CF 1.6 conventions, I believe the base time or epoch time was always expressed in UTC and the calendar attribute was applied to the encoded time count. Using this convention, I could specify a UTC start time and then have a simple GPS-like count of elapsed seconds, with no discontinuities introduced by leap seconds. Under the proposals for the 1.7 convention, this doesn't seem possible: the epoch and the time count must both be expressed either as UTC or as GPS time, and the only "mixed calendar" options introduce the above-mentioned ambiguity.
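To make the ambiguity concrete, here is a small sketch (mine, not from the proposal) showing how the same pair of UTC timestamps yields two different elapsed-second counts depending on whether the leap second inserted at the end of 2015-06-30 is counted:

```python
# Sketch: elapsed seconds from a UTC reference differ depending on
# whether leap seconds are counted. The window below straddles the
# real leap second of 2015-06-30.
from datetime import datetime, timezone

ref = datetime(2015, 6, 30, 0, 0, tzinfo=timezone.utc)
t = datetime(2015, 7, 2, 0, 0, tzinfo=timezone.utc)

# Naive (POSIX-like) count: pretends every UTC day has 86400 seconds.
naive = (t - ref).total_seconds()

# Continuous (GPS-like) count: one leap second was inserted at the end
# of 2015-06-30, so one extra SI second actually elapsed.
continuous = naive + 1

print(naive, continuous)  # 172800.0 172801.0
```

A reader decoding the time values cannot tell which of the two counts was written unless the calendar attribute pins this down unambiguously.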

Regards,

Tim Patterson



--
CICS-NC <http://www.cicsnc.org/> · Visit us on Facebook <http://www.facebook.com/cicsnc>
*Jim Biard*
*Research Scholar*
Cooperative Institute for Climate and Satellites NC <http://cicsnc.org/>
North Carolina State University <http://ncsu.edu/>
NOAA National Centers for Environmental Information <http://ncdc.noaa.gov/>
/formerly NOAA’s National Climatic Data Center/
151 Patton Ave, Asheville, NC 28801
e: [email protected] <mailto:[email protected]>
o: +1 828 271 4900

/Connect with us on Facebook for climate <https://www.facebook.com/NOAANCEIclimate> and ocean and geophysics <https://www.facebook.com/NOAANCEIoceangeo> information, and follow us on Twitter at @NOAANCEIclimate <https://twitter.com/NOAANCEIclimate> and @NOAANCEIocngeo <https://twitter.com/NOAANCEIocngeo>. /


_______________________________________________
CF-metadata mailing list
[email protected]
http://mailman.cgd.ucar.edu/mailman/listinfo/cf-metadata
