What I don't understand is why there's an insistence on keeping the rate of UTC identical to TAI and inserting leap seconds. Why not just define the UTC second to be the advancement of Earth's 0-longitude line by 15 arcseconds relative to the sun (in other words, make the Earth's rotation the reference clock for UTC), and have NTP supply UTC and TAI separately (so that systems that need either one have access to it)?
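(For reference, the 15 arcseconds figure is just the mean rotation rate relative to the sun: 360 degrees = 1,296,000 arcseconds per mean solar day, and 1,296,000 / 86,400 s = 15 arcseconds per second.)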
If Earth's accumulated rotation angle is ever non-monotonic, we'll have
bigger and more thermally significant problems than timekeeping.

On Fri, Nov 15, 2024, 10:30 Frantisek Rysanek via Freedos-user
<freedos-user@lists.sourceforge.net> wrote:

> > On Tue, 12 Nov 2024 at 15:53, tsiegel--- via Freedos-user
> > >
> > > To solve the whole time/date problem, I never understood why they
> > > don't separate the two. Time could then be a regular integer, since
> > > there's only 86,400 seconds in a day.
>
> Hmm. I have to give this idea some credit, in the context of
> terminology. A date is what consists of YYYY-MM-DD, so time can mean
> the finer fractions.
>
> Other than that, there is no single encoding that will suit
> everybody... No matter what you propose, you'll always find someone
> for whom your chosen integer data type is too long, and someone for
> whom the resolution is not enough - be it days, milliseconds or
> nanoseconds. Therefore, inevitably, the math of comparing two
> timestamps or adding an interval to a timestamp will sometimes entail
> carry operations... depending on circumstances.
>
> > Hint. There are 24 timezones, but not all are 1 hour different.
> > Computers move. Sometimes they move fast. Computers fly aeroplanes
> > and sail ships. They move across timezones and need to keep working.
> >
> > Go read this, and I mean all of it, closely. It is fun, it is
> > interesting, and it will teach you stuff you NEED to know.
> >
> > https://gist.github.com/timvisee/fcda9bbdff88d45cc9061606b4b923ca
>
> That list is hilarious - thanks Liam, didn't know that one :-)
> That said, I find it spooky to the casual reader.
> Let me follow up on a more constructive note:
>
> There is UTC, a reasonably useful and reliable global standard. A
> popular common denominator.
>
> Operating systems (other than DOS, certainly), namely UNIX and
> NT-based Windows, run on UTC internally. Local time zones are a
> matter of the user interface / presentation layer. A logged-in
> desktop (or the machine as a whole?) has a "locale", including a
> timezone name and offset, that can be changed without messing with
> the one system clock under the hood running UTC. If you want your
> software to be immune to local timezone shenanigans, use UTC
> internally for your timekeeping.
>
> The main time-sync protocol on the Internet, the Network Time
> Protocol, also runs on UTC, per its standard. Thus, the timekeeping
> services in operating systems can talk to each other in the same time
> domain in which they internally operate. So again UTC is a common
> denominator throughout the global Internet, cross-platform (speaking
> of computer operating systems).
>
> UTC is still made interesting by leap seconds. In practical operating
> systems, these are forever an element of fun.
> I could go into detail but I do not want to. To most people and
> systems, leap seconds are a random nuisance that hopefully does not
> cause much damage. I could quote examples to the contrary, but I
> don't think this is the right place :-)
>
> Apart from UTC, there are other time "domains", such as GPS time and
> TAI. These two run without leap seconds. GPS time is shifted by a
> fixed 19 seconds vs. TAI. Only UTC has leap seconds - when a leap
> second occurs, the offset of UTC vs. TAI and GPS time gets
> incremented (or decremented).
>
> In this context, UTC is the "convenient human wall time".
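
To make those offsets concrete, here is a minimal C sketch (mine, not Frank's). It assumes the leap-second count of 37 in effect since 2017 and glosses over the fact that UNIX time itself ignores leap seconds in its count; real code would take the current offset from a leap-second table or from NTP instead of hard-coding it:

    /* Rough illustration of the offsets between the time scales above.
     * TAI_MINUS_UTC has been 37 s since 2017 (still current as of this
     * mail); it changes whenever a leap second is inserted.
     * TAI_MINUS_GPS is fixed at 19 s, because GPS time was aligned with
     * UTC in 1980 and has not leapt since. */
    #include <stdio.h>
    #include <time.h>

    #define TAI_MINUS_UTC 37   /* leap-second count; should come from a table */
    #define TAI_MINUS_GPS 19   /* constant by construction of GPS time        */

    int main(void)
    {
        time_t utc = time(NULL);            /* UNIX time, tracks UTC       */
        time_t tai = utc + TAI_MINUS_UTC;   /* TAI runs ahead of UTC       */
        time_t gps = tai - TAI_MINUS_GPS;   /* GPS runs behind TAI by 19 s */

        printf("UTC %lld  TAI %lld  GPS %lld  (seconds since the epoch)\n",
               (long long)utc, (long long)tai, (long long)gps);
        return 0;
    }

When the next leap second is announced, only TAI_MINUS_UTC changes; TAI and GPS keep ticking undisturbed.
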
>
> TAI is the domain in which the official reference atomic clocks
> really tick, united and organized by a body called BIPM.
> TAI is also used for industrial applications - for example, the
> Precision Time Protocol uses TAI by default.
> The advantage of TAI is exactly that every minute is always 60
> seconds long and there are no "leaps" (anomalies) to that rule.
>
> The occasional decision to introduce a leap second is based on
> science, but also has a significant political angle. It cannot be
> predicted by an algorithm - it just has to be absorbed by the system
> when it happens. Having a "non-leaping", machine-oriented time domain
> like TAI makes it easy to implement or consider UTC merely as an
> integer offset from TAI (similar to the local timezone offsets).
>
> Inside operating systems, as a programmer, you can meet several
> nominal "time domains" that differ in fine timing details:
>
> - there's CLOCK_REALTIME, which corresponds to UTC.
> The OS timebase keeps this clock long-term accurate by finely
> adjusting the frequency of its ticking, by periodically reprogramming
> some hardware divider or synthesizer (which internally runs off a
> relatively imprecise bus-clock crystal as its reference frequency).
> If a large offset is encountered, CLOCK_REALTIME can also jump, i.e.
> get adjusted stepwise.
>
> - there's also CLOCK_MONOTONIC, which is useful as a "relative"
> timing source that never jumps. It may still be frequency-adjusted to
> track UTC. And there's CLOCK_MONOTONIC_RAW, which is closest to the
> hardware timebase in that it does not even get frequency-adjusted.
>
> Operating systems or standard libraries contain functions to convert
> between internal representations of time that are close to some long
> integer type (UNIX "time_t" epoch time, Windows FILETIME) and a
> human-oriented breakdown of the time information into YYYY-MM-DD
> HH:MM:SS.fraction.
>
> I cannot deny that there are application needs or practical
> situations in the life of a programmer where anomalous reality needs
> to be accommodated, or at least error-checked and reported... see the
> link provided by Liam, the list looks exhaustive :-)
>
> Also, once you start tinkering with timekeeping, you will find out
> that accuracy, precision and synchronicity always have a limit (some
> minimal error/deviation/noise), that there are limits to how accurate
> your TDEV measurements can get, and you run into "phase-locked loop
> tuning" aspects (control theory), oscillator stability and ageing,
> etc.
> Signal paths and logic gates feature transport delays that are fixed
> but not always known. Even just the operation of "taking a timestamp"
> in software entails processing latencies that are not necessarily
> constant and symmetrical, between your software thread and the HW
> clock you are asking...
>
> No need to despair. What sort and level of imperfections actually
> bother you always depends on the application or assignment at hand...
>
> It is fun :-)
>
> Frank
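
To illustrate the clock IDs and the time_t-to-calendar conversion Frank describes, here is a minimal POSIX C sketch of my own (Linux-flavoured: CLOCK_MONOTONIC_RAW is Linux-specific, and error checking is omitted):

    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        struct timespec rt, mono, raw;

        clock_gettime(CLOCK_REALTIME,      &rt);    /* UTC wall clock, may step   */
        clock_gettime(CLOCK_MONOTONIC,     &mono);  /* never steps, may be slewed */
        clock_gettime(CLOCK_MONOTONIC_RAW, &raw);   /* raw hardware timebase      */

        printf("CLOCK_REALTIME      %lld.%09ld\n", (long long)rt.tv_sec,   rt.tv_nsec);
        printf("CLOCK_MONOTONIC     %lld.%09ld\n", (long long)mono.tv_sec, mono.tv_nsec);
        printf("CLOCK_MONOTONIC_RAW %lld.%09ld\n", (long long)raw.tv_sec,  raw.tv_nsec);

        /* time_t (epoch seconds) -> human-readable YYYY-MM-DD HH:MM:SS in UTC */
        time_t now = rt.tv_sec;
        struct tm utc;
        char buf[32];
        gmtime_r(&now, &utc);
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", &utc);
        printf("broken-down UTC:    %s\n", buf);

        return 0;
    }
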
_______________________________________________
Freedos-user mailing list
Freedos-user@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/freedos-user