You don't mention the option of allowing time.microseconds to be a float, and I was curious about that, since if it did work it might be a relatively smooth extension of the current API. The highest value you'd store in the microseconds field is 1e6, and at values around 1e6, double-precision floating point has a spacing between adjacent representable values of about 1e-10 (in units of microseconds):
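(The same spacing can be checked with the stdlib alone; a quick sanity sketch -- math.ulp requires Python 3.9+, and the conversion to attoseconds is my addition, not from the thread:)

```python
import math

# Spacing between adjacent doubles at 1e6 (here, 1e6 microseconds).
ulp_us = math.ulp(1e6)   # == 2**-33, ~1.1641532182693481e-10 microseconds

# Convert that spacing to attoseconds: 1 microsecond == 1e12 attoseconds.
ulp_as = ulp_us * 1e12

print(ulp_us)   # ~1.164e-10 microseconds
print(ulp_as)   # ~116 attoseconds
```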
In [8]: 1e6 - np.nextafter(1e6, 0)
Out[8]: 1.1641532182693481e-10

So that could represent values to a precision of ~0.116 femtoseconds, or 116
attoseconds. Too bad. Femtosecond precision would cover a lot of cases, but if
you really need attoseconds then it won't work.

-n

On Thu, May 10, 2018 at 1:30 PM, Ed Page <ed.p...@ni.com> wrote:
> Greetings,
>
> Is there interest in a PEP for extending time, datetime / timedelta to
> support arbitrary or extended precision fractional seconds?
>
> My company designs and manufactures scientific hardware that typically
> operates at nanosecond -- sometimes even attosecond -- levels of
> precision. We're in the process of providing Python APIs for some of these
> products and need to expose the full accuracy of the data to our
> customers. Doing so would allow developers to do things like timestamp
> analog measurements for correlating with other events in their system, or
> precisely schedule a future event for correctly interoperating with other
> high-speed devices.
>
> The API we've been toying with adds two new fields to time, datetime, and
> timedelta:
> - frac_seconds (int)
> - frac_seconds_exponent (int or a new SITimeUnit enum)
>
> time.microseconds would be turned into a property that wraps frac_seconds
> for compatibility.
>
> Challenges
> - Defining the new `max` or `resolution`
> - strftime / strptime. I propose that we do nothing and just leave
>   formatting / parsing to use `microseconds` at best. On the other hand,
>   __str__ could specify the fractional seconds using scientific or
>   engineering notation.
>
> Alternatives
> - My company creates its own datetime library
>   - Continued fracturing of the time ... ecosystem (datetime, arrow,
>     pendulum, delorean, datetime64, pandas.Timestamp -- all of which offer
>     varying degrees of compatibility)
> - Add an `attosecond` field and have `microsecond` wrap this.
>   - Effectively the same, except it hard-codes `frac_seconds_exponent` to
>     the lowest value
>   - The most common cases (milliseconds, microseconds) would always pay
>     the cost of using a bigint, as compared to the proposal's "pay for
>     what you use" approach
>   - How do we define what is "good enough" precision?
> - Continue to subdivide time by adding a `nanosecond` field that is
>   "nanoseconds since the last microsecond", a `picosecond` field that is
>   "picoseconds since the last nanosecond", and an `attosecond` field that
>   is "attoseconds since the last picosecond"
>   - Possibly surprising API; people might expect `picosecond` to be an
>     offset since the last second
>   - Messy base-10 / base-2 conversions
> - Have `frac_seconds` be a float
>   - This has precision issues.
>
> If anyone wants to have an impromptu BoF on the subject, I'm available at
> PyCon.
>
> Thanks,
> Ed Page
> _______________________________________________
> Python-ideas mailing list
> Python-ideas@python.org
> https://mail.python.org/mailman/listinfo/python-ideas
> Code of Conduct: http://python.org/psf/codeofconduct/

--
Nathaniel J. Smith -- https://vorpus.org
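For concreteness, the frac_seconds / frac_seconds_exponent pair from the quoted proposal, plus the compatibility microseconds wrapper, could look roughly like this. This is a hypothetical sketch only -- the class name FracSeconds, the defaults, and the truncation behavior of the wrapper are my assumptions, not part of the proposal:

```python
from dataclasses import dataclass

@dataclass
class FracSeconds:
    """Hypothetical sketch of the proposed "pay for what you use" fields.

    frac_seconds counts units of 10**frac_seconds_exponent seconds, e.g.
    frac_seconds=250, frac_seconds_exponent=-9 means 250 nanoseconds.
    """
    frac_seconds: int = 0
    frac_seconds_exponent: int = -6  # assumed default: microsecond units

    @property
    def microseconds(self) -> int:
        # Compatibility shim: express the stored value in whole microseconds,
        # truncating any sub-microsecond part (truncation is an assumption).
        shift = self.frac_seconds_exponent + 6
        if shift >= 0:
            return self.frac_seconds * 10 ** shift
        return self.frac_seconds // 10 ** (-shift)

# 250 nanoseconds -> 0 whole microseconds
print(FracSeconds(250, -9).microseconds)   # 0
# 1500 nanoseconds -> 1 whole microsecond
print(FracSeconds(1500, -9).microseconds)  # 1
```

Note the common millisecond/microsecond cases stay within machine ints here, which is the proposal's stated advantage over a single always-bigint `attosecond` field.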