On Thursday, 16 May 2019 at 20:17:37 UTC, Steven Schveighoffer wrote:
> We do have nanosecond resolution; it's just rounded down to the
> nearest 10. For example:
>
>     import core.time;
>
>     auto d = 15.nsecs;     // stored as 1 hnsec, i.e. 10 ns
>     assert(d == 10.nsecs); // the 5 ns remainder was discarded
I'm not sure how to feel about this. Maybe there was a better way
to handle nanoseconds here.
> You shouldn't be relying on what a string says to know what the
> tick resolution is.
What I don't like about your proposal is that it seems to add data
that isn't there. The 0 is entirely fictional; it might as well be
part of the format string.
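
To make the disagreement concrete, a minimal sketch of the current
behaviour (the printed string is my reading of core.time's documented
toString format, so treat it as approximate):

    import core.time;
    import std.stdio : writeln;

    void main()
    {
        auto d = 15.nsecs;

        // toString reports the stored ticks; per the docs this
        // prints "1 hnsec".
        writeln(d);

        // The proposal would print nanoseconds instead ("10 ns"),
        // where the trailing 0 carries no information. The exact
        // tick count is already available without string parsing:
        writeln(d.total!"hnsecs"); // 1
        writeln(d.total!"nsecs");  // 10
    }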
> For example, if I do writefln("%f", 1.0), I get 1.000000.
%f is a C-ism; %s does not do that.
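
A quick way to see the difference (assuming a reasonably recent
compiler; %s prints the shortest representation of the value):

    import std.stdio : writefln;

    void main()
    {
        writefln("%f", 1.0); // 1.000000 -- printf-style fixed precision
        writefln("%s", 1.0); // 1        -- no invented digits
    }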
> hnsecs is more confusing than nanoseconds. People know what a
> nanosecond is; a hecto-nano-second is not as familiar a term.
Agreed, which is why Duration.toString shouldn't be used to present
durations to users. Developers, however, are expected to know what a
hectonanosecond is, just as they're expected to know the other
technical terms.
> If the output is meant for the user, then hectonanoseconds or
> nanoseconds are almost always going to be irrelevant. The
> duration should be formatted appropriately for the use case.
That depends on the user and the application. If the durations are
small enough, or measured precisely enough, that displaying them at
that precision makes sense, then yes, an application would do better
to show nanoseconds rather than hectonanoseconds.
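
As an illustration of "formatted appropriately for the use case", a
minimal sketch using core.time's Duration.split (the helper name
formatForUser and the "1m 30s" output format are my own choices, not
anything proposed in this thread):

    import core.time : Duration, msecs;
    import std.format : format;

    // Hypothetical helper: render a Duration for end users,
    // dropping sub-second precision entirely.
    string formatForUser(Duration d)
    {
        auto parts = d.split!("minutes", "seconds");
        return format("%dm %02ds", parts.minutes, parts.seconds);
    }

    void main()
    {
        assert(formatForUser(90_500.msecs) == "1m 30s");
    }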