On 16.05.19 17:55, Vladimir Panteleev wrote:
> On Thursday, 16 May 2019 at 15:52:05 UTC, Steven Schveighoffer wrote:
>> [...]
>> The output shouldn't involve the inner workings of the type. It should
>> be changed to say 10 ns.
>
> If the output is meant for the developer, then I disagree subjectively,
> as that creates the impression that the lowest resolution or
> representable unit of time is the nanosecond.
>
> If the output is meant for the user, then hectonanoseconds or
> nanoseconds are going to be almost always irrelevant. The duration
> should be formatted appropriately to the use case.
I'd suggest "17 ms, and 553.1 µs" for a better default (1 hnsec is 0.1 µs,
right?). No weird "hnsecs", no false precision, and still all the data
that is there.
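
For illustration, here's a rough sketch of what such a default could look
like, built on Duration.split. prettyDuration is just a made-up name for
this post, not anything in druntime:

import core.time : Duration, dur;
import std.format : format;
import std.stdio : writeln;

// Hypothetical helper, not part of druntime/Phobos: the leftover
// hectonanoseconds become the tenths digit of the microsecond part,
// since 1 hnsec == 0.1 µs.
string prettyDuration(Duration d)
{
    auto parts = d.split!("msecs", "usecs", "hnsecs")();
    return format("%s ms, and %s.%s µs",
        parts.msecs, parts.usecs, parts.hnsecs);
}

void main()
{
    auto d = dur!"msecs"(17) + dur!"usecs"(553) + dur!"hnsecs"(1);
    writeln(prettyDuration(d)); // prints: 17 ms, and 553.1 µs
}

A real default would of course also pick the leading unit dynamically
(seconds, minutes, ...) instead of hard-coding milliseconds; this only
shows the hnsecs-as-tenths-of-a-µs part.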