On 5/16/19 4:55 PM, Vladimir Panteleev wrote:
> On Thursday, 16 May 2019 at 15:52:05 UTC, Steven Schveighoffer wrote:
>> Hecto-nano-second (100 ns), the smallest representable unit of time in SysTime and Duration.

>> The output shouldn't involve the inner workings of the type. It should be changed to say 100 ns.

> If the output is meant for the developer, then I disagree subjectively, as that creates the impression that the lowest resolution or representable unit of time is the nanosecond.

It is what it is. The reason hnsecs is used instead of nsecs is that a signed 64-bit count of 100 ns ticks covers roughly ±29,000 years, whereas a count of nanoseconds covers only about ±292 years.
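
Back-of-the-envelope, since the tick count is a signed 64-bit long (just a quick sketch to check the arithmetic; the year length is approximate):

    import std.stdio;

    void main()
    {
        // long.max ticks at each resolution, converted to years.
        enum double secsPerYear = 365.25 * 24 * 3600;
        writefln("100 ns ticks (hnsecs): about %.0f years each way", long.max * 1e-7 / secsPerYear);
        writefln("  1 ns ticks (nsecs):  about %.0f years each way", long.max * 1e-9 / secsPerYear);
    }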

We do accept durations specified in nanoseconds; they're just rounded down to the nearest 100 ns tick.

For example:

    import core.time : nsecs;

    auto d = 150.nsecs;      // rounded down to 1 hnsec (100 ns)
    assert(d == 100.nsecs);

You shouldn't be relying on what a string says to know what the tick resolution is.

For example, if I do writefln("%f", 1.0), I get 1.000000. That doesn't mean I should assume floating point precision only goes down to 1/1_000_000.
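
Same idea with Duration: the string reflects the storage unit, not a promise about precision, and you can always query the value in whatever unit you need (a small sketch; the comments note the expected output):

    import std.stdio;
    import core.time;

    void main()
    {
        writefln("%f", 1.0);       // 1.000000 -- a formatting choice, not the precision
        auto d = 150.nsecs;        // stored as 1 hnsec (100 ns)
        writeln(d);                // toString prints it in hnsecs
        writeln(d.total!"nsecs");  // 100 -- query the unit you actually want
    }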

hnsecs is more confusing than nanoseconds. People know what a nanosecond is; a hecto-nano-second is not as familiar a term.

> If the output is meant for the user, then hectonanoseconds or nanoseconds are almost always going to be irrelevant. The duration should be formatted appropriately for the use case.

Depends on the user and the application.
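
For the user-facing case, core.time's Duration.split already lets you pull out whichever units fit the context (a rough sketch):

    import std.stdio;
    import core.time;

    void main()
    {
        auto d = 90.seconds + 1500.msecs;
        // Split into user-facing units instead of printing raw ticks.
        auto parts = d.split!("minutes", "seconds", "msecs");
        writefln("%s min %s s %s ms", parts.minutes, parts.seconds, parts.msecs);
    }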

-Steve
