On Monday, 5 June 2017 at 15:37:42 UTC, Steven Schveighoffer wrote:
It appears that the precision parameter in std.format differs from its meaning in printf. Is that expected behavior?

Example:

import std.stdio;
import core.stdc.stdio;

void main()
{
    auto f = 20.66666;
    writeln(f);
    writefln("%0.3s", f);
    printf("%0.3f\n", f);
}

prints:
20.6667
20.7
20.667

It appears that the precision specifier dictates the total number of significant digits on *both sides* of the decimal point, whereas in C it's only the number of digits *after* the decimal point.

I'm trying to specify 3 places of precision after the decimal. How do I do this easily?

I'm having a hard time believing this behavior has never been reported, but I can't find anything about it in bugzilla. Tested all the way back to 2.040.

-Steve

You do realize that you have used "s" in the D version?
This works as expected:

writefln("%0.3f", f); // 20.667
printf("%0.3f\n", f); // 20.667

This is a bit more interesting:

writefln("%0.3s", f); // 20.7
printf("%0.3s\n", f); // 20.
