Public bug reported:

On Ubuntu 20.04.1 LTS with glibc 2.31:
gcvt() outputs no more than 17 significant digits, regardless of the precision requested.
qgcvt() outputs no more than 21 significant digits, regardless of the precision requested.

Here is a demo:

/************************************************/
#define _GNU_SOURCE   /* needed for the qgcvt() declaration in <stdlib.h> */
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    char ebuf[80];

    /* request 55 significant digits of the double 0.1 */
    gcvt(0.1, 55, ebuf);
    printf("%s\n", ebuf);

    /* request 67 significant digits of the long double 0.1 */
    qgcvt(0.1L, 67, ebuf);
    printf("%s\n", ebuf);

    return 0;
}
/************************************************/

I got:
0.10000000000000001
0.100000000000000000001

I expected:
0.1000000000000000055511151231257827021181583404541015625
0.1000000000000000000013552527156068805425093160010874271392822265625

The "expected" values are exact base 10 representations of the values
contained in the double 0.1, and in the (80-bit extended precision) long
double 0.1.
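
For comparison, glibc's printf() will produce the exact decimal
expansion when given a large enough precision. A minimal sketch (the
precisions 55 and 67 simply match the number of fractional digits in
the exact expansions above):

/************************************************/
#include <stdio.h>

int main(void) {
    /* exact decimal value stored in the double 0.1 */
    printf("%.55f\n", 0.1);

    /* exact decimal value stored in the long double 0.1 */
    printf("%.67Lf\n", 0.1L);

    return 0;
}
/************************************************/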

The same problem existed on Ubuntu 18.04; I expect it is a longstanding
issue.

Cheers,
Rob

** Affects: glibc (Ubuntu)
     Importance: Undecided
         Status: New

-- 
https://bugs.launchpad.net/bugs/1899553

Title:
  gcvt and qgcvt do not always provide requested precision

