I wonder what was the historically first programming environment with native
binary floating point that was proven/demonstrated to handle f.p.
binary<->decimal I/O conversions 100% correctly?
By 100% correctly I mean that, given an output format with enough significant
digits, every valid binary floating point representation could be converted to
a unique text string, and that string could be converted back to exactly the
same binary representation.

Of course, there is enquire.c, which helped find such bugs in the
Unix/POSIX environments, but did anyone care about this during the
mainframe era?

(I've been playing with the BESM-6 floating point conversions, and the
results are shameful.)

Simh mailing list
