li...@openmailbox.org wrote on 29-2-2016 at 07:21:
On Sat, 27 Feb 2016 17:32:19 -0500
Paul Koning <paulkon...@comcast.net> wrote:
Decimal did show up at times even into the 1960s, for example in the IBM
1620. But it never made all that much sense; converting between binary
and decimal is quite easy even in those very old machines. The one
plausible application area is business data processing where the
arithmetic is trivial and most of the work is I/O or other non-arithmetic
operations.
IBM S/360 (1964) and its follow-ons have all had hardware support for decimal
arithmetic, and COBOL and PL/I on those platforms have always had native
support for the data type.
As you might expect, decimal arithmetic is used extensively in financial
transactions and reporting, since it avoids conversion error: money amounts
can be represented exactly, rather than approximately as with binary floating
point. Most banks still run their financial transactions on IBM hardware and
operating systems for that reason, among others.
This binary/decimal discussion stirred up some memories...
By the time Donald Knuth wrote the first volumes of "The Art of Computer
Programming" (1968-1970), the binary/decimal discussion was sufficiently
alive that he designed his virtual MIX computer (presumably in the years
before publication) explicitly to be "binary/decimal agnostic". This
allowed implementations/simulations on both binary and decimal
platforms.
/Wilm
_______________________________________________
Simh mailing list
Simh@trailing-edge.com
http://mailman.trailing-edge.com/mailman/listinfo/simh