Keith:  No, COBOL is just another computer language.  Early on, most
computer users were business users, and decimal I/O is standard for all
internationally accepted currencies.  Some early computers, like the
VAX (remember the 11/780??? The CDC 3100 series?) actually had decimal
arithmetic hardware, or appeared to have it if you looked at the instruction
set, which was there to support COBOL compilers.  Most of them may have
been fixed point machines that multiplied and divided by 100 on I/O or, if
they had decimal registers or storage, when the hardware read or wrote these
registers or storage.  In the gizzard, they were all just computers, and
some of them were limited to fixed point for decimal arithmetic; I suppose
that there are examples that actually implemented decimal arithmetic like an
old mechanical adding machine or business calculator.  This is so far back
and removed from my personal experiences that I don't know what they did to
implement interest rate computations such as P*(1+i/12)^(12*N) for computing
the future value of a principal amount P at a time N years in the future
with an annual interest rate i compounded monthly, but clearly the simplest
thing to do is to put everything in IEEE floating point, compute the result,
and convert back to decimal, and that's what everyone does, whether the user
knows it or not; I don't know COBOL but I suspect that all this is concealed
in the code that supports the COBOL keywords or intrinsic functions - which
is what I was suggesting to the people who asked.

COBOL was one of the first short-learning-curve HOLs for non-technical
people.  It was designed for financial applications.  It's an acronym for
COmmon Business Oriented Language.  It's still a living language; COBOL 2002
incorporates some applicable modern computer science features like
object-oriented programming.  The international standard for COBOL is
ISO/IEC 1989 (the number is a serial identification number, not a year or
date).  Another full revision, including a set of object-oriented collection
class libraries, is imminent.  The Wikipedia page on COBOL makes interesting
reading:
        http://en.wikipedia.org/wiki/COBOL
Note that representation of numbers may be IEEE floating point or "packed
decimal." Note the External Links for further reading.

I have run into people on bulletin boards that believe that COBOL is the
principal language out there, and that C and Fortran and such are dead and
most new code is in COBOL.  I've seen people talk about C that way since
1972, and I've talked to Brian Kernighan as late as the 1990's and he
believes that C/C++ is the only language necessary and all other languages
should be put behind us.  I know for a fact that much of academia has been
teaching C with the "Highlander" there-can-be-only-one background for two
generations now, but there are too many people who know more than one
language and quietly do what they think is best at any given time, and the
number of C programmers finally began to decline in the 1990's, partly
because its successor, C++, had a poor learning curve, no common extant
self-teaching aid, and proved difficult and expensive to maintain (according
to a late 1990's DoD study, C++ was the most expensive language to maintain,
Ada was the least expensive), all issues that do not apply to ANSI C, K&R C,
or C99 and its successors.

I've encountered hard statements like "Fortran doesn't use the stack" and
"You can't make system calls from Fortran" and such since the 1970's, and
encountered far worse early in the game such as "That's FORTRAN for you.
You should just junk the program and start over in C" when I reported a bug
in console I/O.  When I talk about Fortran 90 and later, I've had senior
people that I respect in academia seek assurance from me that there was no
revision after Fortran 90 and that Fortran is now dead.  And, at the IEEE
FFT 2010 conference at U. Maryland College Park, I had one senior fellow go
breathless when I mentioned that Fortran 2003 incorporated multiprocessor
and distributed computing in the standard as part of the back end in the
language description, and that there was competition in the Fortran
community between OpenMP, co-arrays, and vendor-designed multiprocessor
support with few or no added keywords or intrinsics for the user to deal
with; he then, noting my accent, asked where I was born, then attacked me as
a racist because I was born in Texas (I gave him a polite history lesson and
changed the subject).

Myself, I have had to be able to read and modify programs in just about any
popular language out there because of my work over the years, and it has
helped me to break down defenses of leave-me-alone-I'm-the-wizard-here
deadwood from obstruction or delay of several projects.  These are project
management skills, not programming skills.  I'm proficient at C, near expert
at Fortran 95/03, good at Pascal, qbasic and several assembly languages
(Z80, 32-bit Intel, 68000), and passable at JOVIAL, Ada, and C++ - and an author of
RATFOR distributions for the TRS-80 and IBM PC for DOS and OS/2.  If
necessary, I will learn to read a new language passably in an hour or two
and be able to modify it, with a decent reference at my elbow, in a day -
I've done that from time to time with APL, LISP, and other self-obfuscating
languages.

Please don't take this post as a language argument; I've seen them all.  If
one language would work well for all of us, we would all have been speaking
Esperanto since 1887 or so.

But, hey, if decimal arithmetic gets any market share at all in the COBOL
community, then competition is a good thing: maybe COBOL will improve
as a result, C++ decimal classes will improve too, and the users win.  And
a lot of people outside COBOL environments will benefit from decimal
arithmetic classes and built-in support for business calculations like
present value, sinking fund computations, etc.

James K Beard

-----Original Message-----
From: NightStrike [mailto:[email protected]] 
Sent: Saturday, April 09, 2011 2:41 AM
To: [email protected]; [email protected]
Cc: James K Beard; Jim Michaels
Subject: Re: [Mingw-w64-public] mingw-w64 Decimal Floating Point math

On Sun, Apr 3, 2011 at 7:07 AM, James K Beard <[email protected]>
wrote:
> A quick glance through the document seems to tell us that the decimal
> arithmetic will incorporate checks to ensure that any rounding in binary
> floating point does not compromise the accuracy of the final decimal
> result.  That’s pretty much what I was suggesting in my message of March
26
> below.  The JTC1 Committee is apparently considering putting it in the
> standard.  This could be a very good thing for people porting code from
> COBOL, and useful for new applications in environments previously
restricted
> to COBOL such as the banking and accounting industries.

I'm being a little OT here, but I'm curious.. does that mean that
COBOL was a language that gave very high accuracy compared to C of the
day?



_______________________________________________
Mingw-w64-public mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/mingw-w64-public
