Hi all,

Sorry for the extremely late post on this thread, as I just came across
it, but I can never resist a discussion on different languages' plusses
and minuses!

David is quite right: several brokerages used APL quite heavily. They
would hire people out of business school and throw them into analytics
departments. When I started working with these guys I was aghast at the
idea of using interpreted APL for numerically intensive applications but
APL was by far the fastest way for them to express or revise a numerical
model. Some of them were strong programmers, others not: in one of the
FORTRAN applications I had the dickens of a time explaining why x**2.0
was slower than x**2 (the former requires two function evaluations, the
latter a multiply). The dialect used was APL2, which permitted things
like nested and heterogeneous arrays (e.g., an element of an array could
be another array, or of a simple type different from other elements).
Very powerful, but never my cup of programming tea.  The worst thing
from the VM systems point of view was fitting APL's saved segments
together with the programs needed along with it (GDDM, SQL/DS). Maybe
now everything can reside above the 16MB line, but it was a real PITA
then.
I imagine some of this code is still in use, but don't really know.
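The x**2.0 vs x**2 distinction can be sketched in a few lines (Python
standing in for FORTRAN here; the function names are mine, and the
exp/log formulation is the classic way a general power routine is
implemented when the exponent is a float rather than a known small
integer):

```python
import math

def pow_real_exponent(x):
    # What x**2.0 turns into: a general power routine,
    # typically exp(y * log(x)) -- two transcendental calls.
    return math.exp(2.0 * math.log(x))

def pow_int_exponent(x):
    # What x**2 turns into: a single multiply.
    return x * x

x = 3.0
# Both agree to within rounding error...
print(pow_real_exponent(x))  # ~9.0 (via exp/log)
print(pow_int_exponent(x))   # 9.0 (via multiply)
# ...but the second form costs one multiply instead of a log and an exp.
```

Modern compilers often recognize a literal 2.0 exponent and emit the
multiply anyway, but at the time the two spellings generated genuinely
different code.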

PL/I, on the other hand, I really enjoyed and used very heavily.  A very
ambitious language for the times: it had "real" strings, unlike C,
bounds checking, multitasking, asynchronous and synchronous I/O, an
early form of generic procedures, etc. It also had both COBOL and
FORTRAN numeric models (and bits of their syntax), and both record and 3
different kinds of stream I/O (put/get edit, data, and list). I think
one of the design points was "if there are two ways of doing something,
do them both and make both sides happy" - so things got too big.

Several people have referred to some "surprising" aspects of PL/I's
rules for arithmetic. The canonical case I remember was something like
'x = 25 + 1/3'. Never mind what 'x' is: the problem is that the literal
constants in the example are implicitly FIXED DECIMAL(2,0) and FIXED
DECIMAL(1,0) (i.e., 2 and 1 digits, none behind the decimal point). The
quotient 1/3 gets the maximum precision with fourteen fractional
digits, leaving only one integer digit, so adding 25 to it blows up
with a fixed overflow. The failure is not silent, and can be trapped
via an ON FIXEDOVERFLOW block, but it certainly violates the
rule Gabe Goldberg taught me about the principle of least astonishment!
The answer was: don't do arithmetic with literals, declare all your
variables to cover the range of possible values, and declare all your
integers FIXED BINARY(31). Other than that, one of the best languages
I've ever programmed in.
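The precision bookkeeping behind that surprise can be sketched as
follows (a Python sketch of PL/I's FIXED DECIMAL result-precision
rules as I recall them - check a PL/I language reference before
relying on the exact formulas; N = 15 is the maximum decimal
precision, and (p, q) means p total digits with q after the point):

```python
N = 15  # maximum FIXED DECIMAL precision

def div_precision(p1, q1, p2, q2):
    # Quotient of (p1,q1) / (p2,q2): the result takes the maximum
    # precision, and every digit not needed for the integer part
    # goes to the fractional part.
    return N, N - p1 + q1 - q2

def add_precision(p1, q1, p2, q2):
    # Sum of (p1,q1) + (p2,q2): keep the larger fractional part,
    # and one extra integer digit for a possible carry, capped at N.
    q = max(q1, q2)
    p = min(N, max(p1 - q1, p2 - q2) + q + 1)
    return p, q

# 1/3: both literals are FIXED DECIMAL(1,0)
p_div, q_div = div_precision(1, 0, 1, 0)
print("1/3 has precision", (p_div, q_div))       # (15, 14)

# 25 + (1/3): 25 is FIXED DECIMAL(2,0)
p, q = add_precision(2, 0, p_div, q_div)
print("25 + 1/3 has precision", (p, q))          # (15, 14)
print("integer digits available:", p - q)        # 1 -- but 25 needs 2
```

One integer digit cannot hold 25, hence the FIXEDOVERFLOW. Declaring
the operands yourself, as recommended above, sidesteps the whole
literal-precision dance.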

Oh, and I stayed at the Bates Inn too. Four of us car-vanned from NY/NJ
to Arkansas via Virginia.  I have to say we gave the locals as good as
we got: remember the Hawaiian shirt contest?

cheers, Jeff


David Boyes said:
> > APL is still heavily used by insurance companies to calculate their
> > non-standard insurances for large companies, where the normal routines
> > won't work due to special conditions etc.
>
> Also in a lot of the financial services companies.  I remember a
> presentation from Jeff Savit while he was still at Merrill Lynch about
> supporting some of their real-time brokerage apps which were completely
> done in VS APL on VM. A few years back, but I remember being really
> impressed that something that complicated could actually be written in
> APL. I'm not sure how much of that code survives, but if you wanted to
> give a broker the ability to do some really powerful math on returns or
> such in very few keystrokes, APL would be exactly the right tool to do
> it.
>
> > APL development is very fast compared to other (compiled) languages.
> > I don't do it myself, but I am told so by lots of colleagues.
>
> For what it was designed to do, APL is very, very powerful (eats big
> numeric problems for breakfast, and it's unbelievably concise). Its
> biggest flaws (and IMHO, the things that killed it) were the requirement
> for custom symbol sets on displays and the inability to discuss
> programming in it without having a standard method to note the symbols
> in environments without the special symbol sets. It'd be less of a
> problem today with the prevalence of pixel-addressable displays, but at
> that time, Mathematica and Macsyma were a lot easier to use and
> implement, and didn't require special (and expensive) terminals.

----------------------------------------------------------------------
For LINUX-390 subscribe / signoff / archive access instructions,
send email to [EMAIL PROTECTED] with the message: INFO LINUX-390 or visit
http://www.marist.edu/htbin/wlvindex?LINUX-390