Neville.

I am in total agreement with your points that J is easier to teach
because of scripts instead of workspaces, and because of the elegance of
tacit representation. The development environment and syntax of J are
significant improvements over APL.

However, the fact that J's ASCII representation made teaching easier was, I
believe, due more to the difficulty of supporting the APL character set
for input and output on computing devices than to any inherent problem
with the single-symbol representation. There is no problem teaching and
writing APL on a blackboard, or reading it from printed material. The
problem comes with symbol entry and output on computing devices. APL was
designed originally as a *notation*, not a programming language, and the
APL symbols fit that usage perfectly.

When APL was being developed into a programming language, ASCII keyboards
already were supporting two sets of symbols - the upper & lower case
alphabet. To enable the input of APL symbols, the APL developers had to
overload the ASCII keys with yet another set of symbols, which needed to be
included on the key labels (else one had to memorize the key positions of
40-some symbols). Then the developers needed to implement yet another
keyboard mode to select the APL symbols. This is not to mention the
requirement to include backspace/overstrikes to produce certain additional
APL symbols which were not represented on any of the key labels.

Add to all this the requirement to change the type balls on your
acoustically-connected IBM Selectric, or require a special
character-generation ROM in your CRT circuitry to see the symbol output,
and it isn't hard to see why using the pure ASCII character set for symbols
looked so attractive, in the 60s and 70s.

It wasn't until the introduction of the Macintosh, with its graphic
display and software fonts in the mid-1980s, that a practical solution to the APL
character display problem was at hand. Unfortunately, the Mac's
revolutionary mouse wasn't much help in *inputting* the APL symbol set,
thus the rationale for an ASCII-based version of APL was still valid
through the end of the millennium.

As far as the visibility of the period, colon, semi-colon, and comma is
concerned, making the dots bigger would certainly mitigate the visibility
problem. However, your point that the decimal point is used extensively
today to indicate the fractional part of a number doesn't mean that the
decimal point isn't problematic. There is a reason that the Europeans use
the dot to separate thousands (less critical), and the more visible comma
to indicate the fractional part of a number.

I suspect that the *gap* in a regular string of digits caused by the
insertion of the decimal dot has as much to do with the recognition of the
dot's presence as the actual dot does. Unfortunately, using dots as symbol
modifiers doesn't provide the same break-in-a-regular-sequence clue to
their presence that the dot in a string of numbers does. In addition,
hand-writing bigger dots on a blackboard or paper requires extra effort,
which could be eliminated by writing a single symbol. Generally, I just
don't think that dots are appropriate for carrying information, except as a
separator or ending, such as the period at the end of a sentence. Using them
as modifiers can be dangerous. Not to a machine necessarily, but to the
human reader.

My focus on hand-written notation may be old-fashioned. More and more
communication today is being performed from the keyboard of a computing
device, and less and less via hand-written documents. Perhaps the concept
of a single-glyph symbolic language, which provides the most benefit to
notation and handwriting, is no longer a significant requirement. I know I
find myself handwriting notes less and less, and I haven't handwritten a
letter in years. Perhaps handwriting itself will eventually become
obsolete...?

Like you, I miss the saltire and obelus for multiply and divide. But I also
miss the mathematical not-equals symbol (an equals sign with a slanted bar
through it), which I feel is a much better foil to the equal symbol (=) than
the nub sieve or tilde-colon (~:), which has much less of a graphical
connection with =. And it's more difficult to see (at least for me).
Generally, if a traditional mathematical symbol fits the functionality of a
language primitive, we should try to use that symbol, rather than inventing
a new one. This seems to be what Ken tried to do in the original APL
character set. Of course, Ken broke that rule with summation, which uses +/
instead of sigma, since the insert notation provides much more flexibility
than the traditional sigma symbol for summation.
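Indeed, insert (/) turns any dyadic verb into a reduction, so the one
adverb covers not just summation but products, maxima, and so on. For
instance, in a J session:

```j
   +/ 1 2 3 4        NB. sum: insert + between the items
10
   */ 1 2 3 4        NB. the same adverb gives the product
24
   >./ 3 1 4 1 5     NB. ...or the maximum
5
```

A sigma-style symbol would have needed a separate convention for each of
these; insert gets them all from one adverb.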

Then there is the left arrow, which I feel is a much better symbol for
assignment than the copula, which to me tends to get confused with some
form of equality. Global assignment could be a double-shaft arrow (<= but
in a single symbol). Having all four arrow symbols (left, right, up, down)
would be a useful addition to the symbol set, as *take* and *drop* verbs
are more graphically suggestive with up and down arrows than braces. The
right-arrow go-to, while denigrated by many programmers, is still a useful
operation. I really love the APL symbols for rotate and transpose, as
those symbols are hard to beat for giving graphical clues to their
functionality.
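For comparison, today's ASCII J spells those four verbs as {. (take),
}. (drop), |. (rotate), and |: (transpose), where APL had its arrows and
circles. A quick session:

```j
   2 {. 1 2 3 4 5    NB. take (APL's up arrow)
1 2
   2 }. 1 2 3 4 5    NB. drop (APL's down arrow)
3 4 5
   1 |. 1 2 3 4 5    NB. rotate (APL's circle-stile)
2 3 4 5 1
   |: i. 2 3         NB. transpose (APL's circle-backslash)
0 3
1 4
2 5
```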

In any case, the real problem with any symbolic language which doesn't use
a fairly limited set of symbols like the English alphabet, is the entry of
those symbols into a computing device. For that matter, given the
ubiquitous nature of the ASCII keyboard, *any* language that doesn't use
something close to the English alphabet is at a huge disadvantage. Just
look at the hoops that a Mandarin-speaking native has to jump through to
type in their language. Or the problems with Farsi character entry, which
has just 32 basic symbols, but each symbol's appearance is somewhat
different depending on its position in a word.

The good news is that technology now provides us the tools to resolve the
symbol entry issue. The combination of a touchscreen and handwriting
recognition software allows the entry of any arbitrary set of predefined
symbols into a computing device. By simply drawing the symbol on the
touchscreen with a finger or special pen, one can enter a full set of
symbols. A combination of a touchscreen for symbols and a soft ASCII
keyboard for alphanumerics might be the perfect next-generation calculator
(and J machine).

The ubiquity of the built-in microphones in most modern computing devices,
along with the latest speech recognition software, will allow a user to
simply speak symbol names to enter the symbols in the edit window.

So perhaps the best argument for using single-symbols in a programming
language is simply the visual cues that a well-designed symbol provides to
the user. The better the visual hint that a symbol gives of its function to
the user, the better the symbol. Purposely-designed symbols have a much
better chance of communicating a cogent hint about each specific
primitive's function than a re-purposed ASCII character or characters. That
is, unless you want to spell out the primitive's complete name in ASCII.
Actually, that was done some years ago, whenever you couldn't print the APL
character set. It didn't go over very well.

I still believe that the ability to be used as a general-purpose
mathematical notation which can be easily hand-written is a big plus.

Skip




On Sun, Apr 7, 2013 at 10:17 PM, neville holmes <[email protected]> wrote:

> I have been following the Symbols thread with somewhat
> mixed feelings.
>
> My first acquaintance with APL was in the mid '60s
> when the original box notation was used with spectacular
> success to formally describe the System/360 (see the IBM
> Systems Journal, Vol.3, No.3, 1964).
>
> My first use of APL was with the IBM 2741 terminal run
> from a 360/67 in the early '70s.  The golf-ball print
> head on the 2741 made use of the APL symbols easy, but
> when the 2260 screen terminal came in, the IBM developers
> at Kingston needed a great amount of pressure before they
> would support APL.
>
> When I retired from IBM and went to Tasmania to teach,
> I reluctantly switched to J only because I couldn't get
> the APL interpreter to work on the PCs available to me.
> However I soon realised that J was easier to get over
> to students, partly because of the ASCII character set
> usage, partly because of using scripts instead of
> workspaces, partly because of the possibilities of
> tacit encoding.
>
> This background perhaps explains why the elaborate
> changes being discussed seem to me to be unwise if there
> is any serious hope of (a) keeping existing users all
> happy, and (b) making it easier to bring in new users.
> Also, my 30 years experience as a systems engineer
> taught me that success comes with improvements that
> are incrementally and compatibly introduced.  This
> is a general observation that politicians and bureaucrats
> choose to ignore.
>
> All that said, one improvement I would very much like to
> see in J is the introduction of the saltire and obelus
> (how do I get these symbols into plain text here?) as
> alternatives to the * and % symbols.  That J was forced
> into using * and % instead of the traditional symbols is
> a condemnation of the people who left them out of ASCII.
> Disgraceful !!!
>
> Note that compatibility would be preserved by retaining
> the * and % symbols, but perhaps automatically replacing
> them in displayed J expressions.
>
> This small modification would make it much easier to get
> schoolchildren and other ordinary potential users into
> using J for everyday calculations.
>
> Oh, and someone complained that using the . and : to
> expand the primitive symbol set was bad because they
> are too small to see easily.  Alright then, why not
> simply display them as larger dots when used in primitive
> function symbols?   Mind you, the same problem arises
> with the use of . as a decimal point.
>
>
> Neville Holmes
>
> ----------------------------------------------------------------------
> For information about J forums see http://www.jsoftware.com/forums.htm
>



-- 
Skip Cave
Cave Consulting LLC
Phone: 214-460-4861