"[EMAIL PROTECTED]" <[EMAIL PROTECTED]> wrote:
> I know that Ken thought that it was a mistake to use the Greek characters in
> APL. I think it was a mistake to think it was a mistake. But there will of
> course always be those who think it is a mistake to think it was a mistake
> to think it was a mistake...
>
> I miss those Greek characters because they are just as easy to remember as
> road signs. They don't have to be decoded or parsed. That's a good thing
> with road signs.

I also think that the APL character set looks "cool". However, if you use any
letters in any alphabet that already have pre-defined meanings, you are bound
to run into ambiguities if both of those meanings can occur simultaneously.
For example, APL's alpha, omega, iota, epsilon, and rho correspond
(approximately) to J's x., y., i., e., and $ - not to mention capital
Delta - but those Greek letters already have pre-defined meanings: they
are legitimate letters of the Greek alphabet.
This would not normally pose a problem - but if the programmer is Greek and
wants to use Greek letters in his variable names, it can become a great
problem. What would you think of a programming language that allows you
to program in many different languages - Korean, Sanskrit, even Tolkien's
runes - but NOT in Greek?

One way around this is that Unicode has special code points for many
(but not all) of the APL symbols; unfortunately, nothing says that these
symbols must look any different from ordinary Greek letters, so if you
mixed them together, it would be very difficult to visually distinguish
primitives from parts of an identifier. (This can already be a problem
with I, l, and 1 in some fonts, and can be even worse if someone mixes
Latin A, Greek Alpha, and Cyrillic A, all of which look identical but
have different code points. Then again, doing something like that is
very bad coding practice, just ASKING for trouble; but a Greek
programmer using real Greek words in his identifiers is something one
should naturally expect to be legitimate.)
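As a small sketch of that code-point distinction (Python is my choice
here, not anything from the original discussion; the character names
come straight from the Unicode character database):

```python
import unicodedata

# Glyphs that look alike (or nearly so) but occupy distinct code points.
# The APL symbols live in the "APL Functional Symbols" range, separate
# from the Greek alphabet block.
lookalikes = [
    "\u03b9",  # GREEK SMALL LETTER IOTA
    "\u2373",  # APL FUNCTIONAL SYMBOL IOTA
    "\u03c1",  # GREEK SMALL LETTER RHO
    "\u2374",  # APL FUNCTIONAL SYMBOL RHO
    "\u0041",  # LATIN CAPITAL LETTER A
    "\u0391",  # GREEK CAPITAL LETTER ALPHA
    "\u0410",  # CYRILLIC CAPITAL LETTER A
]
for ch in lookalikes:
    print(f"U+{ord(ch):04X}  {ch}  {unicodedata.name(ch)}")
```

Nothing in the standard requires a font to render the APL iota and the
Greek iota differently, which is exactly the visual-discernment problem
described above.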

There is one other problem that happens if you use APL characters for
J primitives. A small number of APL symbols that look like ASCII characters
have been re-assigned, and different symbols mapped onto those ASCII
characters. For example:
  APL x = J * = multiply
  APL * = J ^ = power
  APL ^ = J *. = and
If you allow both APL and J symbols to coexist, what should * and ^ mean?
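To make the ambiguity concrete, here is a toy sketch (in Python, not J;
the function eval_star is made up for illustration) of how the very same
source text "2*3" means different things under each reading of *:

```python
# Hypothetical evaluator for a single dyad: what does "*" mean?
# In APL, * is the power function; in J, * is multiplication.
def eval_star(x, y, dialect):
    if dialect == "APL":
        return x ** y  # APL: * is power
    elif dialect == "J":
        return x * y   # J: * is multiply
    raise ValueError("unknown dialect")

print(eval_star(2, 3, "APL"))  # 8
print(eval_star(2, 3, "J"))    # 6
```

A hybrid character set would have to pick one meaning per glyph, which
is precisely the clash the table above illustrates.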

-- Mark D. Niemiec <[EMAIL PROTECTED]>

----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm
