After thinking more about the issue of multiple glyphs vs single glyphs, I
realized that the underlying issue is that we are trying to compare three
very different types of languages here - natural languages, programming
languages, and mathematical languages. Each of these language types has
its own, often very different, requirements.

Natural languages must support thousands, if not millions, of concepts.
Natural languages are constantly mutating, with new concepts and words
being invented and discarded almost daily. The invention of alphabets,
which could be used to write natural languages, was a huge step towards
simplifying the teaching and implementation of those languages, when
compared to logosyllabic writing systems such as the one used for Mandarin.

Ambiguity is rampant in natural languages. Certain words gain multiple
context-dependent meanings over time, and other words change meanings
dramatically, or simply disappear. Our brains have learned to cope with
this situation, as it is important for a natural language to be able to
evolve with the environment and experiences of its users.

Natural languages have to deal with all kinds of aberrations in the process
of communicating information. Spoken language must overcome individual
speech accents and styles, ambient noise, the receiver's auditory acuity,
etc. Written communications must deal with handwriting styles, various
writing surfaces and instruments, the reader's visual acuity, etc. Natural
languages have evolved to help overcome these limitations.

On the other hand, programming languages have very different, almost
opposite, requirements. Changes to a programming language must be carefully
vetted, typically by some central authority, in order to verify that the
changes won't affect current operations, or at least will have minimal
impact. Otherwise, programs that we spent so much effort writing a few
years ago may suddenly (shudder) STOP WORKING. This invariably causes
various levels of panic among the users of the program, who demand an
immediate fix by the authors (or their descendants), followed by a frantic
debugging session by same.

This fact tends to make the mutation of any programming language a very
slow and deliberate process. Due to the problems with backward
compatibility, if a programming language seems to have too many defects, a
new language is developed rather than trying to fix the old one. It's
easier to develop a new language than to deal with the problems one
encounters when making significant changes to an existing language.

A programming language must also have all vestiges of ambiguity scrubbed
out of its syntax. If there is any ambiguity, the same program may
(double-shudder) ... produce different results on different systems!
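
As a small, concrete sketch of how that ambiguity gets designed out, J (the
language of this forum) simply has no operator-precedence table: every verb
applies to the entire result of the expression to its right, and parentheses
are the only way to change that order. A brief session (the particular
numbers are just illustrative):

   2 * 3 + 4       NB. evaluated right to left, as 2 * (3 + 4)
14
   (2 * 3) + 4     NB. parentheses are the only override
10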

The good news is that, unlike natural languages, most programming languages
have at most a few hundred basic or primitive concepts that define the
language. This small number of primitive concepts, along with the
low-mutation aspect, helps lower the learning curve of most programming
languages when compared to a natural language. Programs are constructed
from that basic set of unambiguous primitive functions, which will
hopefully execute in the same way, everywhere.
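
To make that concrete in J, a handful of primitives compose directly into a
new, unambiguous function; the definition below is just an illustrative
sketch, not anything from the discussion above:

   mean =: +/ % #       NB. a fork: sum divided by tally (item count)
   mean 3 1 4 1 5
2.8
   mean 10 20 30
20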

The whole concept of "programming language" implies putting programs into a
computer, and getting results out of same. Thus, unlike a natural language
which can be spoken or hand-written, a programming language must have an
unambiguous, noise-free way to input and view programs on a computer. Since
the ASCII keyboard and character set have been standard on computers for
many years (at least in the English-speaking world), it makes perfect
sense that programming languages would adopt that scheme for their use.

Mathematics notation has a much longer history than programming languages,
though math notation does have a few characteristics in common with
programming languages. Basic math notation has been relatively stable over
the last few centuries, though new concepts and notations have been
gradually added over that period. The basic set of core mathematical
concepts/symbols numbers in the hundreds, much like that of a programming
language. This allowed most of these basic concepts to be represented by
various single-symbol conventions. Also, like a programming language, more
complex mathematical concepts can be communicated with various arrangements
of those basic mathematical symbols.

However, today's mathematical notation has its problems. Ken Iverson covers
this issue quite eloquently in his 1979 ACM Turing Award presentation,
"Notation as a Tool of Thought". I will simply quote his words here:

<<

"The importance of nomenclature, notation, and language as tools of thought
has long been recognized. In chemistry and in botany, for example, the
establishment of systems of nomenclature by Lavoisier and Linnaeus did much
to stimulate and to channel later investigation.

Concerning language, George Boole in his Laws of Thought asserted “That
language is an instrument of human reason, and not merely a medium for the
expression of thought, is a truth generally admitted.”

Mathematical notation provides perhaps the best-known and best-developed
example of language used consciously as a tool of thought. Recognition of
the important role of notation in mathematics is clear from the quotations
from mathematicians given in Cajori’s A History of Mathematical Notations.
They are well worth reading in full, but the following excerpts suggest the
tone:

"By relieving the brain of all unnecessary work, a good notation sets it
free to concentrate on more advanced problems, and in effect increases the
mental power of the race."
A.N. Whitehead

"The quantity of meaning compressed into small space by algebraic signs, is
another circumstance that facilitates the reasonings we are accustomed to
carry on by their aid."
Charles Babbage

Nevertheless, mathematical notation has serious deficiencies. In
particular, it lacks universality, and must be interpreted differently
according to the topic, according to the author, and even according to the
immediate context. Programming languages, because they were designed for
the purpose of directing computers, offer important advantages as tools of
thought. Not only are they universal (general purpose), but they are also
executable and unambiguous. Executability makes it possible to use
computers to perform extensive experiments on ideas expressed in a
programming language, and the lack of ambiguity makes possible precise
thought experiments. In other respects, however, most programming languages
are decidedly inferior to mathematical notation and are little used as
tools of thought in ways that would be considered significant by, say, an
applied mathematician."

>>

I re-read Ken's Turing Award speech every few years, and each time I
remember my excitement in discovering APL (after several Electrical
Engineering college classes using Fortran and Assembly). Back then, I could
envision a revolution in mathematics from grade school onward, using Ken's
notation instead of traditional mathematics notation. Children would learn
math on the blackboard, and slowly be brought to the realization that the
same math they were struggling with in school could be handed over to a
computer, without having to learn a new language. This would "relieve their
brains of unnecessary work, setting them free to concentrate on more advanced
problems", to paraphrase Whitehead.

The key here is that students shouldn't have to learn a completely
different language, just to get their math homework done on a computer. Or
have to learn a new programming language to solve math problems later in
their career. They should be able to simply utilize the notation that they
learned in their mathematics classes in grade school, to formulate and
evaluate math problems that they encounter throughout their life. To do
this, they need a truly unambiguous, general mathematics notation, and
that is what Ken proposed with APL.
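
As a rough sketch of what that might look like in practice (using J, the
later ASCII spelling of Ken's notation), ordinary school arithmetic runs
essentially as written; the examples here are mine, not Ken's:

   +/ 2 3 5 7 11            NB. sum of a short list
28
   (+/ % #) 85 92 78 90     NB. average of four test scores
86.25
   %: +/ *: 3 4             NB. hypotenuse: square root of the sum of squares
5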

What Ken didn't foresee was the ongoing problem with the APL character set.
When Ken designed APL, he was able to draw on the considerable resources of
IBM to implement his language and character set. Programmers implemented
the executable version of his notation. Mechanical and electrical engineers
designed the Selectric typewriter and its interchangeable typeballs.
Graphic designers were employed to build the typeball fonts and new
keyboard labels (though they probably weren't called that back then). I
hate to think how much money IBM spent, making Ken's dream a reality.

Unfortunately, the programming world mostly ignored the APL environment,
and instead standardized on a keyboard and character set that excluded APL.
That result wasn't all that surprising, since virtually every other
programming language could simply use the standard Qwerty keyboard and
ASCII character set.

As the basic math concepts were discovered over the centuries,
mathematicians developed notation to represent those concepts, usually as
soon as they were conceived. The problem was that the notations they
developed were often ambiguous and context-dependent, which required
additional description for clarification. By the mid-20th century, computers
had been developed which could potentially execute the functions represented
by traditional math notation. Unfortunately, those same ambiguities, along
with the multi-level equations and arcane symbols of traditional notation,
made it poorly suited for input on a computer. Thus programming languages
were born.

That's what happened with APL & J. It was the re-purposing of APL as a
programming language, rather than Iverson's original concept of a
mathematical notation, that drove J's conversion to the ASCII character
set. Most of the underlying concepts of APL remained the same in J; the
notation was changed to overcome the input and display limitations that
were present at the time (a small example of the respelling appears after
the links). However, modern computing devices are rapidly removing those
limitations. Some hints as to what is happening today are given in the
links below:

Your mobile experience is not theirs:
http://tinyurl.com/cgm2f84

MyScript Calculator:
http://tinyurl.com/clj6sk5
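
As a small (and admittedly simplified) illustration of that APL-to-J
conversion, the J session below shows a few ASCII spellings alongside the
single APL glyphs they replaced; the APL results noted in the comments
assume the default index origin of 1:

   i. 5                 NB. replaces APL's iota; in APL, ⍳5 gives 1 2 3 4 5
0 1 2 3 4
   $ 2 3 $ 'abcdef'     NB. replaces APL's ⍴: monadic shape, dyadic reshape
2 3
   % 4                  NB. replaces APL's ÷; monadic use is reciprocal
0.25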

The key dichotomy here is "programming language" versus "mathematical
notation". Students learned math notation in grade school. Then they might
learn a programming language at some later time. But those two languages
were treated as two completely different paradigms, rather than as a
natural progression of gradually increasing notational power. The ideal
programming language should look identical, or at least very similar, to
the notation we learn in school. Then everyone would become a "programmer"
as soon as they wrote their math equations on a computing device.

Skip

-- 
Skip Cave
Cave Consulting LLC
----------------------------------------------------------------------
For information about J forums see http://www.jsoftware.com/forums.htm
