William said:
If we'd used prime and double-prime the problem would, I think, still be
there, even though prime etc. are much bigger punctuation.

Skip replies:
That points up another problem with 2-character primitives. When writing J,
one has to be very careful about placing the periods and colons. If one
leaves just a little too much space between the first character and the dot
or colon, the connection is visually lost, and the period or colon often
looks attached to the subsequent characters. Even when the resulting
text isn't syntactically correct, the cognitive load needed to re-connect
the correct symbols and fix the syntax shouldn't be necessary. Dot, colon,
or double-prime modifiers all have the same problem: they aren't
connected to the initial glyph, and it's awfully easy to leave a bit too
much space in there.
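
To make the spacing point concrete, here is a small J session sketch
(assuming a standard J interpreter); the inflected primitive only forms
when the dot touches its base character:

```j
   i. 5              NB. 'i.' is a single primitive: integers 0..4
0 1 2 3 4
   i . 5             NB. one stray space and the dot detaches;
                     NB. J now sees the name 'i' followed by an
                     NB. ill-formed lone dot, and reports an error
```

One character of whitespace turns a primitive into two unrelated tokens,
which is exactly the hazard when transcribing J from a whiteboard.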

Ideally a written version of the language should be space-independent. By
using single all-connected glyphs which are NOT part of the ASCII character
set for all the primitives, the space problem is solved nicely, and you
don't overload the ASCII character set yet again.

This discussion also helped remind me of another issue that may be fueling
some of the current discussions - J's target audience. I have primarily
been a system architect in my career, and I used APL and later J, primarily
to prototype various systems and algorithms. The languages' brevity and
fast-prototyping characteristics were very useful in testing out multiple
approaches quickly, and then for narrowing down the options. The final
production code was never APL or J; it was usually C, C++, Java, or even
assembly. So I never thought of APL or J as a programming language. To me,
it was always a *prototyping* notation which I could execute in order to
check my designs.

Once I had a firm design in place, I would spend considerable time
explaining the system or algorithm to the programmers. I would write the
APL/J on a blackboard (later whiteboard) and explain the details of the
design as I went through the APL/J code. All the focus on fancy programming
environments and development tools was for the production coders, who had
to deal with all those lower-level languages.

I believe that this type of usage was the original goal of Iverson's APL --
a system design and description notation rather than a programming
language. APL's single-glyph primitives were the natural solution to
maintain brevity in his notation.

Once APL was implemented as a programming language, it began being pulled
in two directions - the notation/system-design path, and the programming
language path. Today, J has gone a long way down the programming language
path, and has backed away from the pure notation/prototyping path. I
believe that a single-glyph option for J would return J to the
notation/system-prototyping path, without having to leave the programming
road.

William said:
I've found J no harder than any other language I'm lazy about. APL's
learning curve is extremely steep because you have to use symbols that you
don't know how to say or type. At least you can speak and write J

Skip replies:
Good point. If J defined the names of its primitives as "left
curly-brace-dot" or "tilde-colon", primitive names would be easy to
remember. Unfortunately, J wants one to remember these primitive names as
"Head", "Take", and "nub-sieve", which have no relation to the actual ASCII
characters in the symbol. So the memorization problem is made even worse.
At least with single glyphs, there could be one defined name for the glyph
in all usages.
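
The naming muddle is easy to demonstrate in a short J session (a sketch,
assuming a standard interpreter): the single spelling `~:` answers to two
unrelated vocabulary names depending on whether it is used monadically or
dyadically:

```j
   ~: 1 2 1 3 2      NB. monadic ~: is named Nub Sieve:
1 1 0 1 0            NB. marks the first occurrence of each item
   2 ~: 1 2 3        NB. dyadic ~: is named Not-Equal
1 0 1
```

A spelling-based name like "tilde-colon" would at least stay constant
across both uses.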

William said:
I disagree that the limitations have vanished.

Skip replies:
Well, at least the hardware limitations in the 80's that prevented the
input & display of arbitrary glyphs are rapidly vanishing. The software
limitations that were built into our communications infrastructure and
devices years ago are still hanging around, though they are slowly getting
removed. Ubiquitous Unicode support would go a long way towards clearing
up the last of these issues.

William said:
But as things stand, my experience is that a phone is a very unpleasant way
to do large amounts of input. I'm not really sure that your argument here
is valid; I think you're assuming we're going to work miracles. No offense,
I hope... Do you have any specific ideas?

Skip replies:
I was using the smartphone as an example of where *all* computing devices
are headed in the next few years. I didn't mean to imply that one would use
a phone for complex programming tasks.

Some of the new tablets are big enough that they could support a
near-full-sized soft keyboard, with enough space left over for a
reasonably-sized program display. Conceivably, one could do some serious
programming with this setup.

In the near future, I could see two-piece devices, consisting of a
touchscreen big enough for a full-sized soft keyboard that lies flat on the
desk, and another touchscreen set up vertically for the display. That would
allow the user to type on the soft keyboard, and the key labels could
change to display the glyph sets as needed. Or, the touchscreen keyboard
could be used as a writing pad with handwriting recognition, to enter the
glyphs. With current technologies, the possibilities are practically
unlimited.

Skip



On Thu, Apr 11, 2013 at 2:20 AM, William Tanksley, Jr <[email protected]
> wrote:

> Skip Cave <[email protected]> wrote:
> > 1. The small dots used for modifying the base ASCII characters in J are
> > hard to read, and can cause confusion. Making the characters bold can
> help,
> > but only on a computer. When writing J on paper or on a blackboard, the
> > small dots still often get lost in the mix. Having to darken each dot as
> > you write it, tells me that you need a better symbol set for writing J.
> > This may not be an issue for J programmers, but it is an issue for J
> > teachers.
>
> I don't know if this is true. However, I don't know how to test it. I
> think the bigger problem is probably the large primitive set, rather
> than the dots. If we'd used prime and double-prime the problem would,
> I think, still be there, even though prime etc. are much bigger
> punctuation.
>
> > 2. Unlike most APL symbols, many J symbols don't have any graphical cue
> to
> > their function. That is one of the reasons the J learning curve is so
> > steep.
>
> I just don't buy it. I tried APL and gave up; I've found J no harder
> than any other language I'm lazy about. APL's learning curve is
> extremely steep because you have to use symbols that you don't know
> how to say or type. At least you can speak and write J...
>
> But I'm not sure about my position there.
>
> > J newbies have to
> > memorize 120-some symbols, most of which don't provide much of a clue as
> to
> > their underlying functionality.
>
> Of course, there's a lot of truth in what you say. I'm sure you're
> right that providing glyphs that give SOME clue of what they do will
> help. You're convincing me :-).
>
> > 3. Groups of J primitives which have graphical similarity, don't
> > necessarily have similar functionality ($ $. $:). So this makes it
> > doubly-hard to remember what each symbol does.
>
> Straight-up true.
>
> > All this negativity isn't meant to imply that Ken & Roger did a bad job
> of
> > picking the J symbol set. They did a marvelous job, given the limited
> > options for entering symbols, and the ASCII characters they had to work
> > with, in the 1980s. They gave up the elegance of single-glyph APL, for
> the
> > pragmatic rationale of cross-platform support. The main point now, is
> that
> > the limitations they faced back then that forced their choice, are now
> > fading away.
>
> We should also admire K's elegance in this domain. I didn't bother
> learning K (it's not open source), but it seems to use overloading to
> allow the same symbol to do different things to different types of
> data. (I'm not certain.)
>
> > Various suggestions have been proposed in this forum to address one or
> two
> > of these issues, but I haven't seen much to address all of them. We
> simply
> > need to take fresh look at the symbols J uses, realizing that the
> > limitations that caused those 1-2 character ASCII symbols to be there,
> have
> > gone away.
>
> I don't think anyone HAS tried to address all the weaknesses; but I
> don't think that's possible at our current state of understanding. I
> disagree that the limitations have vanished.
>
> > My current cellphone, a Galaxy Note 2, has a high-res graphics
> touchscreen,
> > graphic pen (with case insert), multiple soft keyboards, handwriting
> > recognition, built-in mike and speaker, GPS, inertial sensors, and on and
> > on. All the physical pieces are there to support all kinds of input
> > mechanisms and displays, for all kinds of characters. There is even an
> > Android OS option for selecting  default soft keyboards for each specific
> > application, from a list of soft keyboards.
>
> Interesting--- on my old droid (Incredible 2) it allows switching
> keyboards, but not "per application", only one at a time. And I admit
> that the huge variety of inputs DOES suggest that it might solve some
> problems for us... But as things stand, my experience is that a phone
> is a very unpleasant way to do large amounts of input. I'm not really
> sure that your argument here is valid; I think you're assuming we're
> going to work miracles. No offense, I hope... Do you have any specific
> ideas?
>
> Perhaps the old Graffiti system would be like what you're talking
> about. Are you familiar with that? It was introduced on the Palm, in
> competition with the much more powerful and smart system used on the
> Apple Newton; and where the Newton was clever and smart, Graffiti was
> simple and reliable. Doonesbury never bothered to parody it, and it
> enjoyed relative commercial success until Palm died. Graffiti is still
> around as an Android keyboard replacement.
>
> > With the advent of ubiquitous graphical touch screens and soft keyboards,
> > along with handwriting and speech recognition functionality, the hardware
> > limitations that caused Ken & Roger to move away from APL's single-glyph
> > symbol set have been removed. So the time has probably come to revisit
> that
> > choice, given our new reality.
>
> I'm willing to look, but I'm not sure the only limitations are
> hardware. And I'm not sure a cell phone makes a good main programming
> machine -- mine makes a great calculator at best. To call touchscreens
> ubiquitous is ... not accurate.
>
> > Skip
>
> -Wm
> ----------------------------------------------------------------------
> For information about J forums see http://www.jsoftware.com/forums.htm
>



-- 
Skip Cave
Cave Consulting LLC
Phone: 214-460-4861