When I saw Greg's original post I was about to reply:

шну шоцгд уоц шаит то до тнат?  ("why would you want to do that?")

when I thought: no, that's rude and dismissive, so I didn't send it. I said
to myself, let's see what emerges.

Now I'm wondering more and more whether my first impulse wasn't the apt one
after all. But not to rubbish Greg's basic proposal -- rather to invite yet
another perspective.

I'm struck by the replies from Neville Holmes and John Baker. I sense the
presence of beings who move in my world. Otherwise, alas, I haven't seen
much here to help solve one of the biggest problems of writing courseware
for electrical engineers or lab technicians: to use a computer to solve
problems for which their textbooks give pencil & paper solutions. It's
something I've had to do a lot of, in pre-unicode days. Then as now it was
a massive problem for industry. After the Big Banks, electrical engineers
are perhaps the most important users of computers. (I'm omitting mention of
statisticians -- a whole new story.)

Sitting back and looking at today's computers, I see wonderful technology
that addresses problems which were the pipe-dreams and preoccupations of
engineers back in the 1970s. But in deploying the Answer, the industry has
forgotten the Question (or at least its precise formulation).

Take the problem of training lab technicians and electronics engineers.
It's of massive economic importance to train them well in mathematical
methods, plus using computers in their work. And boy! don't they need
training! Industry (and health) runs on the back of lab technicians, but
they're largely recruited from people who aren't computer or math geeks.
Why? Because if they were they'd earn heaps more in a "better" job. I don't
mean a more vital job: a higher status one.

Math Methods. Computer Use. Two worlds which scarcely touch, let alone
overlap. Yet we expect technicians, the workhorses of industry and health,
to have one foot in each world... when everybody of higher status occupies
one world or the other -- and has had 70 years to make a dog's-breakfast of
bridging them.

I'm old enough to remember why. Any sort of "engineering" was said to get
dirt under the fingernails. I remember the horror on my wife's face when I
came home from work at IBM Hursley and said I was no longer a Mathematician
but an Engineer. (Higher salary -- but IBM was funny like that.)

Now I'm a recent owner of Mathematica. And I have to say that's one bit of
software that does bridge the worlds -- at least it makes a valiant effort.
And it's clear from the publicity material I get from Wolfram that this is
the growing tip of its business. It's no longer just a dandy tool for a uni
math department to keep locked away somewhere safe.

Another thing I've owned (early 80s) is a Commodore PET. The second
mass-market micro after the Apple ][. And what was its flagship connector
round the back? An IEEE-488 interface -- which proclaimed it as a lab
instrument par excellence!

Which was what made the keyboard noteworthy, along with the built-in BASIC,
which served as its operating system and prime coding language. The keyboard let you
type everything you needed without touching the shift-key (or command key,
option-key, control-key, fn-key, or chords thereof). Unless you had a
pressing need for mixed-case: like you were writing a book or something.
(Apple ][ only had uppercase.)

Guess what one of the keys was? Pi. Yes, π. And the built-in BASIC accepted
it.

So you could write: a=4*π*r^2 -- which was a bold start for a bridge
between the Two Worlds. But it wasn't a=4πr² -- which is what "Math Methods
for Technicians" said (and which no language processor I know can execute).
Why, even a=4.π.r^2 would have been okay (technicians can get their heads
round the problems of offering both 2 and ² on a keyboard) -- but hey! --
Dot was reserved to mean the decimal point (in the USA, though not in Germany).

...No, I'm not advocating 3 or 4 different Dots (or 5 in the context of J).
That would be like the Dyalog APL Atomic Vector (=charset) that famously
offered two slightly different symbols for '^'.
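
Here, for what it's worth, is roughly how near today's J lets me get to the
textbook formula -- a sketch only, with a radius I've made up:

   r =. 0.05                    NB. radius in metres (illustrative value)
   area =. 4 * (o. 1) * r ^ 2   NB. textbook: a = 4πr²
   area
0.0314159

Still the a=4*π*r^2 of the PET in spirit, rather than the a=4πr² of the
textbook.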

The BASIC interpreter was written by a squitty little garage startup called
Microsoft. Over the years they've delivered Answers in profusion but
forgotten the Question. Well, at least they forgot the questions that pop up
when you say "math" and "computer" in the same breath.

They didn't forget the question for which the answer is: The Big Banks.

So, in the Great Work Of Bridging the Worlds, what has Unicode to be proud
of?

For starters, it's made a pig's-ear of Pi...
http://www.unicode.org/mail-arch/unicode-ml/Archives-Old/UML011/0166.html

...not to mention Mu -- μ to engineers, or u (because the typing pool could
type that).

Observe please, in J:
   u: 181 956
µμ

...but that's not Unicode's fault. It inherited Mu from superascii.
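
(And if you're ever unsure which of the two Mus you've got in front of you,
something like this should tell you -- assuming a reasonably recent J, where
7 u: reads a literal as UTF-8 and 3 u: yields the code points:

   3 u: 7 u: 'µμ'
181 956

-- at least that's what I get here.)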

Why was Mu in superascii? 'Cos a huge, vital industry needed it, even more
than it did Pi. Mu gets a code point of its own in the Latin-1 Supplement
series (0080-00FF). Mathematical Pi, despite its flying start on the PET, has
to wait in line with JRR Tolkien's Tengwar for Plane 1 Unicode to come
along (1D6D1). Which J doesn't support. Meanwhile anyone wanting to typeset
Pi needs to use the Greek alphabet. This has its own series separate from
the 9 distinct "math" Unicode code-tables:
   U0080  [latin1-supp]
   U1D400 [math alpha]
   U2A00  [supp math]
   U27C0  [misc math]
   U2150  [number forms]
   U2200  [math ops]
   U2300  [misc tech +APL]
   U2980  [misc math-B]
   UA720  [latin ext-D] (for infinity: ꝏ)
-- and that doesn't include a quiverful of arrows tables.
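
To make the point concrete in J itself (a sketch; I'm writing the code points
as J base-16 literals):

   u: 16b3c0       NB. U+03C0: plain Greek pi, comfortably inside the 16-bit range
π
   NB. u: 16b1d6d1 (U+1D6D1, mathematical bold pi) lives out on Plane 1 -- no such luck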

And what have APL and J to be proud of?

pi=.o.1       NB. APL: pi←○1   Elect.Eng: π
pi2=.o.2      NB. APL: pi2←○2  Elect.Eng: 2π
twopi=.2*o.1  NB. APL: twopi←2×○1   Elect.Eng: 2π
piby2=.-:o.1  NB. APL: piby2←2÷⍨○1  Elect.Eng: π/2
tenp1mu=.10.1e_6   NB. APL: tenp1mu←10.1E¯6  Elect.Eng: 10.1µ
neginf=.__    NB. APL: neginf←⌈/⍳0  Elect.Eng: -ꝏ
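
And here's the sort of thing the technician actually wants to write with
them -- capacitive reactance at mains frequency, say (a sketch, using
illustrative values of my own):

   f =. 50            NB. frequency, Hz         Elect.Eng: f = 50 Hz
   w =. twopi * f     NB. angular frequency     Elect.Eng: ω = 2πf
   C =. 10.1e_6       NB. capacitance, 10.1 µF
   Xc =. % w * C      NB. reactance             Elect.Eng: Xc = 1/(ωC)
   Xc
315.158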

And what's APL and J's programme to bridge the Two Worlds? Rewrite all the
introductory math texts for engineers in APL? (Or is it to be J?)

I could go on. But won't.

BTW. Who on this list has actually taught computer use to a class of
electronics engineers -- and had to face the rotten tomatoes? Every piece
of equipment they learn to use: signal generator, logic analyser,
multimeter, even a scientific calculator... is labelled in their own
concise, fairly consistent terminology.

Every piece except a "personal" computer.



On Wed, Apr 10, 2013 at 6:57 AM, Ian Clark <[email protected]> wrote:

> Paul -- that was one of my first feelings watching this thread. Thanks to
> my mundane experience of training materials involving both APL and J, I was
> drawn into the problem of productively typing J code with APL comments.
> This drove me to produce these tools:
>
> An APL to J Phrasebook
>    http://www.jsoftware.com/jwiki/APL2JPhraseBook
>
> An APL palette for on-screen "typing" of APL chars
>    http://www.jsoftware.com/jwiki/IanClark/AplPalette
>
> There's also work-in-progress on a script to convert APL )OUT output into
> rough-and-ready J code to help me port a host of my ancient APL+Win wss
> into J. This tool takes a line of an APL fn, eg:
>
> c←a⌹b
>
> and converts it to:
>
> c=. a %. b      NB. c←a⌹b
>
> No, it won't ever handle everything, but even now it gives me a flying
> start, flagging where it fails. It depends for its usefulness on the fact
> that most (of my practical) APL wss are corny, with only a line here and
> there doing anything smart. If anyone else has a consuming need for such a
> tool, tell me, and I'll up its priority to tart it up and release it.
>
>
> On Tue, Apr 9, 2013 at 1:15 AM, Paul Jackson <[email protected]> wrote:
>
>> There are several modern APL products which support unicode input and
>> output.  While several approaches to input have been developed, most
>> operating systems support display and printing of APL characters in a
>> couple of fonts which are widely available.  They are
>>    Apl385.ttf and SImPL medium APL.ttf
>>
>> Paul
>> On Apr 8, 2013 11:10 AM, "John Baker" <[email protected]> wrote:
>>
>> > General symbols will slowly creep back into dozens of programming
>> languages
>> > in the coming years. We are already seeing this in Mathematica and
>> Unicode
>> > based APL's.
>> >
>> > Unicode is a sprawling beast but it has clearly addressed the symbol
>> > encoding issue. What it has not addressed is the usable font issue.
>> While
>> > you can find all the traditional mathematical symbols, APL and much
>> more in
>> > Unicode you will find little help displaying and printing them without
>> > designing and implementing your own fonts. The hard stuff is always
>> left as
>> > an exercise for the user.
>> >
>> > Despite this problem stick to Unicode code points for your symbols and
>> > don't limit yourself to the characters found in established fonts. I'd
>> also
>> > ignore keyboard issues. Keyboards are already virtual on phones and
>> tablets
>> > and before long QWERTYUIOP keyboards will join card punches in the ever
>> > expanding warehouse of obsolete computer memorabilia.
>> >
>> >
>> > On Mon, Apr 8, 2013 at 12:21 AM, Greg Borota <[email protected]> wrote:
>> >
>> > > As a proof of concept for what we were talking about, I could have
>> > saltire
>> > > and obelus included in my VS based J interactive (term) window. For
>> > these I
>> > > don't need much specialized fonts and such, I think. Also we can have
>> the
>> > > dot bigger indeed, such a great but simple idea. These can be some
>> > simpler
>> > > changes on which further ones could be built incrementally later on,
>> if
>> > so
>> > > decided. I can't wait to have this stuff working.
>> > >
>> > > Thank you so much for giving this background. It's always better to
>> learn
>> > > from the experience of the wise and not repeat previous mistakes.
>> > >
>> > >
>> > > On Sun, Apr 7, 2013 at 10:17 PM, neville holmes <
>> [email protected]
>> > > >wrote:
>> > >
>> > > > I have been following the Symbols thread with somewhat
>> > > > mixed feelings.
>> > > >
>> > > > My first acquaintance with APL was in the mid '60s
>> > > > when the original box notation was used with spectacular
>> > > > success to formally describe the System/360 (see the IBM
>> > > > Systems Journal, Vol.3, No.3, 1964).
>> > > >
>> > > > My first use of APL was with the IBM 2741 terminal run
>> > > > from a 360/67 in the early '70s.  The golf-ball print
>> > > > head on the 2741 made use of the APL symbols easy, but
>> > > > when the 2260 screen terminal came in, the IBM developers
>> > > > at Kingston needed a great amount of pressure before they
>> > > > would support APL.
>> > > >
>> > > > When I retired from IBM and went to Tasmania to teach,
>> > > > I reluctantly switched to J only because I couldn't get
>> > > > the APL interpreter to work on the PCs available to me.
>> > > > However I soon realised that J was easier to get over
>> > > > to students, partly because of the ASCII character set
>> > > > usage, partly because of using scripts instead of
>> > > > workspaces, partly because of the possibilities of
>> > > > tacit encoding.
>> > > >
>> > > > This background perhaps explains why the elaborate
>> > > > changes being discussed seem to me to be unwise if there
>> > > > is any serious hope of (a) keeping existing users all
>> > > > happy, and (b) making it easier to bring in new users.
>> > > > Also, my 30 years experience as a systems engineer
>> > > > taught me that success comes with improvements that
>> > > > are incrementally and compatibly introduced.  This
>> > > > is a general observation that politicians and bureaucrats
>> > > > choose to ignore.
>> > > >
>> > > > All that said, one improvement I would very much like to
>> > > > see in J is the introduction of the saltire and obelus
>> > > > (how do I get these symbols into plain text here?) as
>> > > > alternatives to the * and % symbols.  That J was forced
>> > > > into using * and % instead of the traditional symbols is
>> > > > a condemnation of the people who left them out of ASCII.
>> > > > Disgraceful !!!
>> > > >
>> > > > Note that compatibility would be preserved by retaining
>> > > > the * and % symbols, but perhaps automatically replacing
>> > > > them in displayed J expressions.
>> > > >
>> > > > This small modification would make it much easier to get
>> > > > schoolchildren and other ordinary potential users into
>> > > > using J for everyday calculations.
>> > > >
>> > > > Oh, and someone complained that using the . and : to
>> > > > expand the primitive symbol set was bad because they
>> > > > are too small to see easily.  Alright then, why not
>> > > > simply display them as larger dots when used in primitive
>> > > > function symbols?   Mind you, the same problem arises
>> > > > with the use of . as a decimal point.
>> > > >
>> > > >
>> > > > Neville Holmes
>> > > >
>> > > >
>> > --
>> > John D. Baker
>> > [email protected]