[cctalk] Re: APL (Was: BASIC

2024-05-15 Thread Ken Seefried via cctalk
On Wed, May 1, 2024 at 7:51 PM Fred Cisin via cctalk 
wrote:

>
>
> What would our world be like if the first home computers were to have had
> APL, instead of BASIC?
>
>
>
The Ampere WS-1?

https://en.wikipedia.org/wiki/Ampere_WS-1

Definitely more stylish.

KJ


[cctalk] Re: APL (Was: BASIC

2024-05-08 Thread Mike Katz via cctalk
There is scientific proof that studying music helps with math aptitude 
and vice versa.


On 5/8/2024 9:30 AM, Paul Koning via cctalk wrote:



On May 8, 2024, at 10:25 AM, Harald Arnesen via cctalk  
wrote:

Paul Koning via cctalk [07/05/2024 19.31]:


(Then again, I had a classmate who was taking a double major: math and music 
composition...)

Mathematics and music is not a rare combination - see Tom Lehrer, for instance.
--
Hilsen Harald

My wife (a voice major) pointed out that instrumental music majors tend to be 
good at math; voice majors not so much.

paul





[cctalk] Re: APL (Was: BASIC

2024-05-08 Thread Bill Gunshannon via cctalk




On 5/8/2024 10:30 AM, Paul Koning via cctalk wrote:




On May 8, 2024, at 10:25 AM, Harald Arnesen via cctalk  
wrote:

Paul Koning via cctalk [07/05/2024 19.31]:


(Then again, I had a classmate who was taking a double major: math and music 
composition...)


Mathematics and music is not a rare combination - see Tom Lehrer, for instance.
--
Hilsen Harald


My wife (a voice major) pointed out that instrumental music majors tend to be 
good at math; voice majors not so much.


Too bad it isn't reciprocal.  I was great at math from a very early age.
Did Algebra and Geometry in grade school.  I have at least a half dozen
instruments and I suck at all of them.  I can play very technically on
most of them but I have no art and cannot put any real feeling into the
music I play.  But then, I only play for myself so I guess it really
doesn't matter.  :-)

bill



[cctalk] Re: APL (Was: BASIC

2024-05-08 Thread Tony Duell via cctalk
On Wed, May 8, 2024 at 3:25 PM Harald Arnesen via cctalk
 wrote:
>
> Paul Koning via cctalk [07/05/2024 19.31]:
>
> > (Then again, I had a classmate who was taking a double major: math and 
> > music composition...)
>
> Mathematics and music is not a rare combination - see Tom Lehrer, for
> instance.

I am, of course, thinking of the well-known book 'Gödel, Escher, Bach: An
Eternal Golden Braid'

-tony


[cctalk] Re: APL (Was: BASIC

2024-05-08 Thread Paul Koning via cctalk



> On May 8, 2024, at 10:25 AM, Harald Arnesen via cctalk 
>  wrote:
> 
> Paul Koning via cctalk [07/05/2024 19.31]:
> 
>> (Then again, I had a classmate who was taking a double major: math and music 
>> composition...)
> 
> Mathematics and music is not a rare combination - see Tom Lehrer, for 
> instance.
> -- 
> Hilsen Harald

My wife (a voice major) pointed out that instrumental music majors tend to be 
good at math; voice majors not so much.

paul



[cctalk] Re: APL (Was: BASIC

2024-05-08 Thread Harald Arnesen via cctalk

Paul Koning via cctalk [07/05/2024 19.31]:


(Then again, I had a classmate who was taking a double major: math and music 
composition...)


Mathematics and music is not a rare combination - see Tom Lehrer, for 
instance.

--
Hilsen Harald


[cctalk] Re: APL (Was: BASIC

2024-05-07 Thread Sellam Abraham via cctalk
On Tue, May 7, 2024 at 11:03 AM Chuck Guzis via cctalk <
cctalk@classiccmp.org> wrote:

>
> People are strange--and interesting.
>
> --Chuck
>

But mostly strange :D

Sellam


[cctalk] Re: APL (Was: BASIC

2024-05-07 Thread Chuck Guzis via cctalk
On 5/7/24 10:31, Paul Koning via cctalk wrote:
> 
> 
>> On May 7, 2024, at 1:20 PM, Sellam Abraham via cctalk 
>>  wrote:
>> ...
>>  Thus proving to
>> be complete horseshit all the educators' claims that if you want to get into a
>> computer career you must be good at math.
> 
> Indeed.

That is, or at least wasn't, uncommon at all.  50 years ago, the guy in
the cube across from mine would take 3 months off to go to Vegas and
play trombone.  The guy immediately adjacent to my cube held a doctorate
in piano from IU, but preferred to play clarinet.

I have lots of anecdotes about that type of stuff.

For example, an engineer chum had married a few months prior.  At an
informal gathering (the hostess had a stock of musical instruments and
invited folks to feel free to participate), I was noodling around on the
piano and the guy, who by then was a little tipsy, sat down alongside
me.  He asked if he could have a turn at the ivories.  Sure, I jokingly
suggested that he give his rendition of the Goldberg Variations--which
he promptly proceeded to do, straight through, right to the final
aria--from memory.

His new wife was in tears at that point.  She had no inkling that he was
a piano performance graduate of Juilliard who found engineering more
interesting than giving piano lessons.  The couple didn't even own a piano.

People are strange--and interesting.

--Chuck






[cctalk] Re: APL (Was: BASIC

2024-05-07 Thread Paul Koning via cctalk



> On May 7, 2024, at 1:20 PM, Sellam Abraham via cctalk  
> wrote:
> ...
>  Thus proving to
> be complete horseshit all the educators' claims that if you want to get into a
> computer career you must be good at math.

Indeed.

One of the most amazing programmers I ever worked with was a graduate of the 
Berklee School of Music.  And two other quite competent computer people I know 
had Conservatory of Music degrees in piano performance.

(Then again, I had a classmate who was taking a double major: math and music 
composition...)

paul



[cctalk] Re: APL (Was: BASIC

2024-05-07 Thread Sellam Abraham via cctalk
On Fri, May 3, 2024 at 2:48 PM Liam Proven via cctalk 
wrote:

>
> I failed _O_ level mathematics, and to get onto a science degree
> course, I had to do another 6 months of remedial maths just to get me
> through the exam. To be told "easy if you did the A level" would have
> made me angrily walk out in disgust if it wasn't a mandatory course.
>

I sucked at math pretty much throughout school.  I took first level
Calculus three times before I finally got my mind wrapped around the
Fundamental Theorem of Calculus (with a little help from LSD in showing me
infinity).  However, it never prevented me from writing rather involved and
complicated computer programs starting in my teen years.  For college, I
decided I was either going to get into MIT or not go at all, and when I was
told that my math grades would not cut it to get into MIT, well then that
settled it.  I somehow managed to be successful in spite of that.  Thus proving
to be complete horseshit all the educators' claims that if you want to get into a
computer career you must be good at math.

> That worked. It took me a weekend and was of no direct use because at the
> end of about 32-33 hours of work, I could do a chi-squared test by
> hand. So, indirectly, it achieved its purpose.
>

Very nice.  By implementing the algorithm on the computer, you planted it in
your mind.  Great story to illustrate what a great mind-building tool the
computer and programming are.

It could be said that my math skills were really lacking in the area of
logic, in that I did pretty OK in Algebra 1 and 2, but barely passed
Geometry as I could never get proofs right, and I ended up with a D in
Trigonometry (but there were extenuating circumstances with that).  It
could have been because I hated homework and studying and Geometry and Trig
stuff requires a lot of memorization.  But I noticed that when I started
going back to school in my 30s towards earning a degree my years of
programming had structured my mind to where I could break down problems
into little steps and then solve each individually until I arrived at the
final answer, much like how one writes a computer program.  About a dozen
years ago I took up law as a "hobby" and threw myself into that.  It turns
out programming and law share very similar constructs: a pleading is like a
program; the rules of court are the syntax of the basic "programming
language" in which the pleading is written; case law are like library
calls; and the court is the computer.  You submit your program/pleading to
the computer/court and it runs...or not, and crashes.  And just like a
computer, I started to figure out ways that the court can be hacked ;)

Anyway, I noticed that when I went back to recreational 6502 programming a
few years ago, not only did everything come back to me relatively quickly
from my teens, but I found that I was just much better at assembling code,
which I attribute to my law hobby.  It then began to seem that my practice
of law got better after I spent long periods writing 6502 code.  Each lent
itself to helping me get better at the other.  It was awesome
synergy.

> But the point is: not everyone can do "high school algebra." I do not
> know what age "high school" means to you but very basic secondary
> school algebra was _extremely_ hard for me and took years of real work
> to master.
>
...

> "It's as easy as algebra" is reinforcing my point about this stuff
> _not_ being easy, natural, obvious, helpful, convenient, clear,
> meaningful or useful for most people.
>

I think it has more to do with just learning the construct, or the
language, of the subject.  Once you get over that hump, your basic
intelligence fills in the rest.  Every subject has its particular jargon and
principles, and basic competency in a subject really comes down to
memorizing the jargon and principles so you can properly "speak in the
language" of the subject.


> I wrote an article about 3 new BASIC releases for its 60th anniversary:
> https://www.theregister.com/2024/05/03/basic_60th_birthday/
>
> Do go read the first comment.
>
> It shows how BASIC was immediately apprehensible and memorable in a
> way that APL never would be.
>

It was hard going back to BASIC programming after programming in C and
other higher-level languages for so many years, but once I got back into
it (Applesoft specifically) and remembered all the structured programming
techniques my high school computer science teacher forced us to learn, and
then applied my years of subsequent experience in crafting and organizing
programs, the limitations became easy to overcome.  It's fun to overcome
the limitations of a simple language and still make a good program, and
where I need some specific functionality, Applesoft is extensible
through the & "command", which is really a machine language bridge: the
BASIC interpreter jumps to a user-definable vector at which you put
your custom token interpretation code.  For example, I made an entire low
resolution 
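
(For anyone who hasn't met the & hook, a minimal sketch of the installation
side, assuming a hypothetical machine-language handler already loaded at
$0300; the one Applesoft-specific fact relied on is that the interpreter
dispatches the & token through the three-byte vector at $3F5.)

10 REM  INSTALL A HANDLER FOR THE APPLESOFT & TOKEN
20 REM  ASSUMES A (HYPOTHETICAL) ML ROUTINE IS ALREADY LOADED AT $0300
30 AD = 768 : REM  $0300, ADDRESS OF THE HANDLER
40 POKE 1013,76 : REM  $3F5 GETS A JMP OPCODE ($4C)
50 POKE 1014,AD - INT(AD / 256) * 256 : REM  LOW BYTE OF HANDLER ADDRESS
60 POKE 1015,INT(AD / 256) : REM  HIGH BYTE OF HANDLER ADDRESS
70 REM  FROM HERE ON, ANY "&" STATEMENT JUMPS THROUGH $3F5 TO THE ROUTINE,
80 REM  WHICH PARSES WHATEVER FOLLOWS THE & AND THEN RETURNS TO BASIC

Applesoft itself only supplies the jump; everything after the & on the line
is up to the machine-language routine to interpret.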

[cctalk] Re: APL (Was: BASIC

2024-05-03 Thread Liam Proven via cctalk
On Thu, 2 May 2024 at 20:51, Lee Courtney  wrote:

> Too bad because the language itself lends itself to learning by anyone with 
> an understanding of high school algebra.

You remind me -- and _not_ in a good way -- of the first day of my
undergrad 1st year statistics course at university. I did biology and
we had a mandatory stats course.

The lecturer came on stage and said (roughly, this was ~40 years ago)

"Now I know many of you don't want to be here, or are nervous or
apprehensive. I just want to reassure you. Don't be. This course is
easy stuff, and it will be basically revision for anyone with A-level
mathematics. You'll be fine."

I failed _O_ level mathematics, and to get onto a science degree
course, I had to do another 6 months of remedial maths just to get me
through the exam. To be told "easy if you did the A level" would have
made me angrily walk out in disgust if it wasn't a mandatory course.

As it was, I worked out that the only test I needed was a Chi-squared
test. I had no idea how to do it and the explanation was, well, all
Greek to me. But my friend did get it, and he helped break it down
into very small simple steps for me, while I wrote a Sinclair BASIC
program to not merely do a chi-squared test _but to print out all the
intermediate working as if I had done it by hand_ so I could copy it
down longhand and fake being able to do it in my coursework.

That worked. It took me a weekend and was of no direct use because at the
end of about 32-33 hours of work, I could do a chi-squared test by
hand. So, indirectly, it achieved its purpose.
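
For the curious, a chi-squared goodness-of-fit program in the same spirit
(a minimal sketch in generic line-numbered BASIC, not the actual Sinclair
listing, with made-up observed/expected counts in the DATA lines) looks
roughly like this, printing every intermediate (O-E)^2/E term so the working
can be copied out by hand:

10 REM  CHI-SQUARED GOODNESS-OF-FIT, SHOWING ALL INTERMEDIATE WORKING
20 REM  OBSERVED AND EXPECTED COUNTS ARE MADE-UP EXAMPLE DATA
30 DIM O(4): DIM E(4)
40 FOR I = 1 TO 4: READ O(I): NEXT I
50 FOR I = 1 TO 4: READ E(I): NEXT I
60 DATA 18, 22, 29, 31
70 DATA 25, 25, 25, 25
80 LET C = 0
90 PRINT "CAT", "OBS", "EXP", "(O-E)^2/E"
100 FOR I = 1 TO 4
110 LET D = O(I) - E(I)
120 LET T = D * D / E(I)
130 PRINT I, O(I), E(I), T
140 LET C = C + T
150 NEXT I
160 PRINT "CHI-SQUARED ="; C
170 PRINT "DEGREES OF FREEDOM ="; 3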

But the point is: not everyone can do "high school algebra." I do not
know what age "high school" means to you but very basic secondary
school algebra was _extremely_ hard for me and took years of real work
to master.

And yet, I have a degree and at the time I got it I scored about 150
on the Mensa IQ test. I am not daft.

In real life, for ordinary people, algebra is a byword for "really
hard to understand".

As Stephen Hawking wrote in _A Brief History of Time_

«
Someone told me that each equation I included in the book would halve
the sales. I therefore resolved not to have any equations at all. In
the end, however, I did put in one equation, Einstein's famous
equation, E = mc squared. I hope that this will not scare off half of
my potential readers.
»

I think it did not help. (I found the book very dull, myself. I
already knew what he was trying to explain.)

"It's as easy as algebra" is reinforcing my point about this stuff
_not_ being easy, natural, obvious, helpful, convenient, clear,
meaningful or useful for most people.

I wrote an article about 3 new BASIC releases for its 60th anniversary:
https://www.theregister.com/2024/05/03/basic_60th_birthday/

Do go read the first comment.

It shows how BASIC was immediately apprehensible and memorable in a
way that APL never would be.

Translation for American readers:
"O" level -- school exams at about age 16; you normally do about 8
subjects. I did 12.
"A" level -- school exams at ~18, necessary to get into university.
You normally do 3. I did 5.

-- 
Liam Proven ~ Profile: https://about.me/liamproven
Email: lpro...@cix.co.uk ~ gMail/gTalk/FB: lpro...@gmail.com
Twitter/LinkedIn: lproven ~ Skype: liamproven
IoM: (+44) 7624 277612: UK: (+44) 7939-087884
Czech [+ WhatsApp/Telegram/Signal]: (+420) 702-829-053


[cctalk] Re: APL (Was: BASIC

2024-05-02 Thread Paul Koning via cctalk


> On May 2, 2024, at 8:45 PM, Paul Koning  wrote:
> 
> Yes, it sure is.  I was mistaken about it being the first issue.  Instead, 
> the RSA article appears in Vol. 1 No. 3 (4Q80).  Too bad the article itself 
> isn't included in the scanned material.

Ah, but it does show up elsewhere: 
http://ai.eecs.umich.edu/people/conway/VLSI/ClassicDesigns/RSA/RSA.L4Q80.pdf 


> 
>   paul
> 
>> On May 2, 2024, at 8:39 PM, Lee Courtney wrote:
>> 
>> Paul,
>> 
>> Is this the Lambda/VLSI Design magazine you refer to:
>> 
>> Lynn Conway's VLSI Archive: Main Links (umich.edu)?
>> 
>> Thanks!
>> 
>> Lee
>> 
>> On Thu, May 2, 2024 at 1:00 PM Paul Koning wrote:
>> 
>> 
>> > On May 2, 2024, at 3:50 PM, Lee Courtney via cctalk wrote:
>> > 
>> > The first "professional software" I wrote (almost) out of University in
>> > 1979 was a package to emulate the mainframe APL\Plus file primitives on a
>> > CP/M APL variant. Used to facilitate porting of mainframe APL applications
>> > to microcomputers.
>> > 
>> > I'm still an APL adherent since the late 1960s, but it was probably too
>> > heavy-weight, with obstacles noted elsewhere (character-set, radical
>> > programming paradigm), to be successful in the early days of
>> > microcomputing. Although the MCM-70 was an amazing feat of technology.
>> > 
>> > Too bad because the language itself lends itself to learning by anyone with
>> > an understanding of high school algebra.
>> 
>> The one professional application of APL I heard of was in a talk by Ron Rivest, 
>> at DEC around 1982 or so.  He described a custom chip he had built, a bignum 
>> ALU (512 bits) to do RSA acceleration.  The chip included a chunk of 
>> microcode, and he mentioned that the microcode store layout was done by an 
>> APL program about 500 lines long.  That raised some eyebrows...
>> 
>> Unless I lost it I still have the article somewhere: it's the cover story on 
>> the inaugural issue of "Lambda" which later became "VLSI Design", a 
>> technical journal about chip design.
>> 
>> My own exposure to APL started around 1998, when I decided to try to use it 
>> for writing cryptanalysis software.  That was for a course in cryptanalysis 
>> taught by Alex Biryukov at Technion and offered to remote students.  The 
>> particular exercise was solving an ADFGVX cipher (see "The Code Breakers", 
>> the unabridged hardcover, not the useless paperback).  It worked too, and it 
>> took less than 100 lines.
>> 
>> paul
>> 
>> 
>> 
>> 
>> -- 
>> Lee Courtney
>> +1-650-704-3934 cell
> 



[cctalk] Re: APL (Was: BASIC

2024-05-02 Thread Paul Koning via cctalk
Yes, it sure is.  I was mistaken about it being the first issue.  Instead, the 
RSA article appears in Vol. 1 No. 3 (4Q80).  Too bad the article itself isn't 
included in the scanned material.

paul

> On May 2, 2024, at 8:39 PM, Lee Courtney  wrote:
> 
> Paul,
> 
> Is this the Lambda/VLSI Design magazine you refer to:
> 
> Lynn Conway's VLSI Archive: Main Links (umich.edu)?
> 
> Thanks!
> 
> Lee
> 
> On Thu, May 2, 2024 at 1:00 PM Paul Koning wrote:
> 
> 
> > On May 2, 2024, at 3:50 PM, Lee Courtney via cctalk wrote:
> > 
> > The first "professional software" I wrote (almost) out of University in
> > 1979 was a package to emulate the mainframe APL\Plus file primitives on a
> > CP/M APL variant. Used to facilitate porting of mainframe APL applications
> > to microcomputers.
> > 
> > I'm still an APL adherent since the late 1960s, but it was probably too
> > heavy-weight, with obstacles noted elsewhere (character-set, radical
> > programming paradigm), to be successful in the early days of
> > microcomputing. Although the MCM-70 was an amazing feat of technology.
> > 
> > Too bad because the language itself lends itself to learning by anyone with
> > an understanding of high school algebra.
> 
> The one professional application of APL I heard of was in a talk by Ron Rivest, 
> at DEC around 1982 or so.  He described a custom chip he had built, a bignum 
> ALU (512 bits) to do RSA acceleration.  The chip included a chunk of 
> microcode, and he mentioned that the microcode store layout was done by an 
> APL program about 500 lines long.  That raised some eyebrows...
> 
> Unless I lost it I still have the article somewhere: it's the cover story on 
> the inaugural issue of "Lambda" which later became "VLSI Design", a technical 
> journal about chip design.
> 
> My own exposure to APL started around 1998, when I decided to try to use it 
> for writing cryptanalysis software.  That was for a course in cryptanalysis 
> taught by Alex Biryukov at Technion and offered to remote students.  The 
> particular exercise was solving an ADVFX cipher (see "The Code Breakers", the 
> unabridged hardcover, not the useless paperback).  It worked too, and it took 
> less than 100 lines.
> 
> paul
> 
> 
> 
> 
> -- 
> Lee Courtney
> +1-650-704-3934 cell



[cctalk] Re: APL (Was: BASIC

2024-05-02 Thread Lee Courtney via cctalk
Paul,

Is this the Lambda/VLSI Design magazine you refer to:

Lynn Conway's VLSI Archive: Main Links (umich.edu)?

Thanks!

Lee

On Thu, May 2, 2024 at 1:00 PM Paul Koning  wrote:

>
>
> > On May 2, 2024, at 3:50 PM, Lee Courtney via cctalk <
> cctalk@classiccmp.org> wrote:
> >
> > The first "professional software" I wrote (almost) out of University in
> > 1979 was a package to emulate the mainframe APL\Plus file primitives on a
> > CP/M APL variant. Used to facilitate porting of mainframe APL
> applications
> > to microcomputers.
> >
> > I'm still an APL adherent since the late 1960s, but it was probably too
> > heavy-weight, with obstacles noted elsewhere (character-set, radical
> > programming paradigm), to be successful in the early days of
> > microcomputing. Although the MCM-70 was an amazing feat of technology.
> >
> > Too bad because the language itself lends itself to learning by anyone
> with
> > an understanding of high school algebra.
>
> The one professional application of APL I heard of was in a talk by Ron
> Rivest, at DEC around 1982 or so.  He described a custom chip he had built,
> a bignum ALU (512 bits) to do RSA acceleration.  The chip included a chunk
> of microcode, and he mentioned that the microcode store layout was done by
> an APL program about 500 lines long.  That raised some eyebrows...
>
> Unless I lost it I still have the article somewhere: it's the cover story
> on the inaugural issue of "Lambda" which later became "VLSI Design", a
> technical journal about chip design.
>
> My own exposure to APL started around 1998, when I decided to try to use
> it for writing cryptanalysis software.  That was for a course in
> cryptanalysis taught by Alex Biryukov at Technion and offered to remote
> students.  The particular exercise was solving an ADFGVX cipher (see "The
> Code Breakers", the unabridged hardcover, not the useless paperback).  It
> worked too, and it took less than 100 lines.
>
> paul
>
>
>

-- 
Lee Courtney
+1-650-704-3934 cell


[cctalk] Re: APL (Was: BASIC

2024-05-02 Thread ben via cctalk

On 2024-05-02 4:55 a.m., Liam Proven via cctalk wrote:

On Thu, 2 May 2024 at 00:51, Fred Cisin via cctalk
 wrote:


What would our world be like if the first home computers were to have had
APL, instead of BASIC?


To be perfectly honest I think the home computer boom wouldn't have
happened, and it would have crashed and burned in the 1970s, with the
result that microcomputers remained firmly under corporate control.

I have been watching the APL world with interest since I discovered it
at university, and I still don't understand a word of it.

I've been watching Lisp for just 15 years or so and I find it unreadable too.

I think there are widely different levels of mental flexibility among
smart humans and one person's "this just requires a small effort but
you get so much in return!" is someone else's eternally impossible,
unclimbable mountain.

After some 40 years in computers now, I still like BASIC best, with
Fortran and Pascal very distant runners-up and everything else from C
to Python is basically somewhere between Minoan Linear A and Linear B
to me.

I think I lack the mental flexibility, and I think I'm better than
most of hoi polloi.

If the early machines had used something cryptic like APL or Forth I
reckon we'd never have had a generation of child programmers.


I have very poor memory; IF, REM, LET etc. I can remember.
Line noise like TELCO, err, APL I cannot make sense of at all.
The USA (IBM) pushed APL, Europe wanted ALGOL. What users got was
STUPID ASCII and the useless accent marks. Without real I/O
lots of languages died, and we got C and Pascal, but only for
the US. That just left BASIC as the standard, as it just needed
A-Z0-9[]+-=><;"

BASIC would still be around in an ALT UNIVERSE, running off the
cloud.





[cctalk] Re: APL (Was: BASIC

2024-05-02 Thread Paul Koning via cctalk



> On May 2, 2024, at 3:50 PM, Lee Courtney via cctalk  
> wrote:
> 
> The first "professional software" I wrote (almost) out of University in
> 1979 was a package to emulate the mainframe APL\Plus file primitives on a
> CP/M APL variant. Used to facilitate porting of mainframe APL applications
> to microcomputers.
> 
> I'm still an APL adherent since the late 1960s, but it was probably too
> heavy-weight, with obstacles noted elsewhere (character-set, radical
> programming paradigm), to be successful in the early days of
> microcomputing. Although the MCM-70 was an amazing feat of technology.
> 
> Too bad because the language itself lends itself to learning by anyone with
> an understanding of high school algebra.

The one professional application of APL I heard of was in a talk by Ron Rivest, at 
DEC around 1982 or so.  He described a custom chip he had built, a bignum ALU 
(512 bits) to do RSA acceleration.  The chip included a chunk of microcode, and 
he mentioned that the microcode store layout was done by an APL program about 
500 lines long.  That raised some eyebrows...

Unless I lost it I still have the article somewhere: it's the cover story on 
the inaugural issue of "Lambda" which later became "VLSI Design", a technical 
journal about chip design.

My own exposure to APL started around 1998, when I decided to try to use it for 
writing cryptanalysis software.  That was for a course in cryptanalysis taught 
by Alex Biryukov at Technion and offered to remote students.  The particular 
exercise was solving an ADFGVX cipher (see "The Code Breakers", the unabridged 
hardcover, not the useless paperback).  It worked too, and it took less than 
100 lines.

paul




[cctalk] Re: APL (Was: BASIC

2024-05-02 Thread Lee Courtney via cctalk
The first "professional software" I wrote (almost) out of University in
1979 was a package to emulate the mainframe APL\Plus file primitives on a
CP/M APL variant. Used to facilitate porting of mainframe APL applications
to microcomputers.

I'm still an APL adherent since the late 1960s, but it was probably too
heavy-weight, with obstacles noted elsewhere (character-set, radical
programming paradigm), to be successful in the early days of
microcomputing. Although the MCM-70 was an amazing feat of technology.

Too bad because the language itself lends itself to learning by anyone with
an understanding of high school algebra. Iverson et al started scratching
the surface of introducing computing to elementary/high-school students
using APL with success in the 1970s. I myself learned it as my first
programming language by just reading a book and hacking while in 7th grade.
Further info:
THE USE OF APL IN TEACHING — Software Preservation Group

INTRODUCING APL TO TEACHERS — Software Preservation Group

APL in Exposition — Software Preservation Group


Lee Courtney

On Thu, May 2, 2024 at 3:56 AM Liam Proven via cctalk 
wrote:

> On Thu, 2 May 2024 at 00:51, Fred Cisin via cctalk
>  wrote:
> >
> > What would our world be like if the first home computers were to have had
> > APL, instead of BASIC?
>
> To be perfectly honest I think the home computer boom wouldn't have
> happened, and it would have crashed and burned in the 1970s, with the
> result that microcomputers remained firmly under corporate control.
>
> I have been watching the APL world with interest since I discovered it
> at university, and I still don't understand a word of it.
>
> I've been watching Lisp for just 15 years or so and I find it unreadable
> too.
>
> I think there are widely different levels of mental flexibility among
> smart humans and one person's "this just requires a small effort but
> you get so much in return!" is someone else's eternally impossible,
> unclimbable mountain.
>
> After some 40 years in computers now, I still like BASIC best, with
> Fortran and Pascal very distant runners-up and everything else from C
> to Python is basically somewhere between Minoan Linear A and Linear B
> to me.
>
> I think I lack the mental flexibility, and I think I'm better than
> most of hoi polloi.
>
> If the early machines had used something cryptic like APL or Forth I
> reckon we'd never have had a generation of child programmers.
>
> --
> Liam Proven ~ Profile: https://about.me/liamproven
> Email: lpro...@cix.co.uk ~ gMail/gTalk/FB: lpro...@gmail.com
> Twitter/LinkedIn: lproven ~ Skype: liamproven
> IoM: (+44) 7624 277612: UK: (+44) 7939-087884
> Czech [+ WhatsApp/Telegram/Signal]: (+420) 702-829-053
>


-- 
Lee Courtney
+1-650-704-3934 cell


[cctalk] Re: APL (Was: BASIC

2024-05-02 Thread Norman Jaffe via cctalk
I was lucky enough to have worked initially in Focal and FORTRAN at UW 
(Seattle) and moved on to PL/I, Pascal and APL at SFU (Burnaby, B.C.) while 
being exposed to Algol, BASIC, C, GPSS, Smalltalk, Simula, SNOBOL4, XPL and 
many other 'esoteric' languages. 
Of course, various Assemblers were in the 'mix' - 6800/6809, 8080, Z80, IBM 
360/370, IBM 1800, M68K... it's helped me adapt to whatever environment I 
wound up working in. 

From: "Johan Helsingius via cctalk"  
To: "cctalk"  
Cc: "Johan Helsingius"  
Sent: Thursday, May 2, 2024 8:03:38 AM 
Subject: [cctalk] Re: APL (Was: BASIC 

On 02/05/2024 01:51, Fred Cisin via cctalk wrote: 
> What would our world be like if the first home computers were to have had 
> APL, instead of BASIC? 

I don't know, but if you had asked "What would our world be like if the 
first home computers were to have had SmallTalk or even ALGOL instead of 
BASIC?" I would have said "much better". 

I started out with FORTRAN and 6800 assembler, but my first real 
programming job was in BASIC. I am fortunate in that they taught 
me Pascal in university, and I then got exposed to a bunch of other 
real high level languages - if I hadn't, and had continued with 
BASIC, I would probably have ended up as a pretty crap programmer. 

Julf 


[cctalk] Re: APL (Was: BASIC

2024-05-02 Thread Jon Elson via cctalk

On 5/2/24 05:55, Liam Proven via cctalk wrote:

On Thu, 2 May 2024 at 00:51, Fred Cisin via cctalk
 wrote:

What would our world be like if the first home computers were to have had
APL, instead of BASIC?

To be perfectly honest I think the home computer boom wouldn't have
happened, and it would have crashed and burned in the 1970s, with the
result that microcomputers remained firmly under corporate control.


Well, I have my doubts.  I did run ONE program in BASIC on 
my home Z-80 system; it was a VERY crude simulation of an 
electric car.  I had an S-100 Z-80 system from about 1976 
with paper tape. I then got a floppy drive, and later a 
Memorex 10 MB Winchester drive.  I did a LOT of programming 
in assembly language but also did larger programs like cross 
assemblers in Pascal.


Yes, I know a lot of people programmed in BASIC, but I 
didn't find it very good.


Jon



[cctalk] Re: APL (Was: BASIC

2024-05-02 Thread Johan Helsingius via cctalk

On 02/05/2024 01:51, Fred Cisin via cctalk wrote:

What would our world be like if the first home computers were to have had
APL, instead of BASIC?


I don't know, but if you had asked "What would our world be like if the
first home computers were to have had SmallTalk or even ALGOL instead of
BASIC?" I would have said "much better".

I started out with FORTRAN and 6800 assembler, but my first real
programming job was in BASIC. I am fortunate in that they taught
me Pascal in university, and I then got exposed to a bunch of other
real high level languages - if I hadn't, and had continued with
BASIC, I would probably have ended up as a pretty crap programmer.

Julf



[cctalk] Re: APL (Was: BASIC

2024-05-02 Thread Chuck Guzis via cctalk
On 5/2/24 07:02, Paul Koning via cctalk wrote:
> 

> My guess is that the languages you use routinely are the ones that work best, 
> and which languages those are depends on where you work and on what projects. 
> For example, I don't *like* C (I call it a "feebly typed language"), and I 
> don't like C++ either, but my job uses these two plus Python.
> 
> Now Python is actually my favorite (though recently I've done a bunch of work 
> in FORTH).  I like to mention that, in 50 years or so, I have only 
> encountered two programming languages where I went from "no knowledge" to 
> "wrote and debugged a substantial program" in only one week -- Pascal (in 
> graduate school) and Python (one job ago).

Reminds me of the brief trend for so-called "Natural Language"
programming in the '70s.  NLP for short.

"Take the 6rh item on the list and print it"  sort of stuff. Apparently,
it has resurfaced in the AI community.  Problem is that people don't
think like computers, even AI-equipped ones.

--Chuck




[cctalk] Re: APL (Was: BASIC

2024-05-02 Thread Paul Koning via cctalk



> On May 2, 2024, at 6:55 AM, Liam Proven via cctalk  
> wrote:
> 
> On Thu, 2 May 2024 at 00:51, Fred Cisin via cctalk
>  wrote:
>> 
>> What would our world be like if the first home computers were to have had
>> APL, instead of BASIC?
> 
> To be perfectly honest I think the home computer boom wouldn't have
> happened, and it would have crashed and burned in the 1970s, with the
> result that microcomputers remained firmly under corporate control.
> 
> I have been watching the APL world with interest since I discovered it
> at university, and I still don't understand a word of it.
> 
> I've been watching Lisp for just 15 years or so and I find it unreadable too.
> 
> I think there are widely different levels of mental flexibility among
> smart humans and one person's "this just requires a small effort but
> you get so much in return!" is someone else's eternally impossible,
> unclimbable mountain.

That sounds right to me.

> After some 40 years in computers now, I still like BASIC best, with
> Fortran and Pascal very distant runners-up and everything else from C
> to Python is basically somewhere between Minoan Linear A and Linear B
> to me.

Well, Linear B isn't that hard, it's just Greek.  :-)

My guess is that the languages you use routinely are the ones that work best, 
and which languages those are depends on where you work and on what projects.  
For example, I don't *like* C (I call it a "feebly typed language"), and I don't 
like C++ either, but my job uses these two plus Python.

Now Python is actually my favorite (though recently I've done a bunch of work 
in FORTH).  I like to mention that, in 50 years or so, I have only encountered 
two programming languages where I went from "no knowledge" to "wrote and 
debugged a substantial program" in only one week -- Pascal (in graduate school) 
and Python (one job ago).

paul



[cctalk] Re: APL (Was: BASIC

2024-05-02 Thread Murray McCullough via cctalk
I’m not certain what constitutes the original foundations of
BASIC(Beginner’s All-Purpose Symbolic Instruction Code) but to my knowledge
it began with J. G. Kemeny and T. E. Kurtz at Dartmouth College in 1964.
Apple BASIC and GWBASIC were well established when I began experimenting
with them in early 1980’s. By mid-80’s I was running both on a PC and
Coleco ADAM. I wrote a program using GWBASIC for cataloging my books and
magazines.

Happy computing,

Murray 

On Thu, May 2, 2024 at 6:56 AM Liam Proven via cctalk 
wrote:

> On Thu, 2 May 2024 at 00:51, Fred Cisin via cctalk
>  wrote:
> >
> > What would our world be like if the first home computers were to have had
> > APL, instead of BASIC?
>
> To be perfectly honest I think the home computer boom wouldn't have
> happened, and it would have crashed and burned in the 1970s, with the
> result that microcomputers remained firmly under corporate control.
>
> I have been watching the APL world with interest since I discovered it
> at university, and I still don't understand a word of it.
>
> I've been watching Lisp for just 15 years or so and I find it unreadable
> too.
>
> I think there are widely different levels of mental flexibility among
> smart humans and one person's "this just requires a small effort but
> you get so much in return!" is someone else's eternally impossible,
> unclimbable mountain.
>
> After some 40 years in computers now, I still like BASIC best, with
> Fortran and Pascal very distant runners-up and everything else from C
> to Python is basically somewhere between Minoan Linear A and Linear B
> to me.
>
> I think I lack the mental flexibility, and I think I'm better than
> most of hoi polloi.
>
> If the early machines had used something cryptic like APL or Forth I
> reckon we'd never have had a generation of child programmers.
>
> --
> Liam Proven ~ Profile: https://about.me/liamproven
> Email: lpro...@cix.co.uk ~ gMail/gTalk/FB: lpro...@gmail.com
> Twitter/LinkedIn: lproven ~ Skype: liamproven
> IoM: (+44) 7624 277612: UK: (+44) 7939-087884
> Czech [+ WhatsApp/Telegram/Signal]: (+420) 702-829-053
>


[cctalk] Re: APL (Was: BASIC

2024-05-02 Thread Liam Proven via cctalk
On Thu, 2 May 2024 at 00:51, Fred Cisin via cctalk
 wrote:
>
> What would our world be like if the first home computers were to have had
> APL, instead of BASIC?

To be perfectly honest I think the home computer boom wouldn't have
happened, and it would have crashed and burned in the 1970s, with the
result that microcomputers remained firmly under corporate control.

I have been watching the APL world with interest since I discovered it
at university, and I still don't understand a word of it.

I've been watching Lisp for just 15 years or so and I find it unreadable too.

I think there are widely different levels of mental flexibility among
smart humans and one person's "this just requires a small effort but
you get so much in return!" is someone else's eternally impossible,
unclimbable mountain.

After some 40 years in computers now, I still like BASIC best, with
Fortran and Pascal very distant runners-up and everything else from C
to Python is basically somewhere between Minoan Linear A and Linear B
to me.

I think I lack the mental flexibility, and I think I'm better than
most of hoi polloi.

If the early machines had used something cryptic like APL or Forth I
reckon we'd never have had a generation of child programmers.

-- 
Liam Proven ~ Profile: https://about.me/liamproven
Email: lpro...@cix.co.uk ~ gMail/gTalk/FB: lpro...@gmail.com
Twitter/LinkedIn: lproven ~ Skype: liamproven
IoM: (+44) 7624 277612: UK: (+44) 7939-087884
Czech [+ WhatsApp/Telegram/Signal]: (+420) 702-829-053


[cctalk] Re: APL (Was: BASIC

2024-05-01 Thread Bill Gunshannon via cctalk




On 5/1/2024 8:04 PM, Chuck Guzis via cctalk wrote:

On 5/1/24 16:51, Fred Cisin via cctalk wrote:


APL was incredible.  I was amazed.  I was immediately able to do a few
simple things that were useful for my boss and myself, and writing
simple programs within hours.  Its matrix arithmetic was awesome. APL
typeball on a selectric terminal at GSFC, . . .
Some of the keys were re-labeled, but there was a chart on the wall of
which keyboard characters were which APL symbols.


It was indeed.  It was also one of the first languages implemented on a
microprocessor-based personal computer system.  (MCM-70).

To me, APL is logical--strict right-to-left precedence; simple array and
matrix operations.

I've long wondered if we introduced students to APL as a first language,
what our applications code would look like today.


Marist College did.  We had an intern from there when I was at
West Point.  He was no better than any of the interns I later
ran into, and because the only language he was learning at Marist
was APL (after all, this was IBM-Land) he was really not of much
use to us in a Univac-1100 shop.

bill


[cctalk] Re: APL (Was: BASIC

2024-05-01 Thread Bill Gunshannon via cctalk




On 5/1/2024 7:51 PM, Fred Cisin via cctalk wrote:


What would our world be like if the first home computers were to have 
had APL, instead of BASIC?




Maybe not instead of BASIC but I had APL on my TRS-80.

bill



[cctalk] Re: APL (Was: BASIC

2024-05-01 Thread Fred Cisin via cctalk

On Wed, 1 May 2024, Mike Katz wrote:
I remember replacing the character generator eprom (the type with the window 
for UV erasing) on an old ATI EGA video board so that I could have the APL 
character set.


sweet

At least one of the ATI EGA boards had a daughter board available to allow 
using it in Compaq luggables.  (Compaq CGA, Compaq EGA, ATI EGA with 
daughter board)


--
Grumpy Ol' Fred ci...@xenosoft.com


[cctalk] Re: APL (Was: BASIC

2024-05-01 Thread Mike Katz via cctalk
I remember replacing the character generator eprom (the type with the 
window for UV erasing) on an old ATI EGA video board so that I could 
have the APL character set.


On 5/1/2024 7:14 PM, Fred Cisin via cctalk wrote:

APL was incredible.  I was amazed.  I was immediately able to do a few
simple things that were useful for my boss and myself, and writing
simple programs within hours.  Its matrix arithmetic was awesome. APL
typeball on a selectric terminal at GSFC, . . .
Some of the keys were re-labeled, but there was a chart on the wall of
which keyboard characters were which APL symbols.


On Wed, 1 May 2024, Chuck Guzis via cctalk wrote:

It was indeed.  It was also one of the first languages implemented on a
microprocessor-based personal computer system.  (MCM-70).
To me, APL is logical--strict right-to-left precedence; simple array and
matrix operations.
I've long wondered if we introduced students to APL as a first language,
what our applications code would look like today.
My friend Bruce called it "That Iverson Language".
It's interesting to note that the Iverson book was published in 1962,
but an implementation (under 7090 IBSYS) didn't come about until 1965,
although preliminary implementation as PAT had been done on a 1620 (!)
in 1963.


The extended character set was an important obstacle to its 
acceptance. Besides keyboard (masking tape) and output (APL typeball, 
special character generator, or having to substitute combinations of 
characters), many people were unwilling to even try something with a 
different character set.


--
Grumpy Ol' Fred ci...@xenosoft.com




[cctalk] Re: APL (Was: BASIC

2024-05-01 Thread Fred Cisin via cctalk

APL was incredible.  I was amazed.  I was immediately able to do a few
simple things that were useful for my boss and myself, and writing
simple programs within hours.  Its matrix arithmetic was awesome. APL
typeball on a selectric terminal at GSFC, . . .
Some of the keys were re-labeled, but there was a chart on the wall of
which keyboard characters were which APL symbols.


On Wed, 1 May 2024, Chuck Guzis via cctalk wrote:

It was indeed.  It was also one of the first languages implemented on a
microprocessor-based personal computer system.  (MCM-70).
To me, APL is logical--strict right-to-left precedence; simple array and
matrix operations.
I've long wondered if we introduced students to APL as a first language,
what our applications code would look like today.
My friend Bruce called it "That Iverson Language".
It's interesting to note that the Iverson book was published in 1962,
but an implementation (under 7090 IBSYS) didn't come about until 1965,
although preliminary implementation as PAT had been done on a 1620 (!)
in 1963.


The extended character set was an important obstacle to its acceptance. 
Besides keyboard (masking tape) and output (APL typeball, special 
character generator, or having to substitute combinations of characters), 
many people were unwilling to even try something with a different 
character set.


--
Grumpy Ol' Fred ci...@xenosoft.com

[cctalk] Re: APL (Was: BASIC

2024-05-01 Thread Chuck Guzis via cctalk
On 5/1/24 16:51, Fred Cisin via cctalk wrote:

> APL was incredible.  I was amazed.  I was immediately able to do a few
> simple things that were useful for my boss and myself, and writing
> simple programs within hours.  Its matrix arithmetic was awesome. APL
> typeball on a selectric terminal at GSFC, . . .
> Some of the keys were re-labeled, but there was a chart on the wall of
> which keyboard characters were which APL symbols.

It was indeed.  It was also one of the first languages implemented on a
microprocessor-based personal computer system.  (MCM-70).

To me, APL is logical--strict right-to-left precedence; simple array and
matrix operations.
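
To make the contrast concrete: the mean of a vector V in APL is a single
array expression, something like (+/V)÷⍴V, while a line-numbered BASIC of
the era spells the same thing out element by element.  A minimal sketch,
with made-up data:

10 REM  MEAN OF A VECTOR, SPELLED OUT ELEMENT BY ELEMENT IN BASIC
20 REM  MADE-UP DATA; APL DOES THE SAME IN ONE ARRAY EXPRESSION
30 DIM V(5)
40 FOR I = 1 TO 5: READ V(I): NEXT I
50 DATA 3, 1, 4, 1, 5
60 LET S = 0
70 FOR I = 1 TO 5
80 LET S = S + V(I)
90 NEXT I
100 PRINT "MEAN ="; S / 5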

I've long wondered if we introduced students to APL as a first language,
what our applications code would look like today.

My friend Bruce called it "That Iverson Language".

It's interesting to note that the Iverson book was published in 1962,
but an implementation (under 7090 IBSYS) didn't come about until 1965,
although preliminary implementation as PAT had been done on a 1620 (!)
in 1963.

--Chuck




[cctalk] Re: APL (Was: BASIC

2024-05-01 Thread Fred Cisin via cctalk

On Wed, 1 May 2024, Chuck Guzis via cctalk wrote:

To be sure, BASIC was hardly unique in terms of the 1960s interactive
programming languages.  We had JOSS, PILOT, IITRAN and a host of others,
based on FORTRAN-ish syntax, not to forget APL, which was a thing apart.


What would our world be like if the first home computers were to have had 
APL, instead of BASIC?



APL was incredible.  I was amazed.  I was immediately able to do a few 
simple things that were useful for my boss and myself, and writing simple 
programs within hours.  Its matrix arithmetic was awesome. 
APL typeball on a selectric terminal at GSFC, . . .
Some of the keys were re-labeled, but there was a chart on the wall of 
which keyboard characters were which APL symbols.



My cousin (David Ungar) referred to APL as "terse".  He said that you 
could write a word processing program in a single line, but that was well 
past my abilities.


--
Grumpy Ol' Fred ci...@xenosoft.com