>
> No, using "<--" is going in the wrong direction.  We want notation, not
> ASCII soup.

This distinction between notation and soup seems pretty subjective. What is
the difference between soup and notation? In my mind it has a lot to do
with familiarity. I watched that video about programming Conway's Game of
Life in APL and it looks like an incomprehensible soup of symbols to me.

> Another example of ASCII soup is regex.

That's interesting; I feel the same way. I can read most code pretty
quickly, but as soon as I hit a regex it takes me 50x as long to read, and I
have to crack open a reference because I can never remember the notation.
Luckily someone came up with a solution called verbal expressions
<https://github.com/VerbalExpressions/PythonVerbalExpressions> which trades
hard-to-remember symbols for easy-to-understand words! (Though I think the
Python implementation smacks of Java idioms.)
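
To make this concrete, here is a sketch of the trade-off. The VerEx chain
below is adapted from the project's README, so treat the import path and
method names as assumptions rather than a verified API; the plain-re half
is standard library:

    import re

    # The classic URL matcher as a plain regex: terse, but I have to
    # decode it symbol by symbol every time I come back to it.
    url_re = re.compile(r'^(http)(s)?(://)(www\.)?([^ ]*)$')

    print(bool(url_re.match('https://www.python.org')))  # True

    # The same pattern spelled out in words (README-style, assumed API):
    from verbalexpressions import VerEx

    tester = (VerEx()
              .start_of_line()
              .find('http')
              .maybe('s')
              .find('://')
              .maybe('www.')
              .anything_but(' ')
              .end_of_line())

Both describe the same machine behavior; the second just reads like a
sentence instead of a symbol puzzle.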

I'm sure there are people who work with regular expressions on such a
regular basis that they've become fluent, but when you require such deep
immersion in the language before the symbols make sense to you, it's a huge
barrier to entry. You can't then act all confused about why your favorite
language never caught on.

> Without real notation one introduces a huge cognitive load.

What is "real notation". This sounds like a no-true-Scotsman fallacy
<https://en.wikipedia.org/wiki/No_true_Scotsman>. Has everyone on this
message board been communicating with fake ASCII notation this entire time?
Cognative load can come from many different places like:

   1. Having to remember complex key combinations just to get your thoughts
   into code
   2. Having to memorize what each of thousands of symbols does because
   there's no way to look them up in a search engine
   3. Knowing no other notation system that even slightly resembles APL.
   I mean, I know some esoteric mathematics, but I've never seen anything
   that looks even remotely like:
   life←{↑1 ⍵∨.∧3 4=+/,¯1 0 1∘.⊖¯1 0 1∘.⌽⊂⍵}
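
For contrast, here is a rough Python/NumPy sketch of the same Game of Life
step. It is not a line-for-line translation of the APL (the function name
is mine), just the same rule spelled out in words:

    import numpy as np

    def life_step(board):
        # Count each cell's live neighbors by summing the eight shifted
        # copies of the board; np.roll wraps around at the edges, matching
        # the toroidal rotations in the APL one-liner.
        neighbors = sum(
            np.roll(np.roll(board, dy, axis=0), dx, axis=1)
            for dy in (-1, 0, 1)
            for dx in (-1, 0, 1)
            if (dy, dx) != (0, 0)
        )
        # A cell is alive next generation if it has exactly 3 live
        # neighbors, or if it is alive now and has exactly 2.
        return (neighbors == 3) | (board & (neighbors == 2))

    # A glider on an 8x8 toroidal board.
    glider = np.zeros((8, 8), dtype=bool)
    glider[0, 1] = glider[1, 2] = True
    glider[2, 0] = glider[2, 1] = glider[2, 2] = True
    print(life_step(glider).astype(int))

Twenty-odd transparent lines instead of one inscrutable one; I know which
version I could still read next year.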

A big part of Python's philosophy is that you read code far more often than
you write it, so we should optimize for readability. As one Reddit commenter
put it
<https://www.reddit.com/r/programming/comments/1vlbhx/sudoku_solving_program_in_apl_that_blows_my_mind/>:

> APL is such a powerful language. APL is also a powerfully write-only
> language.


And I don't even fully agree there because it somehow manages to be almost
as difficult to write.

> Typing these symbols isn't a problem at all.  For example, in NARS2000, a
> free APL interpreter I use, the assignment operator "←" is entered simply
> with "Alt + [".  It takes seconds to internalize this and never think about
> it again.


For some people. I, myself, have a learning disability and often need to
look at my keyboard. The relationship between "←" and "[" doesn't seem
obvious at all.

> If you download NARS2000 right now you will know how to enter "←"
> immediately because I just told you how to do it.  You will also know
> exactly what it does.  It's that simple.


You know what's even simpler and requires even less cognitive load?  Typing
ASCII characters...

> The other interesting thing about notation is that it transcends language.


The word "notation" refers to symbols, abbreviations, and short-hand that
make up domain-specific languages. Nothing about notation "transcends"
language, notation is a component of language. Math is the study of
patterns. Mathematical notation is what we use to write the language of
patterns, to describe different patterns and communicate ideas about
patterns. There used to be different mathematical languages based on
culture, just like spoken languages. There's no magical property that made
Roman numerals or Arabic numerals just make sense to people from other
cultures; they had to learn each other's notation just like any other
language, and eventually the world settled on Arabic numerals. Maybe things
would have gone differently if the Mayans had had a say.

> It has been my experience that people who have not had the experience
> rarely get it


A pattern I've seen in my experience is that some person or group will put
forth a pretty good idea, and others become dogmatic about that idea, lose
sight of pragmatism, and try to push the idea beyond its practical
applicability. I'm not saying this is you. I haven't yet read the Ken
Iverson paper (I will). My suspicion at this point and after seeing the APL
code demos is that there's probably plenty of good ideas in there, but APL
doesn't strike me as pragmatic in any sense.

On Wed, Nov 6, 2019 at 11:06 AM Martin Euredjian via Python-ideas <
python-ideas@python.org> wrote:

> Thanks for your feedback.  A few comments:
>
> > I do not consider these two things conceptually equivalent. In Python
> > the identifier ('a' in this case) is just a label for the value
>
> I used APL professionally for about ten years.  None of your objections
> ring true.  A simple example comes from mathematics.  The integral symbol
> conveys and represents a concept.  Once the practitioner is introduced to
> the definition of that symbol, what it means, he or she uses it.  It really
> is as simple as that; this is how our brains work.  That's how you recognize
> the letter "A" as corresponding to a sound and as part of words.  This is
> how, in languages such as Chinese, symbols, notation, are connected to
> meaning.  It is powerful and extremely effective.
>
> The use of notation as a tool for thought is a powerful concept that
> transcends programming.  Mathematics is a simple example. So is music.
> Musical notation allows the expression of ideas and massively complex works
> as well as their creation.  In electronics we have circuit diagrams, which
> are not literal depictions of circuits but rather a notation to represent
> them, to think about them, to invent them.
>
> The future of computing, in my opinion, must move away --perhaps not
> entirely-- from ASCII-based typing of words.  If we want to be able to
> express and think about programming at a higher level we need to develop a
> notation.  As AI and ML evolve this might become more and more critical.
>
> APL, sadly, was too early.  Machines of the day were literally inadequate
> in almost every respect.  It is amazing that the language went as far as it
> did.  Over 30+ years I have worked with over a dozen languages, ranging
> from low level machine code through Forth, APL, Lisp, C, C++, Objective-C,
> and all the "modern" languages such as Python, JS, PHP, etc.  Programming
> with APL is a very different experience.  Your mind works differently.  I
> can only equate it to writing orchestral scores in the sense that the
> symbols represent very complex textures and structures that your mind
> learns to imagine and manipulate in real time.  You think about spinning,
> crunching, slicing and manipulating data structures in ways you never really
> think about when using any other language.  Watch the videos I link to
> below for a taste of these ideas.
>
> Anyhow, obviously the walrus operator is here to stay.  I am not going to
> change anything.  I personally think this is sad and a wasted opportunity
> to open a potentially interesting chapter in the Python story; the mild
> introduction of notation and a path towards evolving a richer notation over
> time.
>
> > Second point, I can write := in two keystrokes, but I do not have a
> > dedicated key for the arrow on my keyboard. Should '<--' also be an
> > acceptable syntax?
>
> No, using "<--" is going in the wrong direction.  We want notation, not
> ASCII soup.  One could argue even walrus is ASCII soup.  Another example of
> ASCII soup is regex.  Without real notation one introduces a huge cognitive
> load.  Notation makes a massive difference.  Any classically trained
> musician sees this instantly.  If we replaced musical notation with
> sequences of two or three ASCII characters it would become an
> incomprehensible mess.
>
> Typing these symbols isn't a problem at all.  For example, in NARS2000, a
> free APL interpreter I use, the assignment operator "←" is entered simply
> with "Alt + [".  It takes seconds to internalize this and never think about
> it again.  If you download NARS2000 right now you will know how to enter "
> ←" immediately because I just told you how to do it.  You will also know
> exactly what it does.  It's that simple.
>
> The other interesting thing about notation is that it transcends
> language.  So far all conventional programming languages have been rooted
> in English.  I would argue there is no need for this in a programming
> notation, just as mathematical and musical notations have demonstrated
> that they transcend spoken languages.  Notation isn't just a tool for
> thought, it adds a universal element that is impossible to achieve in any
> other way.
>
>
> Anyhow, again, I am not going to change a thing.  I am nobody in the
> Python world.  Just thought it would be interesting to share this
> perspective because I truly think this was a missed opportunity.  If
> elegance is of any importance, having two assignment operators when one can
> do the job, as well as evolve the language in the direction of an exciting
> and interesting new path is, at the very least, inelegant.  I can only
> ascribe this to the fact that very few people involved in this process,
> if any, had any real experience with APL.  One has to use APL for real
> work and for at least a
> year or two in order for your brain to make the mental switch necessary to
> understand it.  Just messing with it casually isn't good enough.  Lots of
> inquisitive people have messed with it, but they don't really understand it.
>
>
> I encourage everyone to read this Turing Award presentation:
>
> "Notation as a Tool of Thought" by Ken Iverson, creator of APL
> http://www.eecg.toronto.edu/~jzhu/csc326/readings/iverson.pdf
>
>
> Also, if you haven't seen them, these videos are very much worth watching:
>
> Conway's Game of Life in APL
> https://www.youtube.com/watch?v=a9xAKttWgP4
>
> Sudoku solver in APL
> https://www.youtube.com/watch?v=DmT80OseAGs
>
>
> -Martin
>
>
>
> On Tuesday, November 5, 2019, 11:54:45 PM PST, Richard Musil <
> risa20...@gmail.com> wrote:
>
>
> On Wed, Nov 6, 2019 at 5:32 AM martin_05--- via Python-ideas <
> python-ideas@python.org> wrote:
>
> > In other words, these two things would have been equivalent in Python:
> >
> >     a ← 23
> >
> >     a = 23
>
>
> I do not consider these two things conceptually equivalent. In Python the
> identifier ('a' in this case) is just a label for the value; I can imagine
> "let 'a' point to the value of 23 now" and write it this way: "a --> 23",
> but "a <-- 23" does give an impression that 23 points to, or is somehow fed
> into, 'a'. This may give false expectations to those who are coming to
> Python from another language and might expect the "l-value" behavior in
> Python.
>
> Second point, I can write := in two keystrokes, but I do not have a
> dedicated key for the arrow on my keyboard. Should '<--' also be an
> acceptable syntax?
>
> Richard
