Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-08 Thread Lie Ryan

On 03/30/2012 06:25 AM, Steve Howell wrote:

On Mar 29, 11:53 am, Devin Jeanpierre jeanpierr...@gmail.com wrote:


Well, what sort of language differences make for English vs Mandarin?
Relational algebraic-style programming is useful, but definitely a
large language barrier to people that don't know any SQL. I think this
is reasonable. (It would not matter even if you gave SQL python-like
syntax, the mode of thinking is different, and for a good reason.)



I don't see any fundamental disconnect between SQL thinking and Python
thinking.

List comprehensions are very close to SQL SELECTs semantically, and
not that far off syntactically.

   [row.x for row in foo if row.x == 3]

   select x from foo where x = 3


which is where most people get it wrong; the SQL SELECT statement does 
not specify how the machine is going to get its answer, while 
Python's list comprehension explicitly specifies that the machine is 
going to loop over foo. In most implementations of SQL with the proper 
indexes set up, the SELECT statement above will most likely just use the 
index to avoid looping over the whole of foo, and the smartest ones 
might notice that the result query only ever contains 3 and so just use 
the count of the index (I don't know if any existing SQL engine is 
*that* smart, though).
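Lie Ryan's point is easy to check empirically. A minimal sketch using SQLite from Python's standard library (the table and index names here are illustrative, not from the thread): with an index on x, the query plan reports an index search rather than a scan of foo.

```python
import sqlite3

# Build a small table with an index on x (names are illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE foo (x INTEGER)")
conn.execute("CREATE INDEX idx_foo_x ON foo (x)")
conn.executemany("INSERT INTO foo (x) VALUES (?)",
                 [(i % 10,) for i in range(1000)])

# Ask SQLite how it would execute the SELECT from the example above.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT x FROM foo WHERE x = 3").fetchall()
print(plan)  # the detail column mentions idx_foo_x, not a full table scan
```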


--
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-05 Thread Nathan Rice
Re-trolling.

On Wed, Apr 4, 2012 at 1:49 AM, Steven D'Aprano
steve+comp.lang.pyt...@pearwood.info wrote:
 As part of my troll-outreach effort, I will indulge here.  I was
 specifically thinking about some earlier claims that programming
 languages as they currently exist are somehow inherently superior to a
 formalized natural language in expressive power.

 I would argue that they are, but only for the very limited purpose for
 which they are written. With the possible exception of Inform 7, most
 programming languages are useless at describing (say) human interactions.

I was thinking about this statement this morning.  Compression is just
the process of finding a more efficient encoding for information.  I
suppose you could say then that language developers could
theoretically be trying to compress natural language representations
of information.  The problem then is that everyone is doing a horrible
job, because they are approaching the subject in an ad hoc manner.
There are multiple branches of mathematics and computer science that
deal with this exact subject in a rigorous way.  The trick is to find
an encoding that has low space complexity, and for which the
transformation to knowledge is efficient, for human beings.

Let's assume that the inputs to be encoded are logical
(proposition/predicate) statements. The first thing that came to mind
when thinking this way is radix trees and directed acyclic word graphs
(a form of DFA).  These structures are fairly easy to work out on
paper given a set of inputs, and it is fairly easy to reconstruct a
set of inputs from the structure.  Perhaps, we could use natural
language statements, and some very minimal extended syntax to indicate
a data structure (which fans out to a set of statements).  As a quick
example to see what I mean (mimicking some python syntax for
similarity):

in the context of chess:

a color is either white or black

the board:
is a cartesian grid having dimension (8, 8)
has squares, representing points on the grid

a square:
has a color
contains a piece or is empty

a piece:
has a color
is located in a square or has been captured

a { king, queen, rook, bishop, knight, pawn } is a type of piece

It should be clear that this is a directed acyclic phrase graph, and
if you select a phrase fragment, then one phrase fragment from each
child level until reaching a leaf, the concatenation of the phrase
fragments forms a logical phrase.  Note that the set braces are
shorthand for multiple statements.  This was really easy to write, and
I bet even non-programmers would have little or no trouble
understanding what was going on.  Additionally, I could make a full
statement elsewhere, and if we have an algorithm to transform to a
canonical phrase structure and merge synonyms, it could be inserted in
the phrase graph, just as neatly as if I had written it there in the
first place.  The sexy thing about that is that it lets you take two
sets of propositional statements, and perform set theoretic operations
on them (union, complement, etc), and get a phrase graph structure out
at the end which looks just like a nice neat little program.  You
could even get really crazy, if you could define equivalence relations
(other than the natural relation) for the union (Set1.A ~ Set2.B) as
that would let you compose the graphs in arbitrarily many ways.  If
you're dealing with processes, you would also want to be able to specify
temporal equivalence (Process1.T1 ~ Process2.T6).
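The selection process described above (one phrase fragment per child level until reaching a leaf) is easy to mechanize. A hypothetical sketch, using nested dicts as the phrase graph; this data structure is my own illustration, not something proposed in the thread:

```python
# Each key is a phrase fragment; its value maps child fragments to their
# own children.  A leaf is an empty dict.  Walking root-to-leaf and
# concatenating fragments reconstructs the full set of statements.
def statements(fragment, children):
    if not children:
        yield fragment
        return
    for child, grandchildren in children.items():
        for rest in statements(child, grandchildren):
            yield fragment + " " + rest

board = {
    "is a cartesian grid having dimension (8, 8)": {},
    "has squares, representing points on the grid": {},
}
square = {"has a color": {}, "contains a piece": {}, "is empty": {}}

for s in statements("the board", board):
    print(s)
for s in statements("a square", square):
    print(s)
```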
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-04 Thread Steven D'Aprano
On Tue, 03 Apr 2012 08:39:14 -0400, Nathan Rice wrote:

 Much like
 with the terminal to GUI transition, you will have people attacking
 declarative natural language programming as a stupid practice for noobs,
 and the end of computing (even though it will allow people with much
 less experience to be more productive than them).

I cry every time I consider GUI programming these days.

In the late 1980s and early 1990s, Apple released a product, Hypercard, 
that was a combination GUI framework and natural-ish language programming 
language. It was an astonishing hit with non-programmers, as it allowed 
people to easily move up from point and click programming to real 
programming as their skills improved.

Alas, it has been abandoned by Apple, and while a few of its intellectual 
successors still exist, it is very niche. 

I *really* miss Hypercard. Not so much for the natural language syntax, 
as for the astonishingly simple and obvious GUI framework.

To get a flavour of the syntax, see OpenXION:

http://www.openxion.org

and for a hint of the framework, see Pythoncard:

http://pythoncard.sourceforge.net


 Ultimately, the answers to your questions exist in the world for you to
 see.  How does a surgeon describe a surgical procedure?  How does a chef
 describe a recipe?  How does a carpenter describe the process of
 building cabinets?  Aside from specific words, they all use natural
 language, and it works just fine.

No they don't. In general they don't use written language at all, but 
when they are forced to, they use a combination of drawings or 
illustrations plus a subset of natural language plus specialist jargon.

Programming languages include both specialist grammar and specialist 
semantics. That makes it a cant or an argot.



-- 
Steven
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-04 Thread Nathan Rice
On Wed, Apr 4, 2012 at 1:49 AM, Steven D'Aprano
steve+comp.lang.pyt...@pearwood.info wrote:
 On Tue, 03 Apr 2012 13:17:18 -0400, Nathan Rice wrote:

 I have never met a programmer that was not completely into computers.
 That leaves a lot unspecified though.

 You haven't looked hard enough. There are *thousands* of VB, Java, etc.
 code monkeys who got into programming for the money only and who have
 zero inclination to expand their skills or knowledge beyond that
 necessary to keep their job.

Every programmer that I've ever met who got into it for the money has
washed out within about five years.  Sometimes they make a lateral
move to project management, other times they end up as requirements
analysts, and occasionally they become technical sales staff.  The
story is always the same - they do mediocre technical work, but get
along well with their peers, so they are transitioned to a role that
requires more people skills.

I've never met someone who had both poor people skills and mediocre
technical skills who actually kept their job.

 Go to programming blogs, and you will find many examples of some
 allegedly professional programmer selecting an arbitrary blog post to ask
 "Pls sombody write me this code", where "this code" is either an utterly
 trivial question or a six month project.

Honestly, I have seen that, but usually when I inspect more closely it is
an Indian oDesk or Rent-a-Coder worker who oversold himself and is trying
to cover his ass.

 As part of my troll-outreach effort, I will indulge here.  I was
 specifically thinking about some earlier claims that programming
 languages as they currently exist are somehow inherently superior to a
 formalized natural language in expressive power.

 I would argue that they are, but only for the very limited purpose for
 which they are written. With the possible exception of Inform 7, most
 programming languages are useless at describing (say) human interactions.

 Human languages are optimised for many things, but careful, step-by-step
 algorithms are not one of them. This is why mathematicians use a
 specialist language for their problem domain, as do programmers. Human
 language is awfully imprecise and often ambiguous, it encourages implicit
 reasoning, and requires a lot of domain knowledge:

You have to be careful when you bring mathematical notation into the
picture.  Remember that mathematics has developed over thousands of
years, with developments shared in many languages.  Greek letters
serve the same purpose in math that Latin and Greek names serve in
biology - they are neutral and avoid confusion with common names in
living languages.  Not everything about mathematical notation is
good, and in some cases it suffers the same issues that programming
does.  Mathematicians have a tendency to be very terse, and although
some Greek letters and symbols have standard meanings, many authors run
roughshod over them.  Logic is somewhat better than math in this
regard: logicians respect their notation and rarely deviate from the
standard meaning of symbols.  Things ARE getting better, but for the
most part it is still kind of a mess.

Also, I should clarify that I consider part of mathematical notation
to be natural language, namely +/-/*, and rational numbers.  People
discover these things on their own, mathematics just provides rigor.
 It is considered bad form to use them in prose, but that is just an
arbitrary style restriction; children intermix mathematical symbols
and language all the time, as do older students taking notes in a
variety of subjects.

    Joe snatched the hammer from Fred. "Hey," he said, "what are
    you doing? Don't you know that he'll hit the roof if he catches
    you with that?"

Are you trying to get me to write obfuscated code?  You can write
ambiguous garbage in any language.

 The crux of my view is that programming languages exist in part because
 computers in general are not smart enough to converse with humans on
 their own level, so we have to talk to them like autistic 5 year-olds.
 That was fine when we didn't have any other options, but all the pieces
 exist now to let computers talk to us very close to our own level, and
 represent information at the same way we do.

 I think you're dreaming. We (that is to say, human beings in general, not
 you and I specifically) cannot even talk to each other accurately,
 precisely and unambiguously all the time. Natural language simply isn't
 designed for that -- hence we have specialist languages like legal
 jargon, mathematics, and programming languages, for specialist purposes.

Legalese is English with a ton of new words.  Mathematics is older
than most languages in current use and has a lot of baggage that is
(very) slowly being dealt with.

Programming really can't take the relaxed attitude about cleaning up
notation and vocabulary that we see in math.  Mathematicians spend a
lot of time thinking, and the transcription of their thoughts is a
relatively minor portion of their 

Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-04 Thread rusi
On Apr 3, 11:42 pm, Nathan Rice nathan.alexander.r...@gmail.com
wrote:
 Lets start with some analogies.  In cooking, chefs use recipes to
 produce a meal; the recipe is not a tool.  In architecture, a builder
 uses a blueprint to produce a building; the blueprint is not a tool.
 In manufacturing, expensive machines use plans to produce physical
 goods; the plans are not the tool.

 You could say the compiler is a tool, or a development environment is
 a tool.  The programming language is a mechanism for communication.

Long personal note ahead.
tl;dr version: Computers are such a large shift for human civilization
that generally we don't get what that shift is about or towards.
--
Longer version
My mother often tells me (with some awe): "You are so clever! You know
how to use computers!" (!?!?)

I try to tell her that a computer is not a machine like a car is (she
is better with things like cars than most of her generation).  Its
physical analogy to a typewriter is surprisingly accurate.  In fact
it's more like a pen than other machines, and its civilizational
significance is larger than Gutenberg's press and on par with the
'invention' (or should I say discovery?) of language as a fundamental
fact of what it means to be human.

[At this point or thereabouts my communication attempt breaks down
because I am trying to tell her of the huge significance of
programming...]

A pen can be used to write a love-letter or a death-sentence, a
text-book of anatomy or a symphony.
And yet it would be a bizarre superman who could do all these.
Likewise (I vainly try to communicate to my mother!) I can't
design machines (with AutoCAD) or paint (with Photoshop) or ...
probably 99% of the things that people use computers for.
And so saying that I 'know computers' is on par with saying that
because I know (how to use a pen to) fill up income tax forms, I
should also know how to (use a pen to) write Shakespearean sonnets.

There is a sense in which a pen is a 'universal device.'  To some
extent the layman can get this.
There is a larger sense in which the computer is a universal device
(aka universal turing machine).
In my experience, not just 'my mother's' but even PhDs in computer
science don't get what this signifies.

This sense can (somewhat?) be appreciated if we see that the pen is
entirely a declarative tool.
The computer is declarative+imperative.
The person who writes the love-letter needs the postman to deliver it.
The judge may write the death-sentence. A hangman is needed to execute
it.
When it comes to computers, the same device can write the love-letter/
death-sentence as the one which mails/controls the electric chair.

Let me end with a quote from Dijkstra: 
http://www.smaldone.com.ar/documentos/ewd/EWD1036_pretty.html

In the long run I expect computing science to transcend its parent
disciplines, mathematics and logic, by effectively realizing a
significant part of Leibniz's Dream of providing symbolic calculation
as an alternative to human reasoning. (Please note the difference
between "mimicking" and "providing an alternative to": alternatives
are allowed to be better.)

Needless to say, this vision of what computing science is about is not
universally applauded. On the contrary, it has met widespread --and
sometimes even violent-- opposition from all sorts of directions. I
mention as examples

(0) the mathematical guild, which would rather continue to believe
that the Dream of Leibniz is an unrealistic illusion

(1) the business community, which, having been sold to the idea that
computers would make life easier, is mentally unprepared to accept
that they only solve the easier problems at the price of creating much
harder ones

(2) the subculture of the compulsive programmer, whose ethics
prescribe that one silly idea and a month of frantic coding should
suffice to make him a life-long millionaire

(3) computer engineering, which would rather continue to act as if it
is all only a matter of higher bit rates and more flops per second

(4) the military, who are now totally absorbed in the business of
using computers to mutate billion-dollar budgets into the illusion of
automatic safety

(5) all soft sciences for which computing now acts as some sort of
interdisciplinary haven

(6) the educational business that feels that, if it has to teach
formal mathematics to CS students, it may as well close its schools.
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-04 Thread Steve Howell
On Apr 3, 11:19 pm, Steven D'Aprano steve
+comp.lang.pyt...@pearwood.info wrote:
 On Tue, 03 Apr 2012 08:39:14 -0400, Nathan Rice wrote:
  Much like
  with the terminal to GUI transition, you will have people attacking
  declarative natural language programming as a stupid practice for noobs,
  and the end of computing (even though it will allow people with much
  less experience to be more productive than them).

 I cry every time I consider GUI programming these days.

 In the late 1980s and early 1990s, Apple released a product, Hypercard,
 that was a combination GUI framework and natural-ish language programming
 language. It was an astonishing hit with non-programmers, as it allowed
 people to easily move up from point and click programming to real
 programming as their skills improved.

 Alas, it has been abandoned by Apple, and while a few of its intellectual
 successors still exist, it is very niche.

 I *really* miss Hypercard. Not so much for the natural language syntax,
 as for the astonishingly simple and obvious GUI framework.

 To get a flavour of the syntax, see OpenXION:

 http://www.openxion.org

 and for a hint of the framework, see Pythoncard:

 http://pythoncard.sourceforge.net

  Ultimately, the answers to your questions exist in the world for you to
  see.  How does a surgeon describe a surgical procedure?  How does a chef
  describe a recipe?  How does a carpenter describe the process of
  building cabinets?  Aside from specific words, they all use natural
  language, and it works just fine.

 No they don't. In general they don't use written language at all, but
 when they are forced to, they use a combination of drawings or
 illustrations plus a subset of natural language plus specialist jargon.

 Programming languages include both specialist grammar and specialist
 semantics. That makes it a cant or an argot.

The building cabinets problem is interesting:

  1. To actually build a cabinet, there's a lot of domain knowledge
that's probably implicit in most circumstances.  A carpenter might
tell another carpenter which hinge to use, but they won't have to talk
about why doors need hinges or how to do the assembly.
  2. It's quite common for humans to use computer programs as part of
the design process.
  3. Often, the output of a CAD program (at the file level) is some
sort of vector representation that only describes the end product
(basic dimensions, etc.).

I wonder if there are mini-languages out there that allow you to
describe cabinets in a very descriptive way, where the description
easily translates to the actual steps of building the cabinet, not
just the final dimensions.


-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-04 Thread Nathan Rice
 Long personal note ahead.
 tl;dr version: Computers are such a large shift for human civilization
 that generally we don't get what that shift is about or towards.

Another option: since *computers* are such a general device, there
isn't just one notion.

 In the long run I expect computing science to transcend its parent
 disciplines, mathematics and logic, by effectively realizing a
 significant part of Leibniz's Dream of providing symbolic calculation
 as an alternative to human reasoning. (Please note the difference
 between "mimicking" and "providing an alternative to": alternatives
 are allowed to be better.)

A thinking machine.  +1.

 Needless to say, this vision of what computing science is about is not
 universally applauded. On the contrary, it has met widespread --and
 sometimes even violent-- opposition from all sorts of directions. I
 mention as examples

 (0) the mathematical guild, which would rather continue to believe
 that the Dream of Leibniz is an unrealistic illusion

Mathematics is not a closed guild; it is large and contentious.  Ideas
live and die in mathematics based on their fundamental truth.  If
there is some bold, sweeping statement it *MIGHT* be possible to prove
or disprove, mathematicians will be all over it.  Just look at
Fermat's last theorem and the Poincare conjecture if you want proof of
this.

 (1) the business community, which, having been sold to the idea that
 computers would make life easier, is mentally unprepared to accept
 that they only solve the easier problems at the price of creating much
 harder ones

Most business people I know secretly love when they can sell a
solution to one problem that creates new problems (and thus
opportunities for new products!).  The business term for this is an
Upsell or Value-add.

 (2) the subculture of the compulsive programmer, whose ethics
 prescribe that one silly idea and a month of frantic coding should
 suffice to make him a life-long millionaire

I love hacker culture, but it has been infected by the idea of
entrepreneurship as a good in and of itself.  Being a creator is a
beautiful thing, go forth and make *art*.  Improve the human
condition.  Make the world a better place.  STFU about venture capital
and stage 2 funding and minimum viable products; that sort of talk is
a sure sign that you haven't created anything of actual value.

 (3) computer engineering, which would rather continue to act as if it
 is all only a matter of higher bit rates and more flops per second

These guys are doing something that I find very uninteresting, but is
absolutely necessary.  Bravo I say.

 (4) the military, who are now totally absorbed in the business of
 using computers to mutate billion-dollar budgets into the illusion of
 automatic safety

Nations will always try to be imperialist.  At least drones and robot
soldiers mean less human suffering.

 (5) all soft sciences for which computing now acts as some sort of
 interdisciplinary haven

Digital humanities (outside of a VERY small set of projects) is a
joke.  Multimedia history presentations (and what not) are the domain
of edutainment companies, not academia.

 (6) the educational business that feels that, if it has to teach
 formal mathematics to CS students, it may as well close its schools.

I feel quite the opposite actually.  At the really top notch computer
science schools, there is a clear mathematical bent (though it is
interdisciplinary).  Places like MIT, Stanford, Berkeley, CMU,
Cambridge, etc make a STRONG effort to separate the
mathematical/theory of computation side and engineering side.  At your
average state college, the computer science department is just a
hodgepodge, and you tend to see more graphics, applied computation
and embedded/DSP type people.
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-04 Thread Nathan Rice
 The building cabinets problem is interesting:

  1. To actually build a cabinet, there's a lot of domain knowledge
 that's probably implicit in most circumstances.  A carpenter might
 tell another carpenter which hinge to use, but they won't have to talk
 about why doors need hinges or how to do the assembly.
  2. It's quite common for humans to use computer programs as part of
 the design process.
  3. Often, the output of a CAD program (at the file level) is some
 sort of vector representation that only describes the end product
 (basic dimensions, etc.).

 I wonder if there are mini-languages out there that allow you to
 describe cabinets in a very descriptive way, where the description
 easily translates to the actual steps of building the cabinet, not
 just the final dimensions.

I think if you were to describe the parts of the cabinet that needed
to be assembled separately (and thus could be viewed as separate
entities in some sense) and showed the cabinet as the composition of
those parts, you would be on the right track.  Being a mediocre
carpenter, I can't really say anything conclusively here though :)
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Chris Angelico
On Tue, Apr 3, 2012 at 8:05 AM, Dennis Lee Bieber wlfr...@ix.netcom.com wrote:
 On Thu, 29 Mar 2012 08:48:53 -0700 (PDT), Steve Howell
 showel...@yahoo.com declaimed the following in
 gmane.comp.python.general:

        REXX is inhibited by the architectures to which it has been ported
 -- limiting the ADDRESS targets to variations of Python's os.system() or
 popen() calls; even the subprocess module can't get beyond the all or
 nothing execution model.

        Much different from the original IBM environment (and, biased, the
 Amiga implementation may have gone beyond the IBM one in capabilities)
 wherein compliant programs become the command shell for REXX command
 processing -- actual bidirectional interactive interprocess
 communication. Windows' COM system offers some of that capability, but
 buried in a cryptic object programming system -- nothing like having a
 script sending the /same/ command that one would use in a text interface
 to the target program.

Some years ago, I wrote a MUD that used REXX as its scripting
language. (The server's still running, but not so much to be a MUD as
to be my administrative interface to that particular box. I'm like
that with MUDs.) I thought it was really cool to be able to simply put
a bare string and have that get sent to the player - for instance:

/* Command handler for some particular location in the MUD */
if arg(1)="foo" then do
"You begin to foo."
/* do some stuff */
"You finish fooing."
end

Having now built MUDs in Pike, I'm not so impressed with the syntax.
But hey, it's a completely different use of the ADDRESS command! :)
And of course, I can always use ADDRESS CMD blah blah to execute
commands.

On Tue, Apr 3, 2012 at 10:25 AM, Steve Howell showel...@yahoo.com wrote:
 On Apr 2, 2:50 pm, Chris Angelico ros...@gmail.com wrote:
 Hmm... How do you pipe one command's output into another's input using
 Python? It's not nearly as clean as it is in bash.

 For pipes, I'd still call out to bash.  I know that's cheating, but
 the idea is that Python can wrap all the good parts of bash while
 still allowing you to use Python's more modern syntax, standard
 library, etc.

So, it's not that Python is a superset of bash, but that Python+bash
is a superset of bash. Well, that is certainly understandable. And
needn't be too onerous syntactically either:

from os import system as x

x('do_stuff')
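For what it's worth, a single pipe can also be expressed in the standard library without shelling out to bash; a sketch (the commands are arbitrary examples), equivalent to `echo hello | tr a-z A-Z`:

```python
import subprocess

# Pipe one command's output into another's input (Unix tools assumed).
p1 = subprocess.Popen(["echo", "hello"], stdout=subprocess.PIPE)
p2 = subprocess.Popen(["tr", "a-z", "A-Z"], stdin=p1.stdout,
                      stdout=subprocess.PIPE)
p1.stdout.close()  # let p1 receive SIGPIPE if p2 exits first
out, _ = p2.communicate()
print(out)  # b'HELLO\n'
```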

ChrisA
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Nathan Rice
On Tue, Apr 3, 2012 at 1:40 AM, alex23 wuwe...@gmail.com wrote:
 On Apr 3, 2:55 pm, Nathan Rice nathan.alexander.r...@gmail.com
 wrote:
 I don't care what people do related to legacy systems.

 And that's what earns you the label 'architecture astronaut'. Legacy
 systems are _part_ of the problem; it's very easy to  hold to a purist
 approach when you ignore the bulk of the domain that causes the
 issues. There's _never_ going to be an InfoTech3k where we just stop
 supporting older code.

There are people who are paid pretty well to support crappy old COBOL
apps, but I am not among them (nor are you, with very high
likelihood), so your "we" is misplaced.  For all intents and purposes,
that software exists in an alternate reality.

Remember the tutorial on global vs local optimization I made
previously?  Let me distill it... If you are unwilling to endure pain
to move towards a better world you will always be trapped in a
sub-optimal situation.

 I do care about programmers that are too lazy to
 learn, and would be happy to ignore the fact that programming is hard
 for most people to learn, so they can continue not learning.  Those
 programmers are scumbags.

 Wait, what?

 Programmers are both "too lazy to learn" and yet somehow happy that
 the skills they've acquired are "too hard for most people to learn"?
 So how did they learn them?

 And they're also somehow lazy because they have to learn multiple
 languages to be effective,  rather than one mythical ur-language?

 In my 20 years as a software developer, I have _never_ encountered
 anyone trying to deliberately expand the knowledge gap. This isn't a
 priesthood.

Did you miss the part where I said that most people who learn to
program are fascinated by computers and highly motivated to do so?
I've never met a BROgrammer, those people go into sales.  It isn't
because there aren't smart BROmosapiens (sadly, there are), they just
couldn't give two shits about computers so programming seems like a
colossal waste of time to them.

It isn't about people scheming to disempower the plebs; rather, it
is about people who don't want to move outside their comfort zone.
You can talk about people learning multiple languages all you want,
but for the most part they will be 10 descendants of ALGOL, with minor
variations.  Very few people are willing to tackle something like
Haskell or ML if they weren't taught functional programming in
university, though there are a few that view it as an endurance trial
or mountain to climb.  Those people get a pass on most of what I've
said thus far.

 Just don't let me hear you complaining because some syntax is not C
 like enough for you.  Whenever I hear that I want to strangle the
 self-serving 'tard that wrote it.  When I see people defending C
 like syntax as optimal or somehow much more expressive, that makes me
 doubly irritated.  These are the people who are selfishly defending
 the status quo because they're invested.

 Syntax is never the issue, it's the deeper semantics. Is the scoping
 of one C-like language the same as C? How does it differ? Why does it
 differ? Is the difference a fundamental implementation issue that you
 really need to know before you actually grok the language? Are
 functions first-class objects? Are they actual objects or some kind of
 magical stub? Can you extend those objects with properties? etc etc

Syntax and semantics are both a big mess right now.  That is why I
always address them both.

 Every language tackles _so many_ things differently. It's not lazy to
 say that you prefer something to resemble/be based on a language you
 have experience with, that's human nature. If you're insistent that
 your non-typical syntax is so much better, the onus is on you to prove
 it, not to insist that the lack of uptake is 'laziness'.

The winds of change generally blow for programming when generations of
older programmers leave the workforce.  Alan Kay was a smart man,
viewing programming as an educational tool and designing for youth is
absolutely the right way to do things.  If you try to retrain older
programmers, you are basically telling them they have to change
decades of learning for a moderate (but not huge) productivity
increase, so that programming is accessible to a much wider group of
people.  Much like with the terminal to GUI transition, you will have
people attacking declarative natural language programming as a stupid
practice for noobs, and the end of computing (even though it will
allow people with much less experience to be more productive than
them).

 And one again: code is _communication_. Not having to understand new
 optimal patterns for every single language is a Good Thing.

Code is a horrible medium for communication.  If it weren't, I
wouldn't be trolling this thread.

 Don't try to delude people that our modern
 ALGOL derivatives are the best possible way to model knowledge
 (including process knowledge) to a computer, because that is a lie.

 Um, okay, I'll stop doing 

Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread rusi
On Apr 3, 5:39 pm, Nathan Rice nathan.alexander.r...@gmail.com
wrote:

 Don't think underlying, instead think canonical.

 Ultimately, the answers to your questions exist in the world for you
 to see.  How does a surgeon describe a surgical procedure?  How does a
 chef describe a recipe?  How does a carpenter describe the process of
 building cabinets?  Aside from specific words, they all use natural
 language, and it works just fine.

A carpenter describes his carpentry-process in English
A CSist describes his programming-process in English (at least all my
CS books are in English)

A carpenter uses his tools -- screwdriver, saw, planer --to do
carpentry
A programmer uses his tools to do programming -- one of which is
called 'programming language'

Doing programming without programming languages is like using toenails
to tighten screws
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Mark Lawrence

On 03/04/2012 14:51, rusi wrote:

On Apr 3, 5:39 pm, Nathan Ricenathan.alexander.r...@gmail.com
wrote:


Don't think underlying, instead think canonical.

Ultimately, the answers to your questions exist in the world for you
to see.  How does a surgeon describe a surgical procedure?  How does a
chef describe a recipe?  How does a carpenter describe the process of
building cabinets?  Aside from specific words, they all use natural
language, and it works just fine.


A carpenter describes his carpentry-process in English
A CSist describes his programming-process in English (at least all my
CS books are in English)

A carpenter uses his tools -- screwdriver, saw, planer --to do
carpentry
A programmer uses his tools to do programming -- one of which is
called 'programming language'

Doing programming without programming languages is like using toenails
to tighten screws


The latter is extremely difficult if you bite your toenails :)

--
Cheers.

Mark Lawrence.



Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Chris Angelico
On Wed, Apr 4, 2012 at 12:26 AM, Mark Lawrence breamore...@yahoo.co.uk wrote:
 On 03/04/2012 14:51, rusi wrote:
 Doing programming without programming languages is like using toenails
 to tighten screws


 The latter is extremely difficult if you bite your toenails :)

I agree, thumbnails are far better suited. Mine are often pushed into
that service. But to extend the analogy: Using a thumbnail to tighten
a screw is like directly patching a binary to fix a bug. It works, but
it's not exactly a practical way to build a system.

ChrisA
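Directly patching a binary really is just a seek-and-write. A hedged sketch of
the idea (the file contents and offset are invented; a throwaway temp file
stands in for a real executable):

```python
import os
import tempfile

def patch_binary(path, offset, new_bytes):
    """Overwrite bytes in place at a given offset -- the crude fix
    likened above to tightening a screw with a thumbnail."""
    with open(path, "r+b") as f:
        f.seek(offset)
        f.write(new_bytes)

# Demonstrate on a throwaway file rather than a real executable.
fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "wb") as f:
    f.write(b"\x00\x00\xEB\xFE")    # pretend machine code; the "bug" is a jump

patch_binary(path, 2, b"\x90\x90")  # overwrite the jump with two NOP bytes

with open(path, "rb") as f:
    patched = f.read()
print(patched)                      # b'\x00\x00\x90\x90'
os.remove(path)
```

It works, but as noted, it is no way to build a system: the source and the
binary immediately begin to diverge.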


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Grant Edwards
On 2012-04-03, Chris Angelico ros...@gmail.com wrote:
 On Wed, Apr 4, 2012 at 12:26 AM, Mark Lawrence breamore...@yahoo.co.uk 
 wrote:
 On 03/04/2012 14:51, rusi wrote:
 Doing programming without programming languages is like using toenails
 to tighten screws


 The latter is extremely difficult if you bite your toenails :)

 I agree, thumbnails are far better suited. Mine are often pushed into
 that service. But to extend the analogy: Using a thumbnail to tighten
 a screw is like directly patching a binary to fix a bug. It works, but
 it's not exactly a practical way to build a system.

Anybody remember DEC's VAX/VMS patch utility?  Apparently, DEC
thought it was a practical way to fix things.  It had a built-in
assembler and let you insert new code into a function by
auto-allocating a location for the new code and hooking it into the
indicated spot with jump instructions.

The mind wobbled.

-- 
Grant Edwards   grant.b.edwards at gmail.com
Yow! I'm a fuschia bowling ball somewhere in Brittany


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Chris Angelico
On Wed, Apr 4, 2012 at 12:46 AM, Grant Edwards invalid@invalid.invalid wrote:
 Anybody remember DEC's VAX/VMS patch utility?  Apparently, DEC
 thought it was a practical way to fix things.  It had a built-in
 assembler and let you insert new code into a function by
 auto-allocating a location for the new code and hooking it into the
 indicated spot with jump instructions.

 The mind wobbled.

Not specifically, but I _have_ heard of various systems whose source
code and binary were multiple years divergent. It's actually not a
difficult trap to fall into, especially once you start patching
running systems. I've had quite a few computers that have been unable
to reboot without assistance, because they go for months or years
without ever having to go through that initial program load. (I've had
_programs_ that were unable to load, for the same reason.) But
auto-allocating a new spot for your expanded function? That's just...
awesome. My mind is, indeed, wobbling.

ChrisA


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Ian Kelly
On Tue, Apr 3, 2012 at 6:39 AM, Nathan Rice
nathan.alexander.r...@gmail.com wrote:
 Did you miss the part where I said that most people who learn to
 program are fascinated by computers and highly motivated to do so?
 I've never met a BROgrammer, those people go into sales.  It isn't
 because there aren't smart BROmosapiens (sadly, there are), they just
 couldn't give two shits about computers so programming seems like a
 colossal waste of time to them.

I have never met the brogrammer stereotype.  I have also never met the
non-brogrammer stereotype of nerdy solitude (well, maybe once).
That's all these things are -- stereotypes.  Real programmers are much
more complex.

 Computers require you to state the exact words you're searching for as
 well.  Try looking again, and this time allow for sub-categories and
 synonyms, along with some variation in word order.

Lazy troll.  You made the claim.  The onus is on you to provide the evidence.


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Chris Angelico
On Wed, Apr 4, 2012 at 1:01 AM, Ian Kelly ian.g.ke...@gmail.com wrote:
 Real programmers are much more complex.

Are you saying that some part of all of us is imaginary??

ChrisA


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Mark Lawrence

On 03/04/2012 15:56, Chris Angelico wrote:

On Wed, Apr 4, 2012 at 12:46 AM, Grant Edwardsinvalid@invalid.invalid  wrote:

Anybody remember DEC's VAX/VMS patch utility?  Apparently, DEC
thought it was a practical way to fix things.  It had a built-in
assembler and let you insert new code into a function by
auto-allocating a location for the new code and hooking it into the
indicated spot with jump instructions.

The mind wobbled.


Not specifically, but I _have_ heard of various systems whose source
code and binary were multiple years divergent. It's actually not a
difficult trap to fall into, especially once you start patching
running systems. I've had quite a few computers that have been unable
to reboot without assistance, because they go for months or years
without ever having to go through that initial program load. (I've had
_programs_ that were unable to load, for the same reason.) But
auto-allocating a new spot for your expanded function? That's just...
awesome. My mind is, indeed, wobbling.

ChrisA


Around 1990 I worked on Telematics kit.  The patches on all their 
software were implemented via assembler once the original binary had 
been loaded into memory.  They even came up with a system that let you 
select which patches you wanted and which you didn't, as e.g. some 
patches were customer specific.


--
Cheers.

Mark Lawrence.



Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Nathan Rice
On Tue, Apr 3, 2012 at 9:51 AM, rusi rustompm...@gmail.com wrote:
 On Apr 3, 5:39 pm, Nathan Rice nathan.alexander.r...@gmail.com
 wrote:

 Don't think underlying, instead think canonical.

 Ultimately, the answers to your questions exist in the world for you
 to see.  How does a surgeon describe a surgical procedure?  How does a
 chef describe a recipe?  How does a carpenter describe the process of
 building cabinets?  Aside from specific words, they all use natural
 language, and it works just fine.

 A carpenter describes his carpentry-process in English
 A CSist describes his programming-process in English (at least all my
 CS books are in English)

 A carpenter uses his tools -- screwdriver, saw, planer --to do
 carpentry
 A programmer uses his tools to do programming -- one of which is
 called 'programming language'

 Doing programming without programming languages is like using toenails
 to tighten screws

I would argue that the computer is the tool, not the language.


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Dave Angel
On 04/03/2012 11:16 AM, Mark Lawrence wrote:
 On 03/04/2012 15:56, Chris Angelico wrote:
 On Wed, Apr 4, 2012 at 12:46 AM, Grant
 Edwardsinvalid@invalid.invalid  wrote:
 Anybody remember DEC's VAX/VMS patch utility?  Apparently, DEC
 thought it was a practical way to fix things.  It had a built-in
 assembler and let you insert new code into a function by
 auto-allocating a location for the new code and hooking it into the
 indicated spot with jump instructions.

 The mind wobbled.

 Not specifically, but I _have_ heard of various systems whose source
 code and binary were multiple years divergent. It's actually not a
 difficult trap to fall into, especially once you start patching
 running systems. I've had quite a few computers that have been unable
 to reboot without assistance, because they go for months or years
 without ever having to go through that initial program load. (I've had
 _programs_ that were unable to load, for the same reason.) But
 auto-allocating a new spot for your expanded function? That's just...
 awesome. My mind is, indeed, wobbling.

 ChrisA

 Around 1990 I worked on Telematics kit.  The patches on all their
 software were implemented via assembler once the original binary had
 been loaded into memory.  They even came up with a system that let you
 select which patches you wanted and which you didn't, as e.g. some
 patches were customer specific.


And I worked on a system where the microcode was in ROM, and there was a
patch board consisting of lots of diodes and some EPROMs.  The diodes
were soldered into place to specify the instruction(s) to be patched, and
the actual patches were in the EPROMs, which were reusable.  The diodes
were the only thing fast enough to patch the ROM, by responding more
quickly than the ROM.  This was back when issuing a new ROM was a very
expensive proposition;  there were masking charges, so you couldn't
reasonably do low quantities.



-- 

DaveA



Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Nathan Rice
On Tue, Apr 3, 2012 at 11:01 AM, Ian Kelly ian.g.ke...@gmail.com wrote:
 On Tue, Apr 3, 2012 at 6:39 AM, Nathan Rice
 nathan.alexander.r...@gmail.com wrote:
 Did you miss the part where I said that most people who learn to
 program are fascinated by computers and highly motivated to do so?
 I've never met a BROgrammer, those people go into sales.  It isn't
 because there aren't smart BROmosapiens (sadly, there are), they just
 couldn't give two shits about computers so programming seems like a
 colossal waste of time to them.

 I have never met the brogrammer stereotype.  I have also never met the
 non-brogrammer stereotype of nerdy solitude (well, maybe once).
 That's all these things are -- stereotypes.  Real programmers are much
 more complex.

I have never met a programmer that was not completely into computers.
That leaves a lot unspecified though.

 Computers require you to state the exact words you're searching for as
 well.  Try looking again, and this time allow for sub-categories and
 synonyms, along with some variation in word order.

 Lazy troll.  You made the claim.  The onus is on you to provide the evidence.

I reserve the right to be lazy :)

As part of my troll-outreach effort, I will indulge here.  I was
specifically thinking about some earlier claims that programming
languages as they currently exist are somehow inherently superior to a
formalized natural language in expressive power.

I think part of this comes from the misconception that terse is better
(e.g. Paul Graham's thoughts on car/cdr), which doesn't take into
account that your brain compresses frequently occurring English words
VERY efficiently, so they actually take up less cognitive bandwidth
than a much shorter non-word.  This behavior extends to the phrase
level as well; longer phrases that are meaningful in their own right
take up less bandwidth than short nonsensical word combinations.
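The car/cdr point is easy to see in code: compare the cryptic Lisp-derived
names with ordinary English words for the same two operations (a toy sketch;
the point is the naming, not the functions):

```python
# Terse, historical names (after Lisp's car/cdr):
def car(xs):
    return xs[0]        # head of the list

def cdr(xs):
    return xs[1:]       # everything after the head

# The same operations named with common English words:
def first(xs):
    return xs[0]

def rest(xs):
    return xs[1:]

# Both pairs compute the same thing; the claim above is that "first"
# and "rest", though longer, cost the reader less than "car" and "cdr".
print(car([1, 2, 3]), first([1, 2, 3]))   # 1 1
print(cdr([1, 2, 3]), rest([1, 2, 3]))    # [2, 3] [2, 3]
```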

On the semantic side, most people already understand branched
processes and procedures with conditional actions pretty well.  People
program other people to perform tasks constantly, and have been
doing so for the entirety of our existence.  The problem occurs when
programming language specific semantic artifacts must be considered.
These artifacts are for the most part somewhat arbitrary, or you would
see them frequently in other areas, and they wouldn't confuse people
so much.  I think the majority of these relate to how the computer
operates internally - this is the stuff that really turns most people
off to programming.

The crux of my view is that programming languages exist in part
because computers in general are not smart enough to converse with
humans on their own level, so we have to talk to them like autistic 5
year-olds.  That was fine when we didn't have any other options, but
all the pieces exist now to let computers talk to us very close to our
own level, and represent information the same way we do.  Projects
like IBM's Watson, Siri, Wolfram Alpha and Cyc demonstrate pretty
clearly to me that we are capable of taking the next step, and the
resurgence of the technology sector along with the shortage of
qualified developers indicates to me that we need to move now.


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread rusi
On Apr 3, 9:15 pm, Nathan Rice nathan.alexander.r...@gmail.com
wrote:
 On Tue, Apr 3, 2012 at 9:51 AM, rusi rustompm...@gmail.com wrote:
  On Apr 3, 5:39 pm, Nathan Rice nathan.alexander.r...@gmail.com
  wrote:

  Don't think underlying, instead think canonical.

  Ultimately, the answers to your questions exist in the world for you
  to see.  How does a surgeon describe a surgical procedure?  How does a
  chef describe a recipe?  How does a carpenter describe the process of
  building cabinets?  Aside from specific words, they all use natural
  language, and it works just fine.

  A carpenter describes his carpentry-process in English
  A CSist describes his programming-process in English (at least all my
  CS books are in English)

  A carpenter uses his tools -- screwdriver, saw, planer --to do
  carpentry
  A programmer uses his tools to to programming -- one of which is
  called 'programming language'

  Doing programming without programming languages is like using toenails
  to tighten screws

 I would argue that the computer is the tool, not the language.

Computer science is as much about computers as astronomy is about
telescopes -- E W Dijkstra

Here are some other attempted corrections of the misnomer computer
science:
http://en.wikipedia.org/wiki/Computer_science#Name_of_the_field


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Neil Cerutti
On 2012-04-03, Dave Angel d...@davea.name wrote:
 And I worked on a system where the microcode was in ROM, and
 there was a patch board consisting of lots of diodes and some
 EPROMs.  The diodes were soldered into place to specify the
 instruction(s) to be patched, and the actual patches were in
 the EPROMs, which were reusable.  The diodes were the only
 thing fast enough to patch the ROM, by responding more
 quickly than the ROM.  This was back when issuing a new ROM was
 a very expensive proposition;  there were masking charges, so
 you couldn't reasonably do low quantities.

I worked on a system where the main interface to the system was
poking and peeking numbers at memory addresses.
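That interface amounts to two tiny operations, sketched here over a bytearray
standing in for the machine's RAM (portable Python cannot touch real memory
addresses, so this is a simulation only):

```python
memory = bytearray(256)            # stand-in for the machine's RAM

def poke(address, value):
    """Write one byte at the given address."""
    memory[address] = value & 0xFF

def peek(address):
    """Read one byte back from the given address."""
    return memory[address]

poke(0x10, 65)
print(peek(0x10))   # 65
print(peek(0x11))   # 0 (untouched memory)
```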

-- 
Neil Cerutti


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread rusi
All this futuristic grandiloquence:

On Apr 3, 10:17 pm, Nathan Rice nathan.alexander.r...@gmail.com
wrote:
 The crux of my view is that programming languages exist in part
 because computers in general are not smart enough to converse with
 humans on their own level, so we have to talk to them like autistic 5
 year-olds.  That was fine when we didn't have any other options, but
 all the pieces exist now to let computers talk to us very close to our
 own level, and represent information the same way we do.  Projects
 like IBM's Watson, Siri, Wolfram Alpha and Cyc demonstrate pretty
 clearly to me that we are capable of taking the next step, and the
 resurgence of the technology sector along with the shortage of
 qualified developers indicates to me that we need to move now.

needs to be juxtaposed with this antiquated view

 I would argue that the computer is the tool, not the language.


... a view that could not be held by an educated person after the
1960s -- ie when it became amply clear to all that the essential and
hard issues in CS are about software and not hardware


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Nathan Rice
  A carpenter uses his tools -- screwdriver, saw, planer --to do
  carpentry
  A programmer uses his tools to do programming -- one of which is
  called 'programming language'

  Doing programming without programming languages is like using toenails
  to tighten screws

 I would argue that the computer is the tool, not the language.

 Computer science is as much about computers as astronomy is about
 telescopes -- E W Dijkstra

 Here are some other attempted corrections of the misnomer computer
 science:
 http://en.wikipedia.org/wiki/Computer_science#Name_of_the_field

I view computer science as applied mathematics, when it deserves
that moniker.  When it doesn't, it is merely engineering.

Ironically, telescopes are a tool that astronomers use to view the stars.


On Tue, Apr 3, 2012 at 1:25 PM, rusi rustompm...@gmail.com wrote:
 All this futuristic grandiloquence:

 On Apr 3, 10:17 pm, Nathan Rice nathan.alexander.r...@gmail.com
 wrote:
 The crux of my view is that programming languages exist in part
 because computers in general are not smart enough to converse with
 humans on their own level, so we have to talk to them like autistic 5
 year-olds.  That was fine when we didn't have any other options, but
 all the pieces exist now to let computers talk to us very close to our
 own level, and represent information the same way we do.  Projects
 like IBM's Watson, Siri, Wolfram Alpha and Cyc demonstrate pretty
 clearly to me that we are capable of taking the next step, and the
 resurgence of the technology sector along with the shortage of
 qualified developers indicates to me that we need to move now.

 needs to be juxtaposed with this antiquated view

 I would argue that the computer is the tool, not the language.


 ... a view that could not be held by an educated person after the
 1960s -- ie when it became amply clear to all that the essential and
 hard issues in CS are about software and not hardware

I'll go ahead and forgive the club handed fallacies, so we can have a
nice discussion of your primary point.  What a civil troll I am :)

Lets start with some analogies.  In cooking, chefs use recipes to
produce a meal; the recipe is not a tool.  In architecture, a builder
uses a blueprint to produce a building; the blueprint is not a tool.
In manufacturing, expensive machines use plans to produce physical
goods; the plans are not the tool.

You could say the compiler is a tool, or a development environment is
a tool.  The programming language is a mechanism for communication.


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Terry Reedy

On 4/3/2012 8:39 AM, Nathan Rice wrote:


Ultimately, the answers to your questions exist in the world for you
to see.  How does a surgeon describe a surgical procedure?  How does a
chef describe a recipe?  How does a carpenter describe the process of
building cabinets?  Aside from specific words, they all use natural
language, and it works just fine.


Not really. Surgeons learn by *watching* a surgeon who knows the 
operation and next (hopefully) doing a particular surgery under 
supervision of such a surgeon, who watches and talks, and may even grab 
the instruments and re-show. They then really learn by doing the 
procedure on multiple people. They often kill a few on the way to mastery.


I first learned basic carpentry and other skills by watching my father. 
I don't remember that he ever said anything about how to hold the tools.


I similarly learned basic cooking by watching my mom. My knowledge of 
how to crack open an egg properly and separate the yolk from the rest is 
a wordless memory movie.


--
Terry Jan Reedy



Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Nathan Rice
On Tue, Apr 3, 2012 at 4:20 PM, Terry Reedy tjre...@udel.edu wrote:
 On 4/3/2012 8:39 AM, Nathan Rice wrote:

 Ultimately, the answers to your questions exist in the world for you
 to see.  How does a surgeon describe a surgical procedure?  How does a
 chef describe a recipe?  How does a carpenter describe the process of
 building cabinets?  Aside from specific words, they all use natural
 language, and it works just fine.


 Not really. Surgeons learn by *watching* a surgeon who knows the operation
 and next (hopefully) doing a particular surgery under supervision of such a
 surgeon, who watches and talks, and may even grab the instruments and
 re-show. They then really learn by doing the procedure on multiple people.
 They often kill a few on the way to mastery.

Well, there is declarative knowledge and procedural knowledge.  In all
these cases, only the procedural knowledge is absolutely necessary,
but the declarative knowledge is usually a prerequisite to learning
the procedure in any sort of reasonable manner.

 I first learned basic carpentry and other skills by watching my father. I
 don't remember that he ever said anything about how to hold the tools.

 I similarly learned basic cooking by watching my mom. My knowledge of how to
 crack open an egg properly and separate the yolk from the rest is a wordless
 memory movie.

A picture is worth a thousand words :)

If you would like, I don't have any problem incorporating visual
programming and programming by demonstration.  I didn't go in that
direction because I have enough to defend as it is.  I like to look at
it from the perspective of teaching/communicating, rather than
operating a simple machine.


RE: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Phil Runciman

 -Original Message-
 From: Mark Lawrence [mailto:breamore...@yahoo.co.uk]
 Sent: Wednesday, 4 April 2012 3:16 a.m.
 To: python-list@python.org
 Subject: Re: Number of languages known [was Re: Python is readable] -
 somewhat OT
 
 On 03/04/2012 15:56, Chris Angelico wrote:
  On Wed, Apr 4, 2012 at 12:46 AM, Grant
 Edwardsinvalid@invalid.invalid  wrote:
  Anybody remember DEC's VAX/VMS patch utility?  Apparently, DEC
  thought it was a practical way to fix things.  It had a built-in
  assembler and let you insert new code into a function by
  auto-allocating a location for the new code and hooking it into the
  indicated spot with jump instructions.
 
  The mind wobbled.
 
  Not specifically, but I _have_ heard of various systems whose source
  code and binary were multiple years divergent. It's actually not a
  difficult trap to fall into, especially once you start patching
  running systems. I've had quite a few computers that have been unable
  to reboot without assistance, because they go for months or years
  without ever having to go through that initial program load. (I've
 had
  _programs_ that were unable to load, for the same reason.) But
  auto-allocating a new spot for your expanded function? That's just...
  awesome. My mind is, indeed, wobbling.
 
  ChrisA
 
 Around 1990 I worked on Telematics kit.  The patches on all their
 software were implemented via assembler once the original binary had
 been loaded into memory.  They even came up with a system that let you
 select which patches you wanted and which you didn't, as e.g. some
 patches were customer specific.
 
 --
 Cheers.
 
 Mark Lawrence.
 

In the 70's I worked with Honeywell 16 Series computers controlling a variety 
of systems. The patches were loaded as a starting address followed by machine 
code, using a piece of software for this purpose. This all sounds rather 
similar to Mark's situation. The reason, however, is less obvious. On the H16 
series we did not have a multi-access O/S and the process of assembling and 
linking a large system involved many steps. Often the modifications required 
were trivial. It was generally easier to reload a memory dump from paper 
tape and then apply the patches.


Phil Runciman


RE: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Phil Runciman
 
 On Tue, Apr 3, 2012 at 4:20 PM, Terry Reedy tjre...@udel.edu wrote:

  On 4/3/2012 8:39 AM, Nathan Rice wrote:
 
   Ultimately, the answers to your questions exist in the world for you
   to see.  How does a surgeon describe a surgical procedure?  How does
   a chef describe a recipe?  How does a carpenter describe the process
   of building cabinets?  Aside from specific words, they all use 
   natural language, and it works just fine.
 
 
  Not really. Surgeons learn by *watching* a surgeon who knows the operation
  and next (hopefully) doing a particular surgery under supervision of such a
  surgeon, who watches and talks, and may even grab the instruments and
  re-show. They then really learn by doing the procedure on multiple
  people. They often kill a few on the way to mastery.
  
 
 Well, there is declarative knowledge and procedural knowledge.  In all
 these cases, only the procedural knowledge is absolutely necessary,
 but the declarative knowledge is usually a prerequisite to learning
 the procedure in any sort of reasonable manner.

There is also tacit knowledge. Such knowledge is a precursor to declarative 
knowledge and therefore procedural knowledge. Tacit knowledge is not easily 
shared. It involves learning and skill, but not in a way that can be written 
down. Tacit knowledge consists often of habits and culture that we do not 
recognize in ourselves. Wikipedia.

The process of eliciting tacit knowledge may be time consuming and require 
patience and skill. The following book covers aspects of this: Nonaka, Ikujiro; 
Takeuchi, Hirotaka (1995), The knowledge creating company: how Japanese 
companies create the dynamics of innovation. 

Phil Runciman


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Mark Lawrence

On 03/04/2012 19:42, Nathan Rice wrote:


I view computer science as applied mathematics, when it deserves
that moniker.  When it doesn't, it is merely engineering.



Is it still April first in your time zone?

--
Cheers.

Mark Lawrence.



Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-03 Thread Steven D'Aprano
On Tue, 03 Apr 2012 13:17:18 -0400, Nathan Rice wrote:

 I have never met a programmer that was not completely into computers.
 That leaves a lot unspecified though.

You haven't looked hard enough. There are *thousands* of VB, Java, etc. 
code monkeys who got into programming for the money only and who have 
zero inclination to expand their skills or knowledge beyond that 
necessary to keep their job.

Go to programming blogs, and you will find many examples of some 
allegedly professional programmer selecting an arbitrary blog post to ask 
"Pls sombody write me this code", where "this code" is either an utterly 
trivial question or a six month project.


 As part of my troll-outreach effort, I will indulge here.  I was
 specifically thinking about some earlier claims that programming
 languages as they currently exist are somehow inherently superior to a
 formalized natural language in expressive power.

I would argue that they are, but only for the very limited purpose for 
which they are written. With the possible exception of Inform 7, most 
programming languages are useless at describing (say) human interactions.

Human languages are optimised for many things, but careful, step-by-step 
algorithms are not one of them. This is why mathematicians use a 
specialist language for their problem domain, as do programmers. Human 
language is awfully imprecise and often ambiguous, it encourages implicit 
reasoning, and requires a lot of domain knowledge:

Joe snatched the hammer from Fred. "Hey," he said, "what are
you doing? Don't you know that he'll hit the roof if he catches
you with that?"


 I think part of this comes from the misconception that terse is better

+1


 The crux of my view is that programming languages exist in part because
 computers in general are not smart enough to converse with humans on
 their own level, so we have to talk to them like autistic 5 year-olds. 
 That was fine when we didn't have any other options, but all the pieces
 exist now to let computers talk to us very close to our own level, and
 represent information the same way we do.

I think you're dreaming. We (that is to say, human beings in general, not 
you and I specifically) cannot even talk to each other accurately, 
precisely and unambiguously all the time. Natural language simply isn't 
designed for that -- hence we have specialist languages like legal 
jargon, mathematics, and programming languages, for specialist purposes.



-- 
Steven


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-02 Thread Ethan Furman

Tim Rowe wrote:

On 22 March 2012 19:14, Chris Angelico ros...@gmail.com wrote:


In any case, though, I agree that there's a lot of people
professionally writing code who would know about the 3-4 that you say.
I'm just not sure that they're any good at coding, even in those few
languages. All the best people I've ever known have had experience
with quite a lot of languages.


I know 10 languages. But I'm not telling you what base that number is :)


There are 10 types of people in the world:  those who know binary and 
those who don't.


;)

~Ethan~
--
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-02 Thread Steve Howell
On Mar 29, 7:03 am, Chris Angelico ros...@gmail.com wrote:
 On Fri, Mar 30, 2012 at 12:44 AM, Nathan Rice

 nathan.alexander.r...@gmail.com wrote:
  We would be better off if all the time that was spent on learning
  syntax, memorizing library organization and becoming proficient with
  new tools was spent learning the mathematics, logic and engineering
  sciences.  Those solve problems, languages are just representations.

 Different languages are good at different things. REXX is an efficient
 text parser and command executor. Pike allows live updates of running
 code. Python promotes rapid development and simplicity. PHP makes it
 easy to add small amounts of scripting to otherwise-static HTML pages.
 C gives you all the power of assembly language with all the
 readability of... assembly language. SQL describes a database request.

 You can't merge all of them without making a language that's
 suboptimal at most of those tasks - probably, one that's woeful at all
 of them.

I agree with you on the overall point, but I think that Python
actually does a fine job of replacing REXX and PHP.  I've used both of
the latter (and, of course, Python).  REXX and PHP are great at what
they do, but I don't think their slight advantages over Python justify
all the weight they carry--incompatible syntax to Python, archaic
libraries, missing modern language features, etc.

It's great to have languages like C and HTML that carve out their own
strong niches.  No argument there.

On the other hand, if you know Python, then having to contend with the
learning curves and idiosyncrasies of Perl and Ruby might feel more
frustrating than empowering.  Like REXX and PHP, Perl and Ruby
arguably have corners where they are more expressive than Python, but
I'd rather have a boring system written in 100% Python than a Ruby/
Python hybrid.

Python should also be a perfectly good superset of Bash Scripting
language.  (To the extent that Python isn't, there's nothing intrinsic
about the language that prevents you from orchestrating processes.)

 I mention SQL because, even if you were to unify all
 programming languages, you'd still need other non-application
 languages to get the job done.


Here I absolutely agree with you.  SQL, to me, is a perfect
illustration of a language that's optimal for a particular task.  Of
course, people still can't resist hiding it behind an ORM.

The web stack is notorious for requiring multilingual juggling.  HTML,
CSS, JS, Python, and SQL are easy enough to juggle, but then you might
also get template languages (with all the interpolation escaping),
config files (XML, YAML, etc.), regexes (possibly multiple dialects),
SQL, testing DSLs (ugh, Cucumber and friends), etc.


 Keep the diversity and let each language focus on what it's best at.

 ChrisA
 who has lots and lots of hammers, so every problem looks like... lots
 and lots of nails.

I know you're just joking here, because you're obviously not
advocating for multiple hammers.  You're advocating for multiple tools
in the toolbox.  Which is good, of course.

I think the problem these days is that the programmer's brain is like
a small toolbox.  Maybe twenty tools fit in the toolbox.  Instead of
filling it up with 20 useful tools, a lot of us have it cluttered up
with ten hammers, when only one of the hammers is what we need for the
nails.



-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-02 Thread Steve Howell
On Mar 29, 11:53 am, Devin Jeanpierre jeanpierr...@gmail.com wrote:

 Well, what sort of language differences make for English vs Mandarin?
 Relational algebraic-style programming is useful, but definitely a
 large language barrier to people that don't know any SQL. I think this
 is reasonable. (It would not matter even if you gave SQL python-like
 syntax, the mode of thinking is different, and for a good reason.)


I don't see any fundamental disconnect between SQL thinking and Python
thinking.

List comprehensions are very close to SQL SELECTs semantically, and
not that far off syntactically.

  [row.x for row in foo if row.x == 3]

  select x from foo where x = 3

Many people can grok the basics of relational algebraic style
programming quite easily, which is why SQL is so popular.  It just
happens that many programming languages up until now have obscured
the idea.

SQL is so strongly associated with RDBMS implementations that people
tend to forget that it makes sense as an abstract language--people
tend to view SQL as a very concrete mechanism for pulling data out of
storage, instead of as a notation for describing the relating and
transforming of sets.
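The semantic parallel can be made concrete. Below is a small sketch (the
in-memory SQLite table is just a stand-in for any RDBMS; the table name and
columns are made up) showing that the comprehension and the SELECT describe
the same set transformation:

```python
import sqlite3
from collections import namedtuple

Row = namedtuple("Row", "x y")
foo = [Row(3, "a"), Row(1, "b"), Row(3, "c")]

# Python: the iteration over foo is explicit in the notation.
comp_result = [row.x for row in foo if row.x == 3]

# SQL: the query says *what* is wanted; the engine decides how.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE foo (x INTEGER, y TEXT)")
db.executemany("INSERT INTO foo VALUES (?, ?)", foo)
sql_result = [x for (x,) in db.execute("SELECT x FROM foo WHERE x = 3")]

assert comp_result == sql_result == [3, 3]
```

Both produce [3, 3]; the difference is purely in how much of the *how* each
notation pins down.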


-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Python is readable

2012-04-02 Thread Steve Howell
On Mar 29, 9:38 pm, Nathan Rice nathan.alexander.r...@gmail.com
wrote:

 The mathematics of the 20th century, (from the early 30s onward) tend
 to get VERY abstract, in just the way Joel decries.  Category theory,
 model theory, modern algebraic geometry, topos theory, algebraic graph
 theory, abstract algebras and topological complexes are all very
 difficult to understand because they seem so incredibly abstract, yet
 most of them already have important applications.  I'm 100% positive
 if you just presented Joel with seminal papers in some of those areas,
 he would apply the astronaut rubber stamp, because the material is
 challenging, and he wouldn't get it (I love math, and I've had to read
 some papers 10+ times before they click).

Nathan,

Don't worry too much about Joel Spolsky, and worry even less about
people that allude to him.

Joel Spolksy is an early 21st century businessman.  He's a smart guy
with decent writing skills and semi-interesting thoughts, but he's not
gonna change the world.  He runs his business by promoting pragmatic
processes, writing blogs, procuring comfortable chairs for developers,
renting nice loft space in NYC, and building some useful, but
basically unoriginal, websites and apps.  Everything that Joel Spolsky
has ever said in his blog has already been put into practice 100 times
over by a bunch of similarly pragmatic early 21st century folk.

If you really like math, it is no surprise that you find the current
state of computer programming a bit inelegant.  90% of what
programmers do in the early 21st century is move the data from HERE to
THERE, then another 9% is placing the data in just the right part of
the screen.  I'm exaggerating a bit, but we're still in the
backwaters.  It's all engineering now; it's all breeding the faster
horse.  Or so it seems.

There's a natural ebb and flow to progress.  In the early to mid 20th
century, there were tremendous advances in math, science, and
technology, and maybe it's gonna take a full century or two of playing
around with shit and arguing about stupid shit until the dust finally
settles and we make another quantum jump.




-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Python is readable

2012-04-02 Thread rusi
On Mar 30, 9:02 pm, Steve Howell showel...@yahoo.com wrote:

 Steven, how do you predict which abstractions are going to be useless?

 There was a time when imaginary numbers were just little toys that the
 mathematicians played around with in their ivory towers.

A non-science/math analogous question:

When Beethoven wrote his last sonatas and quartets, they were called
'the abortions of a German idealist' or, less charitably, 'the only
music that a stone-deaf man could possibly write.'
Likewise, Bach's Art of Fugue was regarded as a merely academic
work that codified his knowledge of fugal writing.

It was many decades after their death that everyone began to regard
these as the greatest pieces of music (maybe 50 for Beethoven, almost
100 for Bach).

However, for every one Bach/Beethoven there are hundreds of faddists who
are great in one generation and garbage-dumped the next.  The
encyclopedia of music I grew up on placed Schoenberg et al. in the
Bach/Beethoven category. Almost certainly a more recent view would not.

So if I side with Steven/Spolsky I would regard a (future) Bach/
Beethoven as an 'abortion.'
If I side with Nathan I may waste my money and life betting on
serialists/cubists and such fashionable but ephemeral fads.

tl;dr version
The usefulness of uber-abstractions is undecidable in the near
timeframe.
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-02 Thread Steve Howell
On Mar 29, 9:42 am, Devin Jeanpierre jeanpierr...@gmail.com wrote:
 On Thu, Mar 29, 2012 at 10:03 AM, Chris Angelico ros...@gmail.com wrote:
  You can't merge all of them without making a language that's
  suboptimal at most of those tasks - probably, one that's woeful at all
  of them. I mention SQL because, even if you were to unify all
  programming languages, you'd still need other non-application
  languages to get the job done.

 Not really. You can turn SQL (or something equivalent) into a subset
 of your programming language, like C# does with LINQ, or like Scheme
 does with macros.

I'm glad you're moving the discussion away from the fake debate between
diversity (good) and one-language-fits-all (horribly naive at
best, evil at worst).

Of course, there's a middle ground, where (some degree of) language
unification is a completely sane goal, at least worthy of discussion.

SQL is a well-established lingua franca for describing relationships
between sets, or relational algebra.  General-purpose languages like
C#, Scheme, and Python are well suited to subsuming SQL semantics,
and, in fact, they already do, but sometimes they do it a way that's
unnecessarily foreign to people who quite easily grok the basics of
SQL.  The extreme position for language unification would be that
Python completely subsumes SQL under its umbrella as first-class
syntax.  I don't necessarily advocate for that, but I think it's an
interesting thought experiment to ask why it doesn't.  Just to be
clear, I'm talking about SQL as a mechanism to transform sets within
Python itself, not external RDBMS engines (although first-class SQL
integration could be useful for that as well).
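As a thought-experiment sketch only, a toy (entirely hypothetical) `Query`
wrapper suggests how SELECT/WHERE semantics might sit over plain Python
sequences, with no RDBMS involved:

```python
class Query:
    """A toy, LINQ-flavored sketch of relational operators on sequences."""

    def __init__(self, rows):
        self._rows = list(rows)

    def where(self, predicate):      # SQL: WHERE
        return Query(r for r in self._rows if predicate(r))

    def select(self, projection):    # SQL: SELECT
        return Query(projection(r) for r in self._rows)

    def to_list(self):
        return list(self._rows)

people = [{"name": "ann", "age": 34}, {"name": "bob", "age": 19}]
adults = (Query(people)
          .where(lambda p: p["age"] >= 21)
          .select(lambda p: p["name"])
          .to_list())
print(adults)  # ['ann']
```

Nothing here needs new syntax; the question is whether first-class syntax
would make the relational mode of thinking feel less foreign.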

 On the other hand, even similar languages are really hard to run in
 the same VM: imagine the hoops you'd have to jump through to get
 libraries written in Python 2 and 3 to work together.

My take on Python 2 vs. Python 3 is that it's simply a tactical
program to get people to upgrade to Python 3.  The fact that the two
languages can't run on the same VM doesn't really bother me.


 With that in mind, the interesting languages to merge aren't things
 like SQL or regular expressions -- these are so easy to make work with
 programming languages, that we do it all the time already (via string
 manipulation, but first-class syntax would also be easily possible).
 The hard problems are when trying to merge in the semantics of
 languages that only make sense because they have drastically
 different expectations of the world. The example that comes to mind is
 Haskell, which relies incredibly strongly on the lack of side effects.
 How do you merge Haskell and Python?

My view on Haskell and Python is that they should stay alive as
competing paradigms.  I think you're making a useful distinction
between Haskell and SQL.  Neither language is well integrated with
Python.  With Haskell, I think it's for good reason.  With SQL, I
don't quite understand the status quo (beyond the obvious reasons--
maybe we're just not there yet).

 I guess what I really want to say is that the world looks, to me, to
 be more optimistic than so many people think it is. If we wanted to,
 we could absolutely take the best features from a bunch of things.
 This is what C++ does, this is what Scheme does, this is what D does.
 They sometimes do it in different ways, they have varying tradeoffs,
 but this isn't a hard problem except when it is, and the examples you
 mentioned are actually the easy cases. We can merge Python and C,
 while keeping roughly the power of both, it's called Cython.

I love the fact that C dominates all other languages at its level of
abstraction.  I wish this were the case one level up, where you still
have Python, Ruby, JavaScript, PHP, Perl, and others essentially
solving the same classes of problems.  Don't get me wrong--I'm all for
diversity; I'm not saying we should arbitrarily kill off languages,
etc.  I'm just saying that there will be *some* benefit when a clear
victor emerges, and hopefully that will be Python 3.  Whatever
language emerges as the victor, it will probably subsume some of the
best features of the conquered.  To a degree that has already
happened.

 We can
 merge Python and PHP, in that PHP adds nothing incompatible with
 Python technically (it'd be a lot of work and there would be many
 tears shed because it's insane) -- but Python Server Pages probably
 add the feature you want.

PHP is a language that I wish would die off quickly and gracefully.  I
feel like the good things of PHP have already been subsumed into the
ecosystems of stronger programming languages (including Python).

 We could merge SQL and Python, arguably we
 already do via e.g. SQLAlchemy's query API (etc.) or DBAPI2's string
 API. These can all becomes subsets of a language that interoperate
 well with the rest of the language with no problems. These are
 non-issues: the reasons for not doing so are not technical, they are
 political or sociological (e.g., bloat 

Re: Python is readable

2012-04-02 Thread Steve Howell
On Mar 29, 8:36 pm, Steven D'Aprano steve
+comp.lang.pyt...@pearwood.info wrote:

 The Romans had perfectly functioning concrete without any abstract
 understanding of chemistry.

If I ever stumbled upon a technology that proved how useless abstract
thinking was, do you know what I would call it?

Concrete.

Damn, those clever Romans beat me to it by a couple millennia!  And
they had AQUEDUCTS!!!

 [...] Medicine and pharmaceuticals continue to be
 discovered even when we can't predict the properties of molecules.

I know this is really, really far out, and I should probably come back
down to earth, but I can envision some sort of 25th century utopia
where scientists measure the weights and charges and atoms, and
arrange them into some kind of chart, and just BASED ON THE CHART
ALONE, these 25th century scientists might be able to predict
behaviors that have never been observed, just BASED ON ABSTRACT
REASONING.

Whoa!  That's really far out stuff.  Give me some oxygen! (preferably
of the 1s2 2s2 2p4 variety)

(Yep, I get it, the periodic table is about atoms, and medicine/
pharmaceuticals are about molecules, so your point is not invalid.
It's still alchemy, though.)




-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Python is readable

2012-04-02 Thread Steve Howell
On Mar 30, 1:20 pm, Chris Angelico ros...@gmail.com wrote:

  Really?  Or could it be that algorithms for natural language
  processing that don't fail miserably is a very recent development,
  restricted natural languages more recent still, and pretty much all
  commonly used programming languages are all ~20+ years old?  Could it
  also be that most programmers don't see a lot of incentives to make
  things accessible, since they're already invested in the status quo,
  and they might lose some personal value if programming stopped being
  an arcane practice?

 Totally. That's why we're all still programming in assembly language
 and doing our own memory management, because we would lose a lot of
 personal value if programming stopped being so difficult. If it
 weren't for all these silly new-fangled languages with their automatic
 garbage collection and higher order function handling, we would all be
 commanding much higher salaries.


While I don't subscribe to the conspiracy theory that programmers
invest in arcane practices to preserve personal value [paraphrase of
Nathan], surely you could come up with a better argument than garbage
collection.

Garbage collection was invented over 50 years ago (1959, according to
Wikipedia), and it was implemented in a whole bunch of popular
programming languages in the 90s (and earlier too, if you count
Smalltalk as a popular language).

Python's over 20 years old, and it had garbage collection pretty early
on.  Java's not quite 20 years old, but if Java is your best example
of a new-fangled language with automatic garbage collection, then I
have a hard time appreciating your sarcastic comments.

There hasn't been much progress in programming language design in the
last 20 years.  It's been incremental at best. Nobody's really
thinking outside the box, as far as I can tell.  Please prove me
wrong.

It's true that we've moved past assembly language.


-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-02 Thread rusi
On Mar 30, 4:37 am, Devin Jeanpierre jeanpierr...@gmail.com wrote:
 On Thu, Mar 29, 2012 at 3:50 PM, Nathan Rice

 nathan.alexander.r...@gmail.com wrote:
  Well, a lisp-like language.  I would also argue that if you are using
  macros to do anything, the thing you are trying to do should classify
  as not natural in lisp :)

 You would run into disagreement. Some people feel that the lisp
 philosophy is precisely that of extending the language to do anything
 you want, in the most natural way.

 At least, I disagree, but my lisp thoughts are the result of
 indoctrination of the Racket crowd.

I guess Paul Graham would likewise disagree.
See 7,8,9 in http://paulgraham.com/diff.html
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-02 Thread Steve Howell
On Mar 31, 1:13 pm, Tim Rowe digi...@gmail.com wrote:

 I know 10 languages. But I'm not telling you what base that number is :)


Well, that means you know at least two programming languages, which
puts you ahead of a lot of people. :)

<obligatory joke>
Some folks, when confronted with a problem, decide to solve it with
binary numbers.  And then they have 10 problems.
</try the fish>

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Python is readable

2012-04-02 Thread alex23
On Mar 31, 2:02 am, Steve Howell showel...@yahoo.com wrote:
 Steven, how do you predict which abstractions are going to be useless?

A useless abstraction is one that does nothing to simplify a problem
*now*:

 being so fixated on over-arching abstract
 concepts that, far from those abstractions making it easier to solve the
 problems they are being paid to solve, they actually make them harder

An abstraction is also useless if it's so clever that it isn't readily
understandable to an average developer; architecture astronauts don't
tend to be big on sticking around to provide ongoing support.
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Python is readable

2012-04-02 Thread Steve Howell
On Mar 30, 11:25 pm, Lie Ryan lie.1...@gmail.com wrote:
 On 03/21/2012 01:44 PM, Steve Howell wrote:

  Also, don't they call those thingies object for a reason? ;)

 A subject is (almost?) always a noun, and so a subject is also an object.

It's true that words that can act as a subject can also act like
objects in other sentences.  That doesn't really answer my question,
though.  Why do we call programming objects objects instead of
calling them subjects or nouns?
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Python is readable

2012-04-02 Thread Steve Howell
On Apr 1, 8:30 pm, alex23 wuwe...@gmail.com wrote:
 On Mar 31, 2:02 am, Steve Howell showel...@yahoo.com wrote:

  Steven, how do you predict which abstractions are going to be useless?

 A useless abstraction is one that does nothing to simplify a problem
 *now*:

That's the very definition of short-sighted thinking.  If it doesn't
do anything *now*


  being so fixated on over-arching abstract
  concepts that, far from those abstractions making it easier to solve the
  problems they are being paid to solve, they actually make them harder

 An abstraction is also useless if it's so clever that it isn't readily
 understandable to an average developer; architecture astronauts don't
 tend to be big on sticking around to provide ongoing support.

You are careless and sloppy with your quoting here.  I didn't say the
above; you should give proper attribution.
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-02 Thread alex23
On Mar 30, 3:37 pm, Nathan Rice nathan.alexander.r...@gmail.com
wrote:
 We live in a world where the tools that are used are based on
 tradition (read that as backwards compatibility if it makes you feel
 better) and as a mechanism for deriving personal identity.  The world
 is backwards and retarded in many, many ways, this problem is
 interesting to me because it actually cuts across a much larger tract
 than is immediately obvious.

Do you produce commercial code in a team? Because going by your
absolutist bullshit here, it certainly doesn't sound like it.

When I join an organisation that requires language A as all of its
systems are written in it, is that 'tradition' or 'personal identity'?
How is 'compatibility' - either with existing systems or existing
*developers* - a backwards and retarded approach to complex
problems?

If I've chosen language A because some aspect of its syntax maps
better onto my mind (or for _whatever_ reason that makes individuals
prefer one language to another), and you've chosen language B: who
gets to decide which is the 'superior' language, which is the 'better'
mapping etc?

You're arguing for a top-down centralised approach to language
development that just will _never_ exist, simply because it cannot. If
you don't accept that, I believe there's a fascinating fork called
Python 4000 where your ideas would be readily adopted.
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Python is readable

2012-04-02 Thread alex23
On Mar 31, 6:30 am, Neil Cerutti ne...@norwich.edu wrote:
 See, for example, Inform 7, which translates a subset of English
 into Inform 6 code. I never thought too deeply about why I
 disliked it, assuming it was because I already knew Inform 6.

I've always respected Inform 7 while being also unwilling to use it.
When you're trying to make your language grammar a subset of your data
grammar and then combine them interchangeably, it can be unclear about
where the error lies, even if the combination is done in prescribed
and restricted ways. Whereas keywords are key words: they give texture
to source code, like visual braille.
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-02 Thread Chris Angelico
On Fri, Mar 30, 2012 at 2:48 AM, Steve Howell showel...@yahoo.com wrote:
 I agree with you on the overall point, but I think that Python
 actually does a fine job of replacing REXX and PHP.  I've used both of
 the latter (and, of course, Python).  REXX and PHP are great at what
 they do, but I don't think their slight advantages over Python justify
 all the weight they carry--incompatible syntax to Python, archaic
 libraries, missing modern language features, etc.

I think you're probably right about REXX, mainly because it's somewhat
old now. It was an awesome language when I first met it back in the
1990s; it tied in very nicely with OS/2, it was (and is) easy to
extend and embed with C, it had excellent GUI facilities (as long as
you don't need it to be cross-platform). But today, REXX is somewhat
outclassed. I don't recommend it to people for most tasks, unless
they're actually on OS/2 (in which case they probably know it
already). Unicode support and cross-platform GUI toolkits would
probably be REXX's two biggest lacks.

As to PHP? I don't think it's great at what [it] [does], frankly. At
least, it's not great at what it's often used for. PHP is adequate as
a variant of HTML that allows scripting, but it's usually used today
as though it were a CGI script, and for that it's less than optimal.
For instance, you can't have an include file without it also being an
entry point of its own (eg someone could go to
http://www.example.com/common_functions.php), so you need code to
protect against that. Huge frameworks have such code peppered
throughout.

(As a side point, I don't believe that a web server's CGI scripts
should be updated simply by writing to the disk. It's pretty easy to
get non-atomicity problems when you have a page and its include file.
There ARE other options, but I don't know of any efficient ways to do
it in Python.)

 Python should also be a perfectly good superset of Bash Scripting
 language.  (To the extent that Python isn't, there's nothing intrinsic
 about the language that prevents you from orchestrating processes.)

Hmm... How do you pipe one command's output into another's input using
Python? It's not nearly as clean as it is in bash.

 I think the problem these days is that the programmer's brain is like
 a small toolbox.  Maybe twenty tools fit in the toolbox.  Instead of
 filling it up with 20 useful tools, a lot of us have it cluttered up
 with ten hammers, when only one of the hammers is what we need for the
 nails.

Maybe. But you can also have a little catalogue in there that reminds
you of the tools you have in your shed. If you keep that catalogue up
to date and accurate, you can hunt down those tools you seldom use
when you need them.

ChrisA
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-02 Thread Chris Angelico
On Sun, Apr 1, 2012 at 6:23 AM, Steve Howell showel...@yahoo.com wrote:
 On Mar 31, 1:13 pm, Tim Rowe digi...@gmail.com wrote:

 I know 10 languages. But I'm not telling you what base that number is :)


 Well, that means you know at least two programming languages, which
 puts you ahead of a lot of people. :)

That's enough to use the phone support code word. (I'm of the opinion
that you shouldn't be allowed to use it unless you are qualified to
respond to it.)

ChrisA
who knows 806 comic strips
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Python is readable

2012-04-02 Thread Chris Angelico
On Sat, Mar 31, 2012 at 4:07 PM, Steve Howell showel...@yahoo.com wrote:
 On Mar 30, 1:20 pm, Chris Angelico ros...@gmail.com wrote:

 Totally. That's why we're all still programming in assembly language
 and doing our own memory management, because we would lose a lot of
 personal value if programming stopped being so difficult. If it
 weren't for all these silly new-fangled languages with their automatic
 garbage collection and higher order function handling, we would all be
 commanding much higher salaries.


 While I don't subscribe to the conspiracy theory that programmers
 invest in arcane practices to preserve personal value [paraphrase of
 Nathan], surely you could come up with a better argument than garbage
 collection.

GC is to programming what running water or a fridge is to a kitchen.
It isn't exactly new, in fact you probably would expect it even in a
fairly old kitchen, but you still wouldn't want to give it up. A
somewhat newer example would be the ability to pass higher-order
objects around - functions, mappings, lists, etc - and the basic
concept that an expression resulting in (or function returning) an
object is identical to a variable containing that object. Not every
language supports that.
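A minimal illustration of that idea in Python, where a function is just
another object that expressions can produce and variables can hold:

```python
def shout(s):
    return s.upper() + "!"

# Functions stored like any other value...
handlers = {"loud": shout, "quiet": str.lower}

# ...and an expression yielding a function is interchangeable
# with a variable bound to that function.
pick = handlers["loud"]
assert pick("hi") == shout("hi") == "HI!"

def apply_twice(f, x):
    return f(f(x))

assert apply_twice(str.strip, "  pad  ") == "pad"
```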

 There hasn't been much progress in programming language design in the
 last 20 years.  It's been incremental at best. Nobody's really
 thinking outside the box, as far as I can tell.  Please prove me
 wrong.

 It's true that we've moved past assembly language.

Twenty years? That would almost certainly include solid Unicode
support. Anything that dates back to 1992 is unlikely to truly
acknowledge the difference between bytes and characters. That's
probably incremental, but a fairly big increment.

ChrisA
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-02 Thread Tim Chase

PHP is a language that I wish would die off quickly and
gracefully.  I feel like the good things of PHP have already
been subsumed into the ecosystems of stronger programming
languages (including Python).


The one killer feature PHP has to offer over other languages: 
ease of efficient deployment on cheap/free hosting.  When I go to 
deploy Python projects, it ends up either being as slow CGI on 
the cheap/free hosting, or it ends up needing 
WSGI/gunicorn/whatever on a more expensive service (whether 
shared, VPS, or full hardware).


I dream of a day that deploying Python/Django apps is as 
cheap/easy as deploying PHP code.


-tkc



--
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-02 Thread Steve Howell
On Apr 2, 2:50 pm, Chris Angelico ros...@gmail.com wrote:
 On Fri, Mar 30, 2012 at 2:48 AM, Steve Howell showel...@yahoo.com wrote:
  I agree with you on the overall point, but I think that Python
  actually does a fine job of replacing REXX and PHP.  I've used both of
  the latter (and, of course, Python).  REXX and PHP are great at what
  they do, but I don't think their slight advantages over Python justify
  all the weight they carry--incompatible syntax to Python, archaic
  libraries, missing modern language features, etc.

 I think you're probably right about REXX, mainly because it's somewhat
 old now. It was an awesome language when I first met it back in the
 1990s; it tied in very nicely with OS/2, it was (and is) easy to
 extend and embed with C, it had excellent GUI facilities (as long as
 you don't need it to be cross-platform). But today, REXX is somewhat
 outclassed. I don't recommend it to people for most tasks, unless
 they're actually on OS/2 (in which case they probably know it
 already). Unicode support and cross-platform GUI toolkits would
 probably be REXX's two biggest lacks.

 As to PHP? I don't think it's great at what [it] [does], frankly. At
 least, it's not great at what it's often used for. PHP is adequate as
 a variant of HTML that allows scripting, but it's usually used today
 as though it were a CGI script, and for that it's less than optimal.
 For instance, you can't have an include file without it also being an
 entry point of its own (eg someone could go to
 http://www.example.com/common_functions.php), so you need code to
 protect against that. Huge frameworks have such code peppered
 throughout.

 (As a side point, I don't believe that a web server's CGI scripts
 should be updated simply by writing to the disk. It's pretty easy to
 get non-atomicity problems when you have a page and its include file.
 There ARE other options, but I don't know of any efficient ways to do
 it in Python.)

  Python should also be a perfectly good superset of Bash Scripting
  language.  (To the extent that Python isn't, there's nothing intrinsic
  about the language that prevents you from orchestrating processes.)

 Hmm... How do you pipe one command's output into another's input using
 Python? It's not nearly as clean as it is in bash.


For pipes, I'd still call out to bash.  I know that's cheating, but
the idea is that Python can wrap all the good parts of bash while
still allowing you to use Python's more modern syntax, standard
library, etc.
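For what it's worth, a pure-Python pipeline is possible with the
subprocess module, though it rather supports the point about bash being
cleaner. A minimal sketch (the printf/grep commands are arbitrary
stand-ins and assume a Unix-like system):

```python
import subprocess

# Equivalent of the bash pipeline: printf 'apple\nbanana\napricot\n' | grep ap
p1 = subprocess.Popen(["printf", r"apple\nbanana\napricot\n"],
                      stdout=subprocess.PIPE)
p2 = subprocess.Popen(["grep", "ap"], stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()  # let p1 receive SIGPIPE if p2 exits early
out, _ = p2.communicate()
print(out.decode())  # apple and apricot; banana is filtered out
```

It works, but compared to cmd1 | cmd2 it is undeniably verbose.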



-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-02 Thread Nathan Rice
On Sun, Apr 1, 2012 at 11:18 PM, alex23 wuwe...@gmail.com wrote:
 On Mar 30, 3:37 pm, Nathan Rice nathan.alexander.r...@gmail.com
 wrote:
 We live in a world where the tools that are used are based on
 tradition (read that as backwards compatibility if it makes you feel
 better) and as a mechanism for deriving personal identity.  The world
 is backwards and retarded in many, many ways, this problem is
 interesting to me because it actually cuts across a much larger tract
 than is immediately obvious.

 Do you produce commercial code in a team? Because going by your
 absolutist bullshit here, it certainly doesn't sound like it.

Think of me like the Wolf, the cleaner in Pulp Fiction that Marsellus
Wallace calls in to take care of the mess when Jules accidentally blows
a kid's brains out in the back of a car.  I get called in when my
skills are needed, and when the mess has been handled and things are
back to normal I take my leave.

 When I join an organisation that requires language A as all of its
 systems are written in it, is that 'tradition' or 'personal identity'?
 How is 'compatibility' - either with existing systems or existing
 *developers* - a backwards and retarded approach to complex
 problems?

I don't care what people do related to legacy systems.  There will
always be a COBOL.  I do care about programmers that are too lazy to
learn, and would be happy to ignore the fact that programming is hard
for most people to learn, so they can continue not learning.  Those
programmers are scumbags.

Just don't let me hear you complaining because some syntax is not C
like enough for you.  Whenever I hear that I want to strangle the
self-serving 'tard that wrote it.  When I see people defending C
like syntax as optimal or somehow much more expressive, that makes me
doubly irritated.  These are the people who are selfishly defending
the status quo because they're invested.  If you're going to be
selfish and inconsiderate at least be honest about it, rather than
pretending that one of the earliest languages somehow got almost
everything right and should be the basis for new languages till the
end of time.  This goes for most of the ALGOL derived languages.  I
don't have a problem if you know your language well and are happy
using it, that's great.  Don't try to delude people that our modern
ALGOL derivatives are the best possible way to model knowledge
(including process knowledge) to a computer, because that is a lie.

 If I've chosen language A because some aspect of its syntax maps
 better onto my mind (or for _whatever_ reason that makes individuals
 prefer one language to another), and you've chosen language B: who
 gets to decide which is the 'superior' language, which is the 'better'
 mapping etc?

You should be able to live in your reality if you want, as long as that
doesn't impinge on others.  Of course, if you disagree on basic
grammar, then I would have to ask you, do you disagree about English
grammar, or have you accepted it so that you can communicate with
people?  This is why I advocate following English grammar closely for
syntax - people have accepted it and don't make a big deal, and it is
the way we represent information already.

 You're arguing for a top-down centralised approach to language
 development that just will _never_ exist, simply because it cannot. If
 you don't accept that, I believe there's a fascinating fork called
 Python 4000 where your ideas would be readily adopted.

You completely missed my point.  In fact, my argument is for a bottom
up approach, with a meeting point which is much closer than the
machine code which is currently used.  However you want to represent
it, the knowledge is the same, and that is what matters.  We need to
get past the idea of different, incompatible languages, and settle on
a common knowledge representation format that underlies all languages,
and is compatible.  If you want to make an alex23 DSL where up is down
and inside is upside down, go for it, just as long as it is
represented in a sensible set of semantic primes that I can transform
to whatever reality I want.
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-04-02 Thread alex23
On Apr 3, 2:55 pm, Nathan Rice nathan.alexander.r...@gmail.com
wrote:
 I don't care what people do related to legacy systems.

And that's what earns you the label 'architecture astronaut'. Legacy
systems are _part_ of the problem; it's very easy to hold to a purist
approach when you ignore the bulk of the domain that causes the
issues. There's _never_ going to be an InfoTech3k where we just stop
supporting older code.

 I do care about programmers that are too lazy to
 learn, and would be happy to ignore the fact that programming is hard
 for most people to learn, so they can continue not learning.  Those
 programmers are scumbags.

Wait, what?

Programmers are both too lazy to learn and yet somehow happy that
the skills they've acquired are too hard for most people to learn?
So how did they learn them?

And they're also somehow lazy because they have to learn multiple
languages to be effective,  rather than one mythical ur-language?

In my 20 years as a software developer, I have _never_ encountered
anyone trying to deliberately expand the knowledge gap. This isn't a
priesthood.

 Just don't let me hear you complaining because some syntax is not C
 like enough for you.  Whenever I hear that I want to strangle the
 self-serving 'tard that wrote it.  When I see people defending C
 like syntax as optimal or somehow much more expressive, that makes me
 doubly irritated.  These are the people who are selfishly defending
 the status quo because they're invested.

Syntax is never the issue, it's the deeper semantics. Is the scoping
of one C-like language the same as C? How does it differ? Why does it
differ? Is the difference a fundamental implementation issue that you
really need to know before you actually grok the language? Are
functions first-class objects? Are they actual objects or some kind of
magical stub? Can you extend those objects with properties? etc etc

Every language tackles _so many_ things differently. It's not lazy to
say that you prefer something to resemble/be based on a language you
have experience with, that's human nature. If you're insistent that
your non-typical syntax is so much better, the onus is on you to prove
it, not to insist that the lack of uptake is 'laziness'.

And once again: code is _communication_. Not having to understand new
optimal patterns for every single language is a Good Thing.

 Don't try to delude people that our modern
 ALGOL derivatives are the best possible way to model knowledge
 (including process knowledge) to a computer, because that is a lie.

Um, okay, I'll stop doing that...not that I've ever seen anyone make
that claim...

A large part of what makes languages popular _is their popularity_. In
many ways, ALGOL is English to your hypothetical language's Lojban.
You can argue until the end of time for the superiority of Lojban due
to its lack of ambiguity; it's not going to affect its acquisition
at all.

 You should be able to live in your reality if you want, as long as that
 doesn't impinge on others.  Of course, if you disagree on basic
 grammar, then I would have to ask you, do you disagree about English
 grammar, or have you accepted it so that you can communicate with
 people?  This is why I advocate following English grammar closely for
 syntax - people have accepted it and don't make a big deal, and it is
 the way we represent information already.

And programmers have accepted ALGOL and don't etc

The idea of coding in English just fills me with horror and dread.
COBOL died for a reason.

  You're arguing for a top-down centralised approach to language
  development that just will _never_ exist, simply because it cannot. If
  you don't accept that, I believe there's a fascinating fork called
  Python 4000 where your ideas would be readily adopted.

 You completely missed my point.  In fact, my argument is for a bottom
 up approach, with a meeting point which is much closer than the
 machine code which is currently used.

You missed my point; I was referring more to the _adoption_ of your
ur-language. The only way to push this is to force it on everyone.

 However you want to represent
 it, the knowledge is the same, and that is what matters.  We need to
 get past the idea of different, incompatible languages, and settle on
 a common knowledge representation format that underlies all languages,
 and is compatible.  If you want to make an alex23 DSL where up is down
 and inside is upside down, go for it, just as long as it is
 represented in a sensible set of semantic primes that I can transform
 to whatever reality I want.

So effectively for any given project I'd need to know: the underlying
representation (because we have to be able to discuss _something_ as a
team), my DSL, how my DSL transforms to the underlying representation,
and to be really effective, every team member's DSL and how it
transforms. Because _no one_ on my team works alone, debugs alone 100%
of the time.

How do I share cool patterns? Show them the underlying 

Re: Python is readable

2012-03-31 Thread Lie Ryan

On 03/18/2012 12:36 PM, Steven D'Aprano wrote:

On Sat, 17 Mar 2012 20:59:34 +0100, Kiuhnm wrote:
In the second example, most English speakers would intuit that print(i)
prints i, whatever i is.


There are two points where the code may be misunderstood: a beginner may
think that "print i" prints to the inkjet printer (I remember
unplugging my printer when I wrote my first BASIC program for this
reason); and the possible confusion of whether "print i" prints the
letter i or the content of variable i. (Fortunately, this confusion
is easily resolved when I run the code and see the result on-screen
instead of a job on the print spooler.)


(Ironically, although print is nowadays programming jargon for
outputting to the screen, in the old dark ages people used the
print statement to print to paper on their terminals.)


--
http://mail.python.org/mailman/listinfo/python-list


Re: Python is readable

2012-03-31 Thread Lie Ryan

On 03/21/2012 03:55 AM, Nathan Rice wrote:

In mathematics, when you perform global optimization you must be
willing to make moves in the solution space that may result in a
temporary reduction of your optimality condition.  If you just perform
naive gradient descent, only looking to the change that will induce the
greatest immediate improvement in optimality, you will usually end up
orbiting around a solution which is not globally optimal.  I mention
this because any readability or usability information gained using
trained programmers is simultaneously measuring the readability or
usability and its conformance to the programmer's cognitive model of
programming.  The result is local optimization around the
current-paradigm minimum.  This is why we have so many nearly
identical curly brace C-like languages.


I think you've just described that a greedy algorithm can't always find 
the globally optimal solution.
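A small numeric illustration of that point; the function and step size
here are arbitrary choices for the demo:

```python
def grad_descent(f_prime, x, lr=0.01, steps=1000):
    # Naive gradient descent: always take the locally best downhill step.
    for _ in range(steps):
        x -= lr * f_prime(x)
    return x

# f(x) = x**4 - 3*x**2 + x has two basins: the global minimum near x = -1.30
# and a shallower local minimum near x = 1.13.  Greedy descent from x = 2.0
# rolls into the nearer, worse basin and stays there.
def fp(x):
    return 4 * x**3 - 6 * x + 1  # derivative of f

print(round(grad_descent(fp, 2.0), 2))  # 1.13, not the global optimum
```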


--
http://mail.python.org/mailman/listinfo/python-list


Re: Python is readable

2012-03-31 Thread Lie Ryan

On 03/21/2012 01:44 PM, Steve Howell wrote:

Also, don't they call those thingies object for a reason? ;)


A subject is (almost?) always a noun, and so a subject is also an object.

--
http://mail.python.org/mailman/listinfo/python-list


Re: Python is readable

2012-03-31 Thread Chris Angelico
On Sat, Mar 31, 2012 at 10:01 AM, Nathan Rice
nathan.alexander.r...@gmail.com wrote:
 It seems to me that Indented blocks of text are used pretty frequently
 to denote definition bodies, section subordinate paragraphs and
 asides.  The use of the colon seems pretty natural too.  Parentheses
 are fairly natural for small asides.  The notion of character
 delimiters for large sections of text is actually pretty unnatural
 with the exception of "quotes".

Perhaps in formal written English, but not in spoken, and often not in
casual writing either. Play around with my actual example, an if
clause, and see where the punctuation goes in English - and how easily
you can construct ambiguous sentences.

 I don't like declarations, my personal preference is to have typed
 signatures, and implicit declaration with type inference elsewhere.  I
 view it as a matter of personal preference though, the result should
 be the same, and it should be possible to view the code either way.

I'm not sure what you mean by typed signatures, can you elaborate please?

ChrisA
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Python is readable

2012-03-31 Thread Nathan Rice
On Sat, Mar 31, 2012 at 2:15 AM, Lie Ryan lie.1...@gmail.com wrote:
 On 03/21/2012 03:55 AM, Nathan Rice wrote:

 snip

 I think you've just described that a greedy algorithm can't always find the
 globally optimal solution.

Right.  Using gradient descent on an algebraic surface is probably the
most natural example of why this is the case, since balls rolling down
a surface from a starting point to the bottom of a bowl is an exact
analogy.

On Sat, Mar 31, 2012 at 4:05 AM, Chris Angelico ros...@gmail.com wrote:
 On Sat, Mar 31, 2012 at 10:01 AM, Nathan Rice
 nathan.alexander.r...@gmail.com wrote:
 It seems to me that Indented blocks of text are used pretty frequently
 to denote definition bodies, section subordinate paragraphs and
 asides.  The use of the colon seems pretty natural too.  Parentheses
 are fairly natural for small asides.  The notion of character
 delimiters for large sections of text is actually pretty unnatural
 with the exception of "quotes".

 Perhaps in formal written English, but not in spoken, and often not in
 casual writing either. Play around with my actual example, an if
 clause, and see where the punctuation goes in English - and how easily
 you can construct ambiguous sentences.

Sure

an event has occurred recently if it occurred in the last time step.

if xyz has occurred recently, that implies abc will occur in the next time step.

when event abc occurs, all unbound floops become bound, and at most
three newly bound floops are eaten by blargs.

blargs that have not eaten in the last 3 time steps eat before blargs
that have eaten in those time steps.

Notice I don't talk about HOW anything is done, just the logic of what
is happening.  The computer should be capable of making an inventory
of exactly what it will need to do given the statements I have made,
and choose the best data structures and algorithms for the job.  If we
are in undecidable/halting problem territory (indefinite recursion)
then the computer should at least be nice enough to tell you it is
confused and would like some help.
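As a rough sketch of stating the what and leaving the how to the
machine: the last rule can be written as a single ordering criterion,
with the runtime free to choose how the sort is carried out (blargs and
time steps are, of course, the post's own hypotheticals):

```python
# Rule: blargs that have not eaten in the last 3 time steps eat first.
def feeding_order(blargs, now, window=3):
    # False (starving) sorts before True (recently fed); the sort is stable.
    return sorted(blargs, key=lambda b: b["last_ate"] >= now - window)

blargs = [{"name": "a", "last_ate": 9}, {"name": "b", "last_ate": 2}]
print([b["name"] for b in feeding_order(blargs, now=10)])  # ['b', 'a']
```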

 I don't like declarations, my personal preference is to have typed
 signatures, and implicit declaration with type inference elsewhere.  I
 view it as a matter of personal preference though, the result should
 be the same, and it should be possible to view the code either way.

 I'm not sure what you mean by typed signatures, can you elaborate please?

Just like the standard way in the Haskell community.  To demonstrate
using Python annotations...

def myfunc(a : Sequence, b : Integral, c : Map) -> Sequence:
...

Given starting types and ending types, you can correctly infer some
VERY complex internal types. Haskell will let you omit signature types
as well, but that is really a bad idea because they add readability
and you will have to add them often anyhow if you are dealing with
complex types.  Better to be consistent...

As a funny aside, people usually provide input type and return type
annotations to python functions, as part of the docstring (for
sphinx).

To be honest, I like having types baked into my code, though not
nominal types (the sort that is an A because it was declared as an A
or a subclass of A), but rather structural types (i.e. Abstract base
classes, or Java interfaces, if you didn't have to add the "implements
...").  I don't like having to verbosely tell the computer about the
types of everything I'm going to use, I only care that it gives me the
output I want if I give it some agreed upon input.  It should be smart
enough to check the little things.
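A minimal sketch of that structural style using Python's
collections.abc (the function and data are made up for illustration):
the check cares only that the arguments behave like a Sequence and a
Mapping, not what class they were declared as.

```python
from collections.abc import Mapping, Sequence

def lookup_all(keys: Sequence, table: Mapping) -> list:
    # Structural check: list, tuple, dict, or any registered workalike passes.
    if not (isinstance(keys, Sequence) and isinstance(table, Mapping)):
        raise TypeError("need sequence-like keys and mapping-like table")
    return [table[k] for k in keys if k in table]

print(lookup_all(["a", "c", "z"], {"a": 1, "b": 2, "c": 3}))  # [1, 3]
```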
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Python is readable

2012-03-31 Thread MRAB

On 31/03/2012 06:56, Lie Ryan wrote:

On 03/18/2012 12:36 PM, Steven D'Aprano wrote:

 On Sat, 17 Mar 2012 20:59:34 +0100, Kiuhnm wrote:
 In the second example, most English speakers would intuit that print(i)
 prints i, whatever i is.


There are two points where the code may be misunderstood: a beginner may
think that "print i" prints to the inkjet printer (I remember
unplugging my printer when I wrote my first BASIC program for this
reason); and the possible confusion of whether "print i" prints the
letter i or the content of variable i. (Fortunately, this confusion
is easily resolved when I run the code and see the result on-screen
instead of a job on the print spooler.)

(Ironically, although print is nowadays programming jargon for
outputting to the screen, in the old dark ages people used the
print statement to print to paper on their terminals.)


I remember a review of a machine back in the early 1980s or late 1970s.
The machine used BASIC, but the reviewer was surprised when the printer
sprang into life. PRINT meant send to printer; in order to 'print'
to the screen you used DISPLAY. (The more common method was to use
LPRINT for sending to the printer.)
--
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-31 Thread Tim Rowe
On 22 March 2012 19:14, Chris Angelico ros...@gmail.com wrote:

 In any case, though, I agree that there's a lot of people
 professionally writing code who would know about the 3-4 that you say.
 I'm just not sure that they're any good at coding, even in those few
 languages. All the best people I've ever known have had experience
 with quite a lot of languages.

I know 10 languages. But I'm not telling you what base that number is :)

-- 
Tim Rowe
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-31 Thread David Robinow
On Sat, Mar 31, 2012 at 4:13 PM, Tim Rowe digi...@gmail.com wrote:

 I know 10 languages. But I'm not telling you what base that number is :)
The fact that you know there are bases other than 10 puts you in the
top half of the candidates already!
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-31 Thread Hannu Krosing
On Sat, 2012-03-31 at 18:55 -0400, David Robinow wrote:
 On Sat, Mar 31, 2012 at 4:13 PM, Tim Rowe digi...@gmail.com wrote:
 
  I know 10 languages. But I'm not telling you what base that number is :)
  The fact that you know there are bases other than 10 puts you in the
 top half of the candidates already!

I'm sure he really had base 10 in mind *

Hannu


* But as this 10 is in his chosen base, it is a tautology  :)



-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Python is readable

2012-03-30 Thread Steven D'Aprano
On Fri, 30 Mar 2012 00:38:26 -0400, Nathan Rice wrote:

 He did no such thing. I challenge you to find me one place where Joel
 has *ever* claimed that the very notion of abstraction is
 meaningless or without use.
 [snip quote]
 To me, this directly indicates he views higher order abstractions
 skeptically,

 Yes he does, and so we all should, but that's not the claim you made.
 You stated that he fired the broadsides at the very notion of
 abstraction. He did no such thing. He fired a broadside at (1)
 software hype based on (2) hyper-abstractions which either don't solve
 any problems that people care about, or don't solve them any better
 than more concrete solutions.
 
 Mathematics is all about abstraction.  There are theories and structures
 in mathematics that have probably gone over a hundred years before being
 applied.  As an analogy, just because a spear isn't useful while farming
 doesn't mean it won't save your life when you venture into the woods and
 come upon a bear.

A spear is a concrete application of the principle of leverage, not an 
abstraction. I also point out leverage was discovered experimentally long 
before anyone had an abstraction for it.

In any case, so what? Who is saying that mathematics is useless? Not me, 
and not Joel Spolksy. You are hunting strawmen with your non-abstract 
spear.

Spolsky has written at least three times about Architecture Astronauts, 
and made it abundantly clear that the problem with them is that they 
don't solve problems, they invent overarching abstractions that don't do 
anything useful or important, and hype them everywhere.

http://www.joelonsoftware.com/articles/fog18.html
http://www.joelonsoftware.com/items/2005/10/21.html
http://www.joelonsoftware.com/items/2008/05/01.html

Jeff Atwood provides a simple test for the difference between a useful 
abstraction and an Architecture Astronaut hyper-abstraction:

Does it solve a useful problem?

http://www.codinghorror.com/blog/2004/12/it-came-from-planet-architecture.html

You keep reading this as an assault on abstract mathematics, science and 
knowledge for its own sake. It isn't any of these things.

If I'm paid to solve a problem, and instead I build an abstraction that 
doesn't help solve the problem, then I'm guilty of doing architecture 
astronauting.

 
 and assumes because he does not see meaning in them, they don't hold
 any meaning.

 You are making assumptions about his mindset that not only aren't
 justified by his comments, but are *contradicted* by his comments. He
 repeatedly describes the people coming up with these hyper-abstractions
 as great thinkers, clever thinkers, etc. who are seeing patterns in
 what people do. He's not saying that they're dummies. He's saying that
 they're seeing patterns that don't mean anything, not that the patterns
 aren't there.
 
 He is basically saying they are too clever for their own good, as a
 result of being fixated upon purely intellectual constructs.

Yes, and he is right to do so, because that is the characteristic of the 
Architecture Astronaut: being so fixated on over-arching abstract 
concepts that, far from those abstractions making it easier to solve the 
problems they are being paid to solve, they actually make them harder.

Good abstractions enable problems to be solved. Bad abstractions don't.

If I ask you to build me a website, I probably don't want a
website-builder, I certainly don't want a website-builder-generator, and
I absolutely do not want you to generalise the concept of a compiler and
create a whole new abstract language for describing meta-compilers so
that I can create a brand new programming language for generating
meta-compilers that build compilers that will build factories for building
website generators so I can make my own website in just three easy steps
(the simplest one of which is about twice as difficult as just building
the website would have been).

If you are being paid to build abstractions in the ivory tower, on the 
chance that one in a thousand abstractions turns out to be a game 
changer, or just because of a love of pure knowledge, that's great. I 
love abstract mathematics too. I read maths in my free time, you won't 
find me saying that it is bad or useless or harmful. But does it solve 
real problems?

Well, of course it does, and often in a most surprising places. But 
that's because out of the hundred thousand abstractions, we see the 
hundred that actually solve concrete problems. The other 99,999 exist 
only in forgotten journals, or perhaps the odd book here or there.

This is all well and good. It's not meant as an attack on mathematics. 
You can't tell ahead of time which abstractions will solve real problems. 
*Somebody* has to be thinking about ways that spherical camels can pass 
through the eye of a 17-dimensional needle, because you never know when 
somebody else will say, Hey, that's just what we need to make low-fat 
chocolate ice cream that doesn't taste like 

Re: Python is readable

2012-03-30 Thread Nathan Rice
 Mathematics is all about abstraction.  There are theories and structures
 in mathematics that have probably gone over a hundred years before being
 applied.  As an analogy, just because a spear isn't useful while farming
 doesn't mean it won't save your life when you venture into the woods and
 come upon a bear.

 A spear is a concrete application of the principle of leverage, not an
 abstraction. I also point out leverage was discovered experimentally long
 before anyone had an abstraction for it.

And an analogy is a device to demonstrate the fundamental character of
an argument in a different context.

 In any case, so what? Who is saying that mathematics is useless? Not me,
 and not Joel Spolksy. You are hunting strawmen with your non-abstract
 spear.

I don't think it is a strawman.  He decries things that aren't
immediately useful.  That describes almost all pure math.  If he had
excluded things that have some characteristic of truth, and just
talked about overly general systems, I might agree with him.

 Spolsky has written at least three times about Architecture Astronauts,
 and made it abundantly clear that the problem with them is that they
 don't solve problems, they invent overarching abstractions that don't do
 anything useful or important, and hype them everywhere.

I believe in the idea of "things should be as simple as possible, but
not simpler".  Programming as it currently exists is absolutely
convoluted.  I am called on to help people learn to program from time
to time, and I can tell you that we still have a LONG way to go before
programming approaches a global optimum in either the semantic or
syntactic space.  Never mind configuring a build or anything else
related to projects.  The whole setup really is horrible, and I'm
convinced that most of the people who are capable of changing this are
more concerned about their personal investment in the way things are
than helping others.  There are a few exceptions like Alan Kay, but
mostly people want to graft shortcuts on to what already exists.

 You keep reading this as an assault on abstract mathematics, science and
 knowledge for its own sake. It isn't any of these things.

I never said it was an attack on science.  Scientists don't really do
abstraction, they explain observations.  Mathematicians are the ones
who discover truth that may be completely disconnected from reality.

 If I'm paid to solve a problem, and instead I build an abstraction that
 doesn't help solve the problem, then I'm guilty of doing architecture
 astronauting.

If I ignore everything Joel wrote and just use that definition, I
agree with you.

 He is basically saying they are too clever for their own good, as a
 result of being fixated upon purely intellectual constructs.

 Yes, and he is right to do so, because that is the characteristic of the
 Architecture Astronaut: being so fixated on over-arching abstract
 concepts that, far from those abstractions making it easier to solve the
 problems they are being paid to solve, they actually make them harder.

 Good abstractions enable problems to be solved. Bad abstractions don't.

 If I ask you to build me a website, I probably don't want a website-
 builder, I certainly don't want a website-builder-generator, and I
 absolutely do not want you to generalise the concept of a compiler and
 create a whole new abstract language for describing meta-compilers so
 that I can create a brand new programming language for generating meta-
 compilers that build compilers that will build factories for building
 website generators so I can make my own website in just three easy steps
 (the simplest one of which is about twice as difficult as just building
 the website would have been).

Again, I follow the principle of "everything should be as simple as
possible, but no simpler".  I have in the past built website builders
rather than build websites (and done a similar thing in other cases),
but not because I am trying to discover some fundamental truth of
website building, because that would be bullshit.  I did it because
building websites (or whatever else it was) is a boring, tedious
problem, and a website builder, while being more work, is an engaging
problem that requires thought.  I enjoy challenging myself.

 If you are being paid to build abstractions in the ivory tower, on the
 chance that one in a thousand abstractions turns out to be a game
 changer, or just because of a love of pure knowledge, that's great. I
 love abstract mathematics too. I read maths in my free time, you won't
 find me saying that it is bad or useless or harmful. But does it solve
 real problems?

You forget that even abstractions that never directly get turned into
something real are almost invariably part of the intellectual
discourse that leads to real things.

 Well, of course it does, and often in a most surprising places. But
 that's because out of the hundred thousand abstractions, we see the
 hundred that actually solve concrete problems. The 

Re: Python is readable

2012-03-30 Thread Chris Angelico
On Sat, Mar 31, 2012 at 12:46 AM, Nathan Rice
nathan.alexander.r...@gmail.com wrote:
 I believe in the idea of things should be as simple as possible, but
 not simpler.  Programming as it currently exists is absolutely
 convoluted.  I am called on to help people learn to program from time
 to time, and I can tell you that we still have a LONG way to go before
 programming approaches a global optimum in either the semantic or
 syntactic space.

Aside from messes of installing and setting up language
interpreters/compilers, which are averted by simply having several
pre-installed (I think my Linux boxes come with some Python 2 version,
Perl, bash, and a few others), that isn't really the case. Starting a
Python script is easy. Programming gets convoluted only when, and to
the extent that, the task does.

ChrisA
-- 
http://mail.python.org/mailman/listinfo/python-list


RE: Python is readable

2012-03-30 Thread Prasad, Ramit
  My aunt makes the best damn lasagna you've ever tasted without any
  overarching abstract theory of human taste. And if you think that
 quantum
  mechanics is more difficult than understanding human perceptions of
  taste, you are badly mistaken.
 
 Taste is subjective, and your aunt probably started from a good recipe
 and tweaked it for local palates.  That recipe could easily be over a
 hundred years old.  An overarching mathematical theory of human
 taste/mouth perception, if such a silly thing were to exist, would be
 able to generate new recipes that were perfect for a given person's
 tastes very quickly.
 
 Additionally, just to troll this point some more (fun times!), I would
 argue that there is an implicit theory of human taste (chefs refer to
 it indirectly as gastronomy) that is very poorly organized and lacks
 any sort of scientific rigor.  Nonetheless, enough empirical
 observations about pairings of flavors, aromas and textures have been
 made to guide the creation of new recipes.  Gastronomy doesn't need to
 be organized or rigorous because fundamentally it isn't very
 important.

I cannot live without eating; I can live just fine without math. 
Your opinion that gastronomy is fundamentally unimportant is 
fundamentally flawed. 

  In any case, Spolsky is not making a general attack on abstract science.
  Your hyperbole is completely unjustified.
 
 The mathematics of the 20th century (from the early 30s onward) tends
 to get VERY abstract, in just the way Joel decries.  Category theory,
 model theory, modern algebraic geometry, topos theory, algebraic graph
 theory, abstract algebras and topological complexes are all very
 difficult to understand because they seem so incredibly abstract, yet
 most of them already have important applications.  I'm 100% positive
 if you just presented Joel with seminal papers in some of those areas,
 he would apply the astronaut rubber stamp, because the material is
 challenging, and he wouldn't get it (I love math, and I've had to read
 some papers 10+ times before they click).

I do not think that you can compare the abstract with the real world.
Joel talks in the context of solving real-world problems for a living
and producing tangible results to justify employment. It is only fair 
to talk about mathematics in the same context. Or vice-versa.

Joining-the-trolling-bandwagon,
Ramit


Ramit Prasad | JPMorgan Chase Investment Bank | Currencies Technology
712 Main Street | Houston, TX 77002
work phone: 713 - 216 - 5423



Re: Python is readable

2012-03-30 Thread Nathan Rice
On Fri, Mar 30, 2012 at 12:20 PM, Chris Angelico ros...@gmail.com wrote:
 On Sat, Mar 31, 2012 at 12:46 AM, Nathan Rice
 nathan.alexander.r...@gmail.com wrote:
 I believe in the idea of things should be as simple as possible, but
 not simpler.  Programming as it currently exists is absolutely
 convoluted.  I am called on to help people learn to program from time
 to time, and I can tell you that we still have a LONG way to go before
 programming approaches a global optimum in either the semantic or
 syntactic space.

 Aside from messes of installing and setting up language
 interpreters/compilers, which are averted by simply having several
 pre-installed (I think my Linux boxes come with some Python 2 version,
 Perl, bash, and a few others), that isn't really the case. Starting a
 Python script is easy. Programming gets convoluted only when, and to
 the extent that, the task does.

It is true that program complexity is correlated with problem
complexity, but language and environment complexity is also undeniable.  If you
want to prove this to yourself, find someone who is intelligent and
has some basic level of computer literacy, sit them down at a computer
and ask them to solve simple problems using programs.  You could even
be nice by opening the editor first.  Don't help them, just watch them
crash and burn.  Then sit them in front of code that already works,
and ask them to modify it to do something slightly different, and
again just watch them crash and burn in all but the simplest of cases.
 It is painful - most of the time they just give up.  These same
people almost universally can describe the exact process of steps
verbally or in writing to do what is required without any trouble;
there might be some neglected edge cases, but if you describe the
failed result, often times they will realize their mistake and be able
to fix it quickly.
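To make the gap concrete, here is a hypothetical three-step task, stated the way a non-coder might describe it and then as the Python it becomes (the names and data are invented for illustration):

```python
# Steps, as a non-programmer might state them:
#   1. Take the list of names.
#   2. Keep the ones that start with "A".
#   3. Put them in alphabetical order.

names = ["Carol", "Alice", "Bob", "Anna"]          # sample data (hypothetical)
a_names = [n for n in names if n.startswith("A")]  # step 2: filter
result = sorted(a_names)                           # step 3: sort
print(result)                                      # ['Alice', 'Anna']
```

The English steps survive almost verbatim as comments, yet the jump from those steps to the list comprehension and `sorted()` call is exactly where most non-coders stall.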

Jeff Atwood had an article about programming sheep and non programming
goats, and while he views it as a statement about people's failings, I
view it as a statement about the failings of programming.  Human
computer interaction is really important, and the whole prefab GUI
concept doesn't scale along any axis; people need to be able to
interact with their computers in a flexible manner.  In the beginning,
we had no choice but to bend our communication to the machine, but
we're moving past that now.  The machine should respect the
communication of humans.  We shouldn't decry natural language because
of ambiguity; If we're ambiguous, the machine should let us know and
ask for clarification.  If we phrase things in a really awkward way,
the machine should tell us so, and suggest a more understandable
rendition (just like word processors do now).  If the machine does
something we didn't want based on our instructions, we should be able
to state additional requirements in a declarative manner.  Restricted
natural languages are an active area of current research, and they
clearly demonstrate that you can have an expressive formal language
that is also valid English.

Make no mistake about it, programming is a form of computer human
interaction (I figured that would be an accepted mantra here).  Think
of it as modeling knowledge and systems instead of pushing bits
around.  You are describing things to the computer.  To move from the
syntactic domain to my point about programming languages, imagine if
one person describes physics to a computer in French, and another
person describes chemistry to a computer in English.  The creators of
the computer made different languages act as disjoint knowledge
domains.  The computer is incapable of making any inferences in
physics which are informed by chemistry, and vice versa, unless
someone comes along and re-describes one of the disciplines in the
other language.  Worse still, if someone who only speaks Mandarin
comes along, the computer won't be able to tell him anything about
either domain.  Now imagine the creators of the computer decided that
an acceptable solution was to have people print out statements from
one domain in a given language, take it to another computer that scans
the printout, translates it to a different language, and prints out
the translated copy, then have that person take the translated copy
back to the original computer, and scan it again in order to ask a
cross-cutting question. I hope from that perspective the paucity of
our current methods will be more apparent.


Re: Python is readable

2012-03-30 Thread Chris Angelico
On Sat, Mar 31, 2012 at 5:15 AM, Nathan Rice
nathan.alexander.r...@gmail.com wrote:
 It is true that program complexity is correlated with problem
 complexity, language and environment complexity is undeniable.  If you
 want to prove this to yourself, find someone who is intelligent and
 has some basic level of computer literacy, sit them down at a computer
 and ask them to solve simple problems using programs.  You could even
 be nice by opening the editor first.  Don't help them, just watch them
 crash and burn.  Then sit them in front of code that already works,
 and ask them to modify it to do something slightly different, and
 again just watch them crash and burn in all but the simplest of cases.
  It is painful - most of the time they just give up.  These same
 people almost universally can describe the exact process of steps
 verbally or in writing to do what is required without any trouble;
 there might be some neglected edge cases, but if you describe the
 failed result, often times they will realize their mistake and be able
 to fix it quickly.

This is more a matter of being unable to express themselves
appropriately. If I allowed them to write an exact process of steps to
do what's required, those steps would either be grossly insufficient
for the task, or would BE pseudo-code. There are plenty of people who
cannot write those sorts of instructions at all. They're called
non-programmers. Anyone who doesn't code but can express a task in
such clear steps as you describe is what I would call a non-coding
programmer - and such people are VERY easily elevated to full
programmer status. I've worked with several like that, and the border
between that kind of clear, concise, simple instruction list and
actual Python or REXX code is so small as to be almost nonexistent.
It's not the programming languages' fault. It's a particular jump in
thinking that must be overcome before a person can use them.

There are other similar jumps in thinking. On which side of these
lines are you? Do you remember making the shift? Or, conversely, do
you stare at it with Huh? Why would I do that??

* Source control. Don't just keep daily backups - record specific
purposeful changes in a log.

* WYSIWYG document editing vs plain-text with a compiler. Pass up Open
Office (or worse) in favour of LaTeX, abandon Sibelius in favour of
Lilypond. Plays very nicely with source control.

* Unicode instead of head-in-the-sand pretending that ASCII is good enough.

* Open standards and toolchains instead of expecting monolithic
proprietary programs to do your work for you.

Etcetera, etcetera. Everyone who's made the jump will see the benefit
of the side they're on; most who haven't won't. Same with
non-programmers to programmers. Why should I write like that when I
could just write English? Simple: Because dedicated programming
languages are far more expressive for the job.

ChrisA


Re: Python is readable

2012-03-30 Thread Nathan Rice
 This is more a matter of being unable to express themselves
 appropriately. If I allowed them to write an exact process of steps to
 do what's required, those steps would either be grossly insufficient
 for the task, or would BE pseudo-code. There are plenty of people who
 cannot write those sorts of instructions at all. They're called
 non-programmers. Anyone who doesn't code but can express a task in
 such clear steps as you describe is what I would call a non-coding
 programmer - and such people are VERY easily elevated to full
 programmer status. I've worked with several like that, and the border
 between that kind of clear, concise, simple instruction list and
 actual Python or REXX code is so small as to be almost nonexistent.
 It's not the programming languages' fault. It's a particular jump in
 thinking that must be overcome before a person can use them.

Your statement that the difference between Python or REXX and
pseudo-code is almost nonexistent is completely false.  While people
reading Python might be able to guess with higher accuracy what a
program does than with some other programming languages, there is still
a VERY specific set of glyphs, words and phrase structures it
requires.

Pretty much anyone can follow a recipe to make a meal (and there are a
lot other examples of this), and conversely given the knowledge of how
to make some dish, pretty much everyone could describe the process as
a recipe.  The same person will fail miserably when trying to create
working code that is MUCH simpler from a conceptual standpoint.  Non
coders are not stupid, they just don't appreciate the multitude of
random distinctions and computer specific abstractions programming
foists on them for the purpose of writing EFFICIENT code.  I'm talking
about things like multiple int/number types, umpteen different kinds of
sequences, tons of different data structures that are used for
basically the same things under different circumstances, indices
starting at 0 (which makes amazing sense if you think like a machine,
and very little if you think like a human), the difference between
logical and bitwise operators (union and intersection would be better
names for the latter), string encoding, etc.  When you combine these
with having to communicate in a new (very arbitrary, sometimes
nonsensical) vocabulary that doesn't recognize synonyms, using an
extremely restricted phrase structure and an environment with very
limited interactivity, it should become clear that the people who
learn to program are invariably fascinated by computers and very
motivated to do so.
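As a sketch of a few of the machine-centric distinctions listed above, in Python (the examples are illustrative, not from the original post):

```python
# Multiple number types: ints and floats behave differently.
assert 7 // 2 == 3        # integer floor division
assert 7 / 2 == 3.5       # true division produces a float

# Indices start at 0, so the "first" element is element zero.
letters = ["a", "b", "c"]
assert letters[0] == "a"

# Logical vs bitwise operators: similar-looking, different meanings.
assert (True and False) is False   # logical AND
assert (6 & 3) == 2                # bitwise AND: 0b110 & 0b011 == 0b010
assert (6 | 3) == 7                # bitwise OR (set-union-like on bits)

# String encoding: text and bytes are distinct types.
text = "café"
data = text.encode("utf-8")        # 'é' becomes two bytes in UTF-8
assert len(text) == 4 and len(data) == 5
```

Each of these is second nature to a programmer and an arbitrary-looking trap to everyone else.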

I'm going to assume that you didn't mean that non coders are
incapable of instructing others (even though your statement implies it
directly).  I think the ability of non coders to describe procedures
would surprise you.  Things I hear over and over from non coders all
tie into people being frustrated that computers don't grasp
similarity, and don't try to figure out what they want at all; most
people provide instructions in an interactive manner.  The computer is
too stupid to interact with humans, so you have to tell it what to do,
then try to run it, watch it fail, then go back and modify your
program, which is highly unnatural.

I think you'd find that these non coders would do very well if given
the ability to provide instructions in a natural, interactive way.
They are not failing us, we are failing them.

 Etcetera, etcetera. Everyone who's made the jump will see the benefit
 of the side they're on; most who haven't won't. Same with
 non-programmers to programmers. Why should I write like that when I
 could just write English? Simple: Because dedicated programming
 languages are far more expressive for the job.

Really?  Or could it be that algorithms for natural language
processing that don't fail miserably are a very recent development,
restricted natural languages more recent still, and pretty much all
commonly used programming languages are all ~20+ years old?  Could it
also be that most programmers don't see a lot of incentives to make
things accessible, since they're already invested in the status quo,
and they might lose some personal value if programming stopped being
an arcane practice?

Creating a programming language is a time consuming and laborious
process, the fact that people are doing it constantly is a clear
indication that what we have is insufficient.


Re: Python is readable

2012-03-30 Thread Chris Angelico
On Sat, Mar 31, 2012 at 6:55 AM, Nathan Rice
nathan.alexander.r...@gmail.com wrote:
 I think you'd find that these non coders would do very well if given
 the ability to provide instructions in a natural, interactive way.
 They are not failing us, we are failing them.

The nearest thing to natural-language command of a computer is voice
navigation, which is another science that's plenty old and yet still
current (I first met it back in 1996 and it wasn't new then). You tell
the computer what you want it to do, and it does it. Theoretically.
The vocabulary's a lot smaller than all of English, of course, but
that's not a problem. The problem is that it's really REALLY slow to
try to get anything done in English, compared to a dedicated
domain-specific language (in the case of typical OS voice navigation,
the nearest equivalent would probably be a shell script).

 Really?  Or could it be that algorithms for natural language
 processing that don't fail miserably is a very recent development,
 restricted natural languages more recent still, and pretty much all
 commonly used programming languages are all ~20+ years old?  Could it
 also be that most programmers don't see a lot of incentives to make
 things accessible, since they're already invested in the status quo,
 and they might lose some personal value if programming stopped being
 an arcane practice?

Totally. That's why we're all still programming in assembly language
and doing our own memory management, because we would lose a lot of
personal value if programming stopped being so difficult. If it
weren't for all these silly new-fangled languages with their automatic
garbage collection and higher order function handling, we would all be
commanding much higher salaries.

ChrisA


Re: Python is readable

2012-03-30 Thread Neil Cerutti
On 2012-03-30, Nathan Rice nathan.alexander.r...@gmail.com wrote:
 Restricted natural languages are an active area of current
 research, and they clearly demonstrate that you can have an
 expressive formal language that is also valid English.

See, for example, Inform 7, which translates a subset of English
into Inform 6 code. I never thought too deeply about why I
disliked it, assuming it was because I already knew Inform 6.
Would you like to write the equivalent, e.g., C code in English?

-- 
Neil Cerutti


Re: Python is readable

2012-03-30 Thread Dan Sommers
On Sat, 31 Mar 2012 07:20:39 +1100
Chris Angelico ros...@gmail.com wrote:

 ... That's why we're all still programming in assembly language and
 doing our own memory management, because we would lose a lot of
 personal value if programming stopped being so difficult. If it
 weren't for all these silly new-fangled languages with their automatic
 garbage collection and higher order function handling, we would all be
 commanding much higher salaries.

Back in the 1970's, the magazines were full of ads for never write
another line of code again programs.  A keystroke here, a keystroke
there (those were the days *before* drag-and-drop and point-and-drool),
and even managers and executives could write programs.  Now, of
course, those managers and executives still command higher salaries, so
I guess ChrisA is right about us assembly language guys losing our
personal value.

Dan


Re: Python is readable

2012-03-30 Thread Nathan Rice
On Fri, Mar 30, 2012 at 4:20 PM, Chris Angelico ros...@gmail.com wrote:
 On Sat, Mar 31, 2012 at 6:55 AM, Nathan Rice
 nathan.alexander.r...@gmail.com wrote:
 I think you'd find that these non coders would do very well if given
 the ability to provide instructions in a natural, interactive way.
 They are not failing us, we are failing them.

 The nearest thing to natural-language command of a computer is voice
 navigation, which is another science that's plenty old and yet still
 current (I first met it back in 1996 and it wasn't new then). You tell
 the computer what you want it to do, and it does it. Theoretically.
 The vocabulary's a lot smaller than all of English, of course, but
 that's not a problem. The problem is that it's really REALLY slow to
 try to get anything done in English, compared to a dedicated
 domain-specific language (in the case of typical OS voice navigation,
 the nearest equivalent would probably be a shell script).

I'm sure a Ford truck would smoke a twin-engine Cessna if you compare
their speed on the ground.  Let the Cessna fly and the Ford doesn't
have a snowball's chance.

If you're navigating by going cee dee space slash somefolder slash
some other folder slash some third folder slash semicolon emacs
somename dash some other name dash something dot something else dot
one the analogy would be a boss telling his secretary to reserve him
a flight by saying visit site xyz, click on this heading, scroll
halfway down, open this menu, select this destination, ... instead of
book me a flight to San Jose on the afternoon of the 23rd, and don't
spend more than $500.

 Totally. That's why we're all still programming in assembly language
 and doing our own memory management, because we would lose a lot of
 personal value if programming stopped being so difficult. If it
 weren't for all these silly new-fangled languages with their automatic
 garbage collection and higher order function handling, we would all be
 commanding much higher salaries.

Did you miss the fact that a 50-year-old programming language (which
still closely resembles its original form) is basically tied for the
title of currently most popular, and that the three languages following
it are nominal and spiritual successors, with incremental improvements
in features but sharing a large portion of the design.  Programming
language designers purposefully try to make their language C-like,
because not being C-like disqualifies a language from consideration
for a HUGE portion of programmers, who cower at the naked feeling they
get imagining a world without curly braces.  Fear of change and the
unknown are brutal, and humans are cowardly creatures that will grasp
at whatever excuses they can find not to acknowledge their weaknesses.

I also mentioned previously, most developers are just trying to graft
shortcut after shortcut on to what is comfortable and familiar because
we're inherently lazy.

Additionally, I'm quite certain that when we finally do have a method
for programming/interacting with computers in a natural way, many
people invested in previous methods will make snarky comments about
how lame and stupid people using the new methods are, just like we saw
with command line/keyboard elitists who make fun of people who prefer
a mouse/gui, even though in most cases research showed that the people
using the mouse/gui actually got work done faster.  You can even look
at some comments on this thread for evidence of this.


Re: Python is readable

2012-03-30 Thread Chris Angelico
On Sat, Mar 31, 2012 at 7:58 AM, Nathan Rice
nathan.alexander.r...@gmail.com wrote:
 Programming
 language designers purposefully try to make their language C-like,
 because not being C-like disqualifies a language from consideration
 for a HUGE portion of programmers, who cower at the naked feeling they
 get imagining a world without curly braces.  Fear of change and the
 unknown are brutal, and humans are cowardly creatures that will grasp
 at whatever excuses they can find not to acknowledge their weaknesses.

Braces are clear delimiters. English doesn't have them, and suffers
for it. (Python's indentation is, too, but English doesn't have that
either.) It's a lot harder to mark the end of an if block in English
than in pretty much any programming language.
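In Python the delimiter is indentation: the dedent ends the block unambiguously, where English would need an awkward phrase like "end of the previous condition". A minimal sketch (`result` is just a stand-in to record what ran):

```python
result = []
x = 5
if x > 3:
    result.append("big")   # inside the if block
    x = 0                  # still inside
result.append("done")      # the dedent ended the block: always runs

assert result == ["big", "done"] and x == 0
```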

And be careful of what has to be given up to gain your conveniences.
I've used languages that demand variable declarations and ones that
don't, and I'm very much a fan of the former. There are many benefits
to being explicit about that.

ChrisA


Re: Python is readable

2012-03-30 Thread Nathan Rice
On Fri, Mar 30, 2012 at 5:45 PM, Chris Angelico ros...@gmail.com wrote:
 On Sat, Mar 31, 2012 at 7:58 AM, Nathan Rice
 nathan.alexander.r...@gmail.com wrote:
 Programming
 language designers purposefully try to make their language C-like,
 because not being C-like disqualifies a language from consideration
 for a HUGE portion of programmers, who cower at the naked feeling they
 get imagining a world without curly braces.  Fear of change and the
 unknown are brutal, and humans are cowardly creatures that will grasp
 at whatever excuses they can find not to acknowledge their weaknesses.

 Braces are clear delimiters. English doesn't have them, and suffers
 for it. (Python's indentation is, too, but English doesn't have that
 either.) It's a lot harder to mark the end of an if block in English
 than in pretty much any programming language.

It seems to me that indented blocks of text are used pretty frequently
to denote definition bodies, section-subordinate paragraphs and
asides.  The use of the colon seems pretty natural too.  Parentheses
are fairly natural for small asides.  The notion of character
delimiters for large sections of text is actually pretty unnatural,
with the exception of quotes.

 And be careful of what has to be given up to gain your conveniences.
 I've used languages that demand variable declarations and ones that
 don't, and I'm very much a fan of the former. There are many benefits
 to being explicit about that.

I don't like declarations, my personal preference is to have typed
signatures, and implicit declaration with type inference elsewhere.  I
view it as a matter of personal preference though, the result should
be the same, and it should be possible to view the code either way.
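A rough sketch of that preference using Python's optional annotations (a feature added well after this thread, so this is an approximation, with inference supplied by an external checker such as mypy rather than the language itself):

```python
# Typed signature; the body relies on inference (a checker like mypy
# infers the types of the locals from the annotated parameters).
def mean(values: list[float]) -> float:
    total = sum(values)        # inferred: float
    count = len(values)        # inferred: int
    return total / count

print(mean([1.0, 2.0, 3.0]))   # 2.0
```

The signature documents the contract for readers and tools, while the body stays free of declaration noise.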


Re: Python is readable

2012-03-30 Thread Terry Reedy

On 3/30/2012 6:47 AM, Steven D'Aprano wrote:


Spolsky has written at least three times about Architecture Astronauts,
and made it abundantly clear that the problem with them is that they
don't solve problems, they invent overarching abstractions that don't do
anything useful or important, and hype them everywhere.

http://www.joelonsoftware.com/articles/fog18.html
http://www.joelonsoftware.com/items/2005/10/21.html
http://www.joelonsoftware.com/items/2008/05/01.html

Jeff Atwood provides a simple test for the difference between a useful
abstraction and an Architecture Astronaut hyper-abstraction:

 Does it solve a useful problem?

http://www.codinghorror.com/blog/2004/12/it-came-from-planet-architecture.html


My strong impression is that theoretical abstract mathematicians also 
prefer that high-level abstractions solve some useful-to-mathematicians 
mathematical problem.


--
Terry Jan Reedy



Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-29 Thread Nathan Rice
On Wed, Mar 28, 2012 at 9:33 PM, Chris Angelico ros...@gmail.com wrote:
 On Thu, Mar 29, 2012 at 11:59 AM, Rodrick Brown rodrick.br...@gmail.com 
 wrote:
 The best skill any developer can have is the ability to pickup languages 
 very quickly and know what tools work well for which task.

 Definitely. Not just languages but all tools. The larger your toolkit
 and the better you know it, the more easily you'll be able to grasp
 the tool you need.

The thing that bothers me is that people spend time and mental energy
on a wide variety of syntax when the semantics are ~90% identical in
most cases (up to organization).

We would be better off if all the time that was spent on learning
syntax, memorizing library organization and becoming proficient with
new tools was spent learning the mathematics, logic and engineering
sciences.  Those solve problems, languages are just representations.

Unfortunately, programming languages seem to have become a way to
differentiate yourself and establish sub-cultural membership.  All the
cool kids are using XYZ, people who use LMN are dorks!  Who cares
about sharing or compatibility!

Human nature is depressingly self-defeating.


Re: Python is readable

2012-03-29 Thread Albert van der Horst
In article mailman.896.1332440814.3037.python-l...@python.org,
Nathan Rice  nathan.alexander.r...@gmail.com wrote:

 http://www.joelonsoftware.com/articles/fog18.html

 I read that article a long time ago, it was bullshit then, it is
 bullshit now.  The only thing he gets right is that the Shannon
 information of a uniquely specified program is proportional to the
 code that would be required to generate it.  Never mind that if a

Thank you for drawing my attention to that article.
It attacks the humbug software architects.
Are you one of them?
I really liked that article.

 program meets a specification, you shouldn't care about any of the
 values used for unspecified parts of the program.  If you care about
 the values, they should be specified.  So, if Joel had said that the
 program was uniquely specified, or that none of the things that
 weren't specified require values in the programming language, he might
 have been kinda, sorta right.  Of course, nobody cares enough to
 specify every last bit of minutiae in a program, and specifications
 change, so it is pretty much impossible to imagine either case ever
 actually occurring.

I wonder if you're not talking about a different article.

SNIP

Groetjes Albert

--
-- 
Albert van der Horst, UTRECHT,THE NETHERLANDS
Economic growth -- being exponential -- ultimately falters.
albert@spearc.xs4all.nl http://home.hccnet.nl/a.w.m.van.der.horst



Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-29 Thread Chris Angelico
On Fri, Mar 30, 2012 at 12:44 AM, Nathan Rice
nathan.alexander.r...@gmail.com wrote:
 We would be better off if all the time that was spent on learning
 syntax, memorizing library organization and becoming proficient with
 new tools was spent learning the mathematics, logic and engineering
 sciences.  Those solve problems, languages are just representations.

Different languages are good at different things. REXX is an efficient
text parser and command executor. Pike allows live updates of running
code. Python promotes rapid development and simplicity. PHP makes it
easy to add small amounts of scripting to otherwise-static HTML pages.
C gives you all the power of assembly language with all the
readability of... assembly language. SQL describes a database request.

You can't merge all of them without making a language that's
suboptimal at most of those tasks - probably, one that's woeful at all
of them. I mention SQL because, even if you were to unify all
programming languages, you'd still need other non-application
languages to get the job done.

Keep the diversity and let each language focus on what it's best at.

ChrisA
who has lots and lots of hammers, so every problem looks like... lots
and lots of nails.


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-29 Thread Devin Jeanpierre
On Thu, Mar 29, 2012 at 10:03 AM, Chris Angelico ros...@gmail.com wrote:
 You can't merge all of them without making a language that's
 suboptimal at most of those tasks - probably, one that's woeful at all
 of them. I mention SQL because, even if you were to unify all
 programming languages, you'd still need other non-application
 languages to get the job done.

Not really. You can turn SQL (or something equivalent) into a subset
of your programming language, like C# does with LINQ, or like Scheme
does with macros. The Scheme approach generalizes to programming
languages in general with even some fairly alien semantics (e.g. you
can do prolog using macros and first-class continuations). In fact,
for a more difficult target, I even recently saw an implementation of
Python in Common-Lisp that uses reader macros to compile a subset of
Python to equivalent Common-Lisp code:
http://common-lisp.net/project/clpython/

On the other hand, even similar languages are really hard to run in
the same VM: imagine the hoops you'd have to jump through to get
libraries written in Python 2 and 3 to work together. For a more
concrete example, take the attempt to make elisp and guile work
together in guilemacs:
http://www.red-bean.com/guile/notes/emacs-lisp.html

But this has nothing to do with being suboptimal at most tasks. It's
easy to make a language that can do everything C can do, and also
everything that Haskell can do. I can write an implementation of this
programming language in one line of bash[*]. The easy way is to make
those features mutually exclusive. We don't have to sacrifice anything
by including more features until we want them to work together.

With that in mind, the interesting languages to merge aren't things
like SQL or regular expressions -- these are so easy to make work with
programming languages, that we do it all the time already (via string
manipulation, but first-class syntax would also be easily possible).
The hard problems are when trying to merge in the semantics of
languages that only make sense because they have drastically
different expectations of the world. The example that comes to mind is
Haskell, which relies incredibly strongly on the lack of side effects.
How do you merge Haskell and Python?

Well, you can't. As soon as you add side-effects, you can no longer
rely on the weak equivalence of things executed eagerly versus lazily,
and the semantics of Haskell go kaput. So the only actual effort (that
I am aware of) to implement side-effects with Haskell *deliberately*
makes mutability and laziness mutually exclusive. Anything else is
impossible. The effort mentioned here is called Disciple, and the
relevant thesis is very fun reading, check it out:
http://www.cse.unsw.edu.au/~benl/papers/thesis/lippmeier-impure-world.pdf
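The eager/lazy equivalence that side effects destroy can be sketched in a few lines of Python (an illustration of the point, not anything from Disciple): the same pipeline, run eagerly versus lazily, makes its effects observable at different times.

```python
log = []

def noisy(x):
    log.append(x)      # a side effect
    return x * 2

eager = [noisy(x) for x in (1, 2, 3)]   # all effects happen now
assert log == [1, 2, 3]

log.clear()
lazy = (noisy(x) for x in (1, 2, 3))    # a generator: no effects yet
assert log == []                        # nothing has run
first = next(lazy)                      # effects happen on demand
assert log == [1] and first == 2
```

In a pure language the two forms are interchangeable; the moment `noisy` writes to `log`, they are observably different programs.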

I guess what I really want to say is that the world looks, to me, to
be more optimistic than so many people think it is. If we wanted to,
we could absolutely take the best features from a bunch of things.
This is what C++ does, this is what Scheme does, this is what D does.
They sometimes do it in different ways, they have varying tradeoffs,
but this isn't a hard problem except when it is, and the examples you
mentioned are actually the easy cases. We can merge Python and C,
while keeping roughly the power of both, it's called Cython. We can
merge Python and PHP, in that PHP adds nothing incompatible with
Python technically (it'd be a lot of work and there would be many
tears shed because it's insane) -- but Python Server Pages probably
add the feature you want. We could merge SQL and Python, arguably we
already do via e.g. SQLAlchemy's query API (etc.) or DBAPI2's string
API. These can all become subsets of a language that interoperate
well with the rest of the language with no problems. These are
non-issues: the reasons for not doing so are not technical, they are
political or sociological (e.g., "bloat the language", "there should
be one obvious way to do it", "PHP's mixing of business logic with
presentation logic is bad", etc.)

There _are_ times when this is technical, and there are specific areas
of this that have technical difficulties, but... that's different, and
interesting, and being actively researched, and not really impossible
either.

I don't know. This is maybe a bit too rant-y and disorganized; if so I
apologize. I've been rethinking a lot of my views on programming
languages lately. :)  I hope at least the links help make this
interesting to someone.

-- Devin

[*] A language is really just a set of programs that compile. If we
assume that the set of haskell and C programs are disjoint, then we
can create a new language that combines both of them, by trying the C
(or Haskell) compiler first, and then running the other if that should
fail. This is really an argument from the absurd, though. I just said
it 'cause it sounds awesome.
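The footnote's trick can be mimicked in a few lines of Python, with two toy "compilers" standing in for the real C and Haskell ones (both stand-ins are assumptions made up for this sketch):

```python
def compiles_as_c(src):
    """Toy stand-in for the C compiler: accepts only all-digit programs."""
    return src.isdigit()

def compiles_as_haskell(src):
    """Toy stand-in for the Haskell compiler: accepts only alphabetic programs."""
    return src.isalpha()

def union_language(src):
    """The 'merged' language: try one compiler first, fall back to the other."""
    return compiles_as_c(src) or compiles_as_haskell(src)

print(union_language("123"), union_language("abc"), union_language("a1"))
# True True False
```

As the footnote says, this is an argument from the absurd: the union accepts every program either language accepts, but the two halves never interact.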
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-29 Thread Nathan Rice
On Thu, Mar 29, 2012 at 10:03 AM, Chris Angelico ros...@gmail.com wrote:
 On Fri, Mar 30, 2012 at 12:44 AM, Nathan Rice
 nathan.alexander.r...@gmail.com wrote:
 We would be better off if all the time that was spent on learning
 syntax, memorizing library organization and becoming proficient with
 new tools was spent learning the mathematics, logic and engineering
 sciences.  Those solve problems, languages are just representations.

 Different languages are good at different things. REXX is an efficient
 text parser and command executor. Pike allows live updates of running
 code. Python promotes rapid development and simplicity. PHP makes it
 easy to add small amounts of scripting to otherwise-static HTML pages.
 C gives you all the power of assembly language with all the
 readability of... assembly language. SQL describes a database request.

Here's a thought experiment.  Imagine that you have a project tree on
your file system which includes files written in many different
programming languages.  Imagine that the files can be assumed to be
contiguous for our purposes, so you could view all the files in the
project as one long chunk of data.  The directory and file names could
be interpreted as statements in this data, analogous to "in the
context of somedirectory" or "in the context of somefile with
sometype".  Any project configuration files could be viewed as
declarative statements about contexts, such as "in xyz context, ignore
those" or "in abc context, any that is actually a this".  Imagine the
compiler or interpreter is actually part of your program (which is
reasonable since it doesn't do anything by itself).  Imagine the build
management tool is also part of your program in pretty much the same
manner.  Imagine that your program actually generates another program
that will generate the program the machine runs.  I hope you can
follow me here, and further I hope you can see that this is a
completely valid description of what is actually going on (from a
different perspective).

In the context of the above thought experiment, it should be clear
that we currently have something that is a structural analog of a
single programming metalanguage (or rather, one per computer
architecture), with many domain specific languages constructed above
that to simplify tasks in various contexts.  The model I previously
proposed is not fantasy, it exists, just not in a form usable by human
beings.  Are machine instructions the richest possible metalanguage?
I really doubt it.

Let's try another thought experiment... Imagine that instead of having
machine instructions as the common metalanguage, we pushed the point
of abstraction closer to something programmers can reasonably work
with: abstract syntax trees.  Imagine all programming languages share
a common abstract syntax tree format, with nodes generated using a
small set of human intelligible semantic primes.  Then, a domain
specific language is basically a context with a set of logical
implications.  By associating a branch of the tree to one (or the
union of several) context, you provide a transformation path to
machine instructions via logical implication.  If implications of a
union context for the nodes in the branch are not compatible, this
manifests elegantly in the form of a logical contradiction.
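Python's own ast module gives a concrete feel for the kind of tree being proposed here, though it is of course a per-language format rather than the shared one the thought experiment imagines:

```python
import ast

# Parse a tiny statement and inspect the resulting abstract syntax tree.
tree = ast.parse("x = 1 + 2")
print(ast.dump(tree.body[0]))
# An Assign node whose value is a BinOp(Add) over two Constant nodes.
```

A shared metalanguage of the kind described would amount to agreeing on one such node vocabulary (the "semantic primes") across all front-end syntaxes.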

What does pushing the abstraction point that far up provide?  For one,
you can now reason across language boundaries.  A compiler can tell me
if my prolog code and my python code will behave properly together.
Another benefit is that you make explicit the fact that your parser,
interpreter, build tools, etc are actually part of your program, from
the perspective that your program is actually another program that
generates programs in machine instructions.  By unifying your build
chain, it makes deductive inference spanning steps and tools possible,
and eliminates some needless repetition.  This also greatly simplifies
code reuse, since you only need to generate a syntax tree of the
proper format and associate the correct context to it.  It also
simplifies learning languages, since people only need to understand
the semantic primes in order to read anything.

Of course, this describes Lisp to some degree, so I still need to
provide some answers.  What is wrong with Lisp?  I would say that the
base syntax being horrible is probably the biggest issue.  Beyond
that, transformations on lists of data are natural in Lisp, but graph
transformations are not, making some things awkward.  Additionally,
because Lisp tries to nudge you towards programming in a functional
style, it can be un-intuitive to learn.  Programming is knowledge
representation, and state is a natural concept that many people desire
to model, so making it a second class citizen is a mistake.  If I were
to re-imagine Lisp for this purpose, I would embrace state and an
explicit notion of temporal order.  Rather than pretending it didn't
exist, I would focus on logical and mathematical machinery necessary
to 

Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-29 Thread Tim Chase

On 03/29/12 12:48, Nathan Rice wrote:

Of course, this describes Lisp to some degree, so I still need to
provide some answers.  What is wrong with Lisp?  I would say that the
base syntax being horrible is probably the biggest issue.


Do you mean something like:

((so (describes Lisp (to degree some) (of course)) still-need 
(provide I some-answers)) (is wrong what (with Lisp)) (would-say 
I ((is (base-syntax being-horrible) (probably-biggest issue)


nah...can't fathom what's wrong with that...

«grins, ducks, and runs»

-tkc





Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-29 Thread Devin Jeanpierre
Agreed with your entire first chunk 100%. Woohoo! High five. :)

On Thu, Mar 29, 2012 at 1:48 PM, Nathan Rice
nathan.alexander.r...@gmail.com wrote:
 transformations on lists of data are natural in Lisp, but graph
 transformations are not, making some things awkward.

Eh, earlier you make some argument towards lisp being a universal
metalanguage. If it can simulate prolog, it can certainly grow a graph
manipulation form. You'd just need to code it up as a macro or
function :p

 Additionally,
 because Lisp tries to nudge you towards programming in a functional
 style, it can be un-intuitive to learn.

I think you're thinking of Scheme here. Common Lisp isn't any more
functional than Python, AFAIK (other than having syntactic heritage
from the lambda calculus?)

Common-Lisp does very much embrace state as you later describe, Scheme
much less so (in that it makes mutating operations more obvious and
more ugly. Many schemes even outlaw some entirely. And quoted lists
default to immutable (rgh)).

 I'm all for diversity of language at the level of minor notation and
 vocabulary, but to draw an analogy to the real world, English and
 Mandarin are redundant, and the fact that they both exist creates a
 communication barrier for BILLIONS of people.  That doesn't mean that
 biologists shouldn't be able to define words to describe biological
 things, if you want to talk about biology you just need to learn the
 vocabulary.  That also doesn't mean that mathematicians shouldn't
 be able to use notation to structure complex statements, if you want
 to do math you need to man up and learn the notation (of course, I
 have issues with some mathematical notation, but there is no reason
 you should cry about things like set builder).

Well, what sort of language differences make for English vs Mandarin?
Relational algebraic-style programming is useful, but definitely a
large language barrier to people that don't know any SQL. I think this
is reasonable. (It would not matter even if you gave SQL python-like
syntax, the mode of thinking is different, and for a good reason.)

-- Devin


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-29 Thread Nathan Rice
On Thu, Mar 29, 2012 at 2:53 PM, Devin Jeanpierre
jeanpierr...@gmail.com wrote:
 Agreed with your entire first chunk 100%. Woohoo! High five. :)

Damn, then I'm not trolling hard enough ಠ_ಠ

 On Thu, Mar 29, 2012 at 1:48 PM, Nathan Rice
 nathan.alexander.r...@gmail.com wrote:
 transformations on lists of data are natural in Lisp, but graph
 transformations are not, making some things awkward.

 Eh, earlier you make some argument towards lisp being a universal
 metalanguage. If it can simulate prolog, it can certainly grow a graph
 manipulation form. You'd just need to code it up as a macro or
 function :p

Well, a lisp-like language.  I would also argue that if you are using
macros to do anything, the thing you are trying to do should classify
as not natural in lisp :)

I'm really thinking here more in terms of a general graph reactive
system here, matching patterns in an input graph and modifying the
graph in response.  There are a lot of systems that can be modeled as
a graph that don't admit a nested list (tree) description.  By having
references to outside the nesting structure you've just admitted that
you need a graph rather than a list, so why not be honest about it and
work in that context from the get-go.

 Additionally,
 because Lisp tries to nudge you towards programming in a functional
 style, it can be un-intuitive to learn.

 I think you're thinking of Scheme here. Common Lisp isn't any more
 functional than Python, AFAIK (other than having syntactic heritage
 from the lambda calculus?)

 Common-Lisp does very much embrace state as you later describe, Scheme
 much less so (in that it makes mutating operations more obvious and
 more ugly. Many schemes even outlaw some entirely. And quoted lists
 default to immutable (rgh)).

I find it interesting that John McCarthy invented both Lisp and the
situation calculus.

As for set/setq, sure, you can play with state, but it is verbose, and
there is no inherent notion of temporal locality.  Your program's
execution order forms a nice lattice when run on hardware; that should
be explicit in software.  If I were to do something crazy like take
the union of two processes that can potentially interact, with an
equivalence relation between some time t1 in the first process and a
time t2 in the second (so that you can derive a single partial order),
the computer should be able to tell if I am going to shoot myself in
the foot, and ideally suggest the correct course of action.

 Well, what sort of language differences make for English vs Mandarin?
 Relational algebraic-style programming is useful, but definitely a
 large language barrier to people that don't know any SQL. I think this
 is reasonable. (It would not matter even if you gave SQL python-like
 syntax, the mode of thinking is different, and for a good reason.)

I don't think they have to be.  You can view functions as names for
temporally ordered sequence of declarative implication statements.
Databases just leave out the logic (this is hyperbole, I know), so you
have to do it in client code.  I don't feel that a database
necessarily has to be a separate entity, that is just an artifact of
the localized, specialized view of computation.  As stronger
abstractions are developed and concurrent, distributed computation is
rigorously systematized, I think we'll go full circle.


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-29 Thread Chris Angelico
On Fri, Mar 30, 2012 at 3:42 AM, Devin Jeanpierre
jeanpierr...@gmail.com wrote:
 On Thu, Mar 29, 2012 at 10:03 AM, Chris Angelico ros...@gmail.com wrote:
 You can't merge all of them without making a language that's
 suboptimal at most of those tasks - probably, one that's woeful at all
 of them. I mention SQL because, even if you were to unify all
 programming languages, you'd still need other non-application
 languages to get the job done.
 ...
 But this has nothing to do with being suboptimal at most tasks. It's
 easy to make a language that can do everything C can do, and also
 everything that Haskell can do. I can write an implementation of this
 programming language in one line of bash[*]. The easy way is to make
 those features mutually exclusive. We don't have to sacrifice anything
 by including more features until we want them to work together.

Of course it's POSSIBLE. You can write everything in Ook if you want
to. But any attempt to merge all programming languages into one will
either:

1) Allow different parts of a program to be written in different
subsets of this universal language, which just means that you've
renamed all the languages but kept their distinctions (so a programmer
still has to learn all of them); or

2) Shoehorn every task into one language, equivalent to knowing only
one language and using that for everything. Good luck with that.

The debate keeps on coming up, but it's not just political decisions
that maintain language diversity.

ChrisA


Re: Python is readable

2012-03-29 Thread Nathan Rice
On Thu, Mar 29, 2012 at 9:44 AM, Albert van der Horst
alb...@spenarnc.xs4all.nl wrote:
 In article mailman.896.1332440814.3037.python-l...@python.org,
 Nathan Rice  nathan.alexander.r...@gmail.com wrote:

 http://www.joelonsoftware.com/articles/fog18.html

I read that article a long time ago, it was bullshit then, it is
bullshit now.  The only thing he gets right is that the Shannon
information of a uniquely specified program is proportional to the
code that would be required to generate it.  Never mind that if a

 Thank you for drawing my attention to that article.
 It attacks the humbug software architects.
 Are you one of them?
 I really liked that article.

I read the first paragraph, remembered that I had read it previously
and stopped.  I accidentally remembered something from another Joel
article as being part of that article (read it at
http://www.joelonsoftware.com/items/2007/12/03.html).  I don't really
have anything to say on Joel's opinions about why people can or should
code; they're his and he is entitled to them.  I feel they are overly
reductionist (this isn't a black/white thing) and have a bit of
luddite character to them.  I will bet you everything I own that the only
reason Joel is alive today is because of some mathematical abstraction he
would be all too happy to discount as meaningless (because, to him, it
is).  Of course, I will give Joel one point: too many things related
to programming are 100% hype, without any real substance; if his
article had been about bullshit software hype and he hadn't fired the
broadsides at the very notion of abstraction, I wouldn't have anything
to say.

Anyhow, if you want the "ugh rock good caveman smash gazelle put in
mouth make stomach pain go away" version, here it is:  Programs are knowledge.
The reverse is not true, because programming is an infantile area of
human creation, mere feet from the primordial tide pool from whence it
spawned.  We have a very good example of what a close to optimal
outcome is: human beings - programs that write themselves, all
knowledge forming programs, strong general artificial intelligence.
When all knowledge is also programs, we will have successfully freed
ourselves from necessary intellectual drudgery (the unnecessary kind
will still exist).  We will be able to tell computers what we want on
our terms, and they will go and do it, checking in with us from time
to time if they aren't sure what we really meant in the given context.
 If we have developed advanced robotics, we will simultaneously be
freed from most manual labor.  The only thing left for Joel to do will
be to lounge about, being creative while eating mangos that were
picked, packed, shipped and unloaded by robots, ordered by his
computer assistant because it knows that he likes them, then
delivered, prepared and served by more robots.

The roadblocks in the path include the ability to deal with
uncertainty, understand natural languages and the higher order
characteristics of information.  Baby steps to deal with these
roadblocks are to explicitly forbid uncertainty, simplify the language
used, and explicitly state higher order properties of information.
The natural evolution of the process is to find ways to deal with
ambiguity, correctly parse more complex language and automatically
deduce higher order characteristics of information.

Clearly, human intelligence demonstrates that this is not an
impossible pipe dream.  You may not be interested in working towards
making this a reality, but I can pretty much guarantee on the scale of
human achievement, it is near the top.


RE: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-29 Thread Prasad, Ramit
  You can't merge all of them without making a language that's
  suboptimal at most of those tasks - probably, one that's woeful at all
  of them. I mention SQL because, even if you were to unify all
  programming languages, you'd still need other non-application
  languages to get the job done.
  ...
  But this has nothing to do with being suboptimal at most tasks. It's
  easy to make a language that can do everything C can do, and also
  everything that Haskell can do. I can write an implementation of this
  programming language in one line of bash[*]. The easy way is to make
  those features mutually exclusive. We don't have to sacrifice anything
  by including more features until we want them to work together.
 
 Of course it's POSSIBLE. You can write everything in Ook if you want
 to. But any attempt to merge all programming languages into one will
 either:
 
 1) Allow different parts of a program to be written in different
 subsets of this universal language, which just means that you've
 renamed all the languages but kept their distinctions (so a programmer
 still has to learn all of them); or
 
 2) Shoehorn every task into one language, equivalent to knowing only
 one language and using that for everything. Good luck with that.

In a much simpler context, isn't this what .NET's CLR does? Except
that, instead of converting each language into the others, it converts
everything into a single common language. I have trouble seeing
how what you suggest would not end up with badly coded versions of a
translated program. I have never yet seen a program that could convert
from one paradigm/language directly to another (and do it well/maintainably).


 The debate keeps on coming up, but it's not just political decisions
 that maintain language diversity.

Not a bad thing in my opinion.  A tool for each problem, but I can 
see the appeal of a multi-tool language. 


Ramit


Ramit Prasad | JPMorgan Chase Investment Bank | Currencies Technology
712 Main Street | Houston, TX 77002
work phone: 713 - 216 - 5423



Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-29 Thread Devin Jeanpierre
On Thu, Mar 29, 2012 at 4:33 PM, Chris Angelico ros...@gmail.com wrote:
 Of course it's POSSIBLE. You can write everything in Ook if you want
 to. But any attempt to merge all programming languages into one will
 either:

In that particular quote, I was saying that the reason that you
claimed we can't merge languages was not a correct reason. You are now
moving the goalposts, in that you've decided to abandon your original
point. Also you are now discussing the merger of all programming
languages, whereas I meant to talk about pairs of programming
languages. e.g. such as SQL and Python.

Merging all programming languages is ridiculous. Even merging two,
Haskell and C, is impossible without running into massive
world-bending problems. (Yes, these problems are interesting, but no,
they can't be solved without running into your issue 1 -- this is in
fact a proven theorem.)

 1) Allow different parts of a program to be written in different
 subsets of this universal language, which just means that you've
 renamed all the languages but kept their distinctions (so a programmer
 still has to learn all of them); or

Yes. I mentioned this. It is not entirely useless (if you're going to
use the other language _anyway_, like SQL or regexps, might as well
have it be checked at compile-time same as your outer code), but in a
broad sense it's a terrible idea.

Also, programmers would have to learn things regardless. You can't
avoid this, that's what happens when you add features. The goal in
integrating two languages is, well, integration, not reducing
learning.

 2) Shoehorn every task into one language, equivalent to knowing only
 one language and using that for everything. Good luck with that.

This isn't true for the merge just two languages case, which is what
I meant to talk about.

 The debate keeps on coming up, but it's not just political decisions
 that maintain language diversity.

Are you disagreeing with me, or somebody else? I never said that.

Yes, I said that in some cases, e.g. SQL/Python, because there are no
technical issues, it must be something political or stylistic. I
wasn't saying that the only reason we don't merge languages in is
political. As a matter of fact, the very next paragraph begins with
There _are_ times when this is technical.

("Political" is a bad word for it, because it covers things that are
just plain bad ideas (but, subjectively). For example, there's nothing
technically challenging about adding an operator that wipes the user's
home directory.)

-- Devin


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-29 Thread Devin Jeanpierre
On Thu, Mar 29, 2012 at 3:50 PM, Nathan Rice
nathan.alexander.r...@gmail.com wrote:
 Well, a lisp-like language.  I would also argue that if you are using
 macros to do anything, the thing you are trying to do should classify
 as not natural in lisp :)

You would run into disagreement. Some people feel that the lisp
philosophy is precisely that of extending the language to do anything
you want, in the most natural way.

At least, I disagree, but my lisp thoughts are the result of
indoctrination of the Racket crowd. I don't know how well they
represent the larger lisp community. But you should definitely take
what I say from the viewpoint of the sort of person that believes that
the whole purpose of lisps is to embed new syntax and new DSLs via
macros. Without macros, there's no point of having this despicable
syntax (barring maybe pedagogy and other minor issues).

 I'm really thinking here more in terms of a general graph reactive
 system here, matching patterns in an input graph and modifying the
 graph in response.  There are a lot of systems that can be modeled as
 a graph that don't admit a nested list (tree) description.  By having
 references to outside the nesting structure you've just admitted that
 you need a graph rather than a list, so why not be honest about it and
 work in that context from the get-go.

I don't see any issue in defining a library for working with graphs.
If it's useful enough, it could be added to the standard library.
There's nothing all that weird about it.

Also, most representations of graphs are precisely via a tree-like
non-recursive structure. For example, as a matrix, or adjacency list,
etc. We think of them as deep structures, but implement them as flat,
shallow structures. Specialized syntax (e.g. from macros) can
definitely bridge the gap and let you manipulate them in the obvious
way, while admitting the usual implementation.
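A minimal illustration of that point: a directed graph, cycles and all, stored as a flat dict-of-lists adjacency mapping, with no recursive nesting anywhere (the node names are made up):

```python
# A directed graph with a cycle (a -> b -> c -> a), stored as a flat
# adjacency mapping: the "deep" structure lives in the keys, not in nesting.
graph = {"a": ["b"], "b": ["c"], "c": ["a", "d"], "d": []}

def reachable(g, start):
    """Collect every node reachable from start (iterative depth-first search)."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(g[node])
    return seen

print(sorted(reachable(graph, "a")))  # ['a', 'b', 'c', 'd']
```

The implementation is shallow and list-like; only the access patterns treat it as a graph, which is the gap that specialized syntax or a library would paper over.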

 I don't think they have to be.  You can view functions as names for
 temporally ordered sequence of declarative implication statements.
 Databases just leave out the logic (this is hyperbole, I know), so you
 have to do it in client code.  I don't feel that a database
 necessarily has to be a separate entity, that is just an artifact of
 the localized, specialized view of computation.  As stronger
 abstractions are developed and concurrent, distributed computation is
 rigorously systematized, I think we'll go full circle.

Maybe I'm too tired, but this went straight over my head, sorry.
Perhaps you could be a bit more explicit about what you mean by the
implications/logic?

-- Devin


Re: Python is readable

2012-03-29 Thread Steven D'Aprano
On Thu, 29 Mar 2012 14:37:09 -0400, Nathan Rice wrote:

 On Thu, Mar 29, 2012 at 9:44 AM, Albert van der Horst
 alb...@spenarnc.xs4all.nl wrote:
 In article mailman.896.1332440814.3037.python-l...@python.org, Nathan
 Rice  nathan.alexander.r...@gmail.com wrote:

 http://www.joelonsoftware.com/articles/fog18.html

 Of course, I will give Joel one point: too many things related to
 programming are 100% hype, without any real substance; if his article
 had been about bullshit software hype and he hadn't fired the broadsides
 at the very notion of abstraction

He did no such thing. I challenge you to find me one place where Joel has 
*ever* claimed that the very notion of abstraction is meaningless or 
without use.



-- 
Steven


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-29 Thread Nathan Rice
On Thu, Mar 29, 2012 at 7:37 PM, Devin Jeanpierre
jeanpierr...@gmail.com wrote:
 On Thu, Mar 29, 2012 at 3:50 PM, Nathan Rice
 nathan.alexander.r...@gmail.com wrote:
 Well, a lisp-like language.  I would also argue that if you are using
 macros to do anything, the thing you are trying to do should classify
 as not natural in lisp :)

 You would run into disagreement. Some people feel that the lisp
 philosophy is precisely that of extending the language to do anything
 you want, in the most natural way.

That is some people's lisp philosophy, though I wouldn't say that is a
universal.  Just like I might say my take on python's philosophy is
keep it simple, stupid but others could disagree.

 At least, I disagree, but my lisp thoughts are the result of
 indoctrination of the Racket crowd. I don't know how well they
 represent the larger lisp community. But you should definitely take
 what I say from the viewpoint of the sort of person that believes that
 the whole purpose of lisps is to embed new syntax and new DSLs via
 macros. Without macros, there's no point of having this despicable
 syntax (barring maybe pedagogy and other minor issues).

Heh, I think you can have a homoiconic language without nasty syntax,
but I won't get into that right now.

 I'm really thinking here more in terms of a general graph reactive
 system here, matching patterns in an input graph and modifying the
 graph in response.  There are a lot of systems that can be modeled as
 a graph that don't admit a nested list (tree) description.  By having
 references to outside the nesting structure you've just admitted that
 you need a graph rather than a list, so why not be honest about it and
 work in that context from the get-go.

 I don't see any issue in defining a library for working with graphs.
 If it's useful enough, it could be added to the standard library.
 There's nothing all that weird about it.

Graphs are the more general and expressive data structure, I think if
anything you should special case the less general form.

 Also, most representations of graphs are precisely via a tree-like
 non-recursive structure. For example, as a matrix, or adjacency list,
 etc. We think of them as deep structures, but implement them as flat,
 shallow structures. Specialized syntax (e.g. from macros) can
 definitely bridge the gap and let you manipulate them in the obvious
 way, while admitting the usual implementation.

We do a lot of things because they are efficient.  That is why
gaussian distributions are everywhere in statistics, people
approximate nonlinear functions with sums of kernels, etc.  It
shouldn't be the end goal though, unless it really is the most
expressive way of dealing with things.  My personal opinion is that
graphs are more expressive, and I think it would be a good idea to
move towards modeling knowledge and systems with graphical structures.

 I don't think they have to be.  You can view functions as names for
 temporally ordered sequence of declarative implication statements.
 Databases just leave out the logic (this is hyperbole, I know), so you
 have to do it in client code.  I don't feel that a database
 necessarily has to be a separate entity, that is just an artifact of
 the localized, specialized view of computation.  As stronger
 abstractions are developed and concurrent, distributed computation is
 rigorously systematized, I think we'll go full circle.

 Maybe I'm too tired, but this went straight over my head, sorry.
 Perhaps you could be a bit more explicit about what you mean by the
 implications/logic?

Well, the Curry-Howard correspondence says that every function can be
seen as a named implication of outputs given inputs, with the code for
that function being a representation of its proof.  Since pretty much
every function is a composition of many smaller functions, this holds
down to the lowest level.  Even imperative statements can be viewed as
functions in this light, if you assume discrete time, and view every
function or statement as taking the state of the world at T as an
implicit input and returning as an implicit output the state of the
world at T+1.  Thus, every function (and indeed pretty much all code)
can be viewed as a named collection of implication statements in a
particular context :)
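A toy Python rendering of that view (the "world" dict and the statement chosen are invented for illustration): an imperative statement like x = x + 1 becomes a pure function from the world at T to the world at T+1, and sequencing statements becomes function composition.

```python
# Hypothetical sketch: the imperative statement "x = x + 1" viewed as a
# pure function taking the world at time T and returning the world at T+1.
def increment_x(world):
    new_world = dict(world)          # the world at T is left untouched
    new_world["x"] = world["x"] + 1  # the statement's single "effect"
    return new_world                 # the world at T+1

def then(f, g):
    """Sequencing two statements is just function composition."""
    return lambda world: g(f(world))

program = then(increment_x, increment_x)  # two statements in sequence
print(program({"x": 0}))  # → {'x': 2}
```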
-- 
http://mail.python.org/mailman/listinfo/python-list


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-29 Thread Steven D'Aprano
On Thu, 29 Mar 2012 13:48:40 -0400, Nathan Rice wrote:

 Here's a thought experiment.  Imagine that you have a project tree on
 your file system which includes files written in many different
 programming languages.  Imagine that the files can be assumed to be
 contiguous for our purposes, so you could view all the files in the
 project as one long chunk of data.  The directory and file names could
 be interpreted as statements in this data, analogous to in the context
 of somedirectory or in the context of somefile with sometype.  Any
 project configuration files could be viewed as declarative statements
 about contexts, such as in xyz context, ignore those or in abc
 context, any that is actually a this.  Imagine the compiler or
 interpreter is actually part of your program (which is reasonable since
 it doesn't do anything by itself).  Imagine the build management tool is
 also part of your program in pretty much the same manner.  Imagine that
 your program actually generates another program that will generate the
 program the machine runs.  I hope you can follow me here, and further I
 hope you can see that this is a completely valid description of what is
 actually going on (from a different perspective).
[...]
 What does pushing the abstraction point that far up provide?

I see why you are so hostile towards Joel Spolsky's criticism of 
Architecture Astronauts: you are one of them. Sorry Nathan, I don't know 
how you breathe that high up.

For what it's worth, your image of "everything from the compiler on up is
part of your program" describes both Forth and HyperCard to some degree,
both of which I have used and like very much. I still think you're
sucking vacuum :(



-- 
Steven


Re: Python is readable

2012-03-29 Thread Nathan Rice
 He did no such thing. I challenge you to find me one place where Joel has
 *ever* claimed that the very notion of abstraction is meaningless or
 without use.

When great thinkers think about problems, they start to see patterns.
They look at the problem of people sending each other word-processor
files, and then they look at the problem of people sending each other
spreadsheets, and they realize that there's a general pattern: sending
files. That's one level of abstraction already. Then they go up one
more level: people send files, but web browsers also send requests
for web pages. And when you think about it, calling a method on an
object is like sending a message to an object! It's the same thing
again! Those are all sending operations, so our clever thinker invents
a new, higher, broader abstraction called messaging, but now it's
getting really vague and nobody really knows what they're talking
about any more. Blah.

When you go too far up, abstraction-wise, you run out of oxygen.
Sometimes smart thinkers just don't know when to stop, and they create
these absurd, all-encompassing, high-level pictures of the universe
that are all good and fine, but don't actually mean anything at all.

To me, this directly indicates he views higher-order abstractions
skeptically, and assumes that because he does not see meaning in them,
they don't hold any meaning.  Despite Joel's beliefs, new advances in
science are in many ways the result of advances in mathematics brought
on by very deep abstraction.  Just as an example, Von Neumann's
treatment of quantum mechanics with linear operators in Hilbert spaces
utilizes very abstract mathematics, and without it we wouldn't have
modern electronics.

I'm 100% behind ranting on software hype.  Myopically bashing the type
of thinking that resulted in the computer the basher is writing on,
not so much.  If he had said "if you're getting very high up, find
very smart people and talk to them to make sure you're not in wing-nut
territory" I could have given him a pass.

I really wish people wouldn't try to put Joel up on a pedestal.  The
majority of his writings either seem like sensationalist spins on
tautological statements, self-aggrandizement, or Luddite trolling.  At
least Stephen Wolfram has cool shit to back up his ego, Fog Creek
makes decent but overpriced debuggers/version control/issue
trackers... From my perspective, Stack Overflow is the first really
interesting thing Joel had his hand in, and I suspect Jeff Atwood was
probably the reason for it, since SO doesn't look like anything Fog
Creek ever produced prior to that.


Re: Python is readable

2012-03-29 Thread Steven D'Aprano
On Thu, 29 Mar 2012 22:26:38 -0400, Nathan Rice wrote:

 He did no such thing. I challenge you to find me one place where Joel
 has *ever* claimed that the very notion of abstraction is meaningless
 or without use.
[snip quote]
 To me, this directly indicates he views higher order abstractions
 skeptically,

Yes he does, and so we all should, but that's not the claim you made. You 
stated that he fired the broadsides at the very notion of abstraction. 
He did no such thing. He fired a broadside at (1) software hype based on 
(2) hyper-abstractions which either don't solve any problems that people 
care about, or don't solve them any better than more concrete solutions.



 and assumes because he does not see meaning in them, they
 don't hold any meaning.  

You are making assumptions about his mindset that not only aren't 
justified by his comments, but are *contradicted* by his comments. He 
repeatedly describes the people coming up with these hyper-abstractions
as "great thinkers", "clever thinkers", etc. who are seeing patterns in
what people do. He's not saying that they're dummies. He's saying that 
they're seeing patterns that don't mean anything, not that the patterns 
aren't there.


 Despite Joel's beliefs, new advances in science
 are in many ways the result of advances in mathematics brought on by
 very deep abstraction.  Just as an example, Von Neumann's treatment of
 quantum mechanics with linear operators in Hilbert spaces utilizes very
 abstract mathematics, and without it we wouldn't have modern
 electronics.

I doubt that very much. The first patent for the transistor was made in 
1925, a year before von Neumann even *started* working on quantum 
mechanics.

In general, theory *follows* practice, not the other way around: parts of 
quantum mechanics theory followed discoveries made using the transistor:

http://en.wikipedia.org/wiki/History_of_the_transistor


The Romans had perfectly functioning concrete without any abstract 
understanding of chemistry. If we didn't have QM, we'd still have 
advanced electronics. Perhaps not *exactly* the electronics we have now, 
but we'd have something. We just wouldn't understand *why* it works, and 
so be less capable of *predicting* useful approaches and more dependent 
on trial-and-error. Medicine and pharmaceuticals continue to be 
discovered even when we can't predict the properties of molecules.

My aunt makes the best damn lasagna you've ever tasted without any 
overarching abstract theory of human taste. And if you think that quantum 
mechanics is more difficult than understanding human perceptions of 
taste, you are badly mistaken.

In any case, Spolsky is not making a general attack on abstract science. 
Your hyperbole is completely unjustified.



-- 
Steven


Re: Python is readable

2012-03-29 Thread Nathan Rice
 He did no such thing. I challenge you to find me one place where Joel
 has *ever* claimed that the very notion of abstraction is meaningless
 or without use.
 [snip quote]
 To me, this directly indicates he views higher order abstractions
 skeptically,

 Yes he does, and so we all should, but that's not the claim you made. You
 stated that he fired the broadsides at the very notion of abstraction.
 He did no such thing. He fired a broadside at (1) software hype based on
 (2) hyper-abstractions which either don't solve any problems that people
 care about, or don't solve them any better than more concrete solutions.

Mathematics is all about abstraction.  There are theories and
structures in mathematics that have probably sat for over a hundred
years before being applied.  As an analogy, just because a spear isn't
useful while farming doesn't mean it won't save your life when you
venture into the woods and come upon a bear.

 and assumes because he does not see meaning in them, they
 don't hold any meaning.

 You are making assumptions about his mindset that not only aren't
 justified by his comments, but are *contradicted* by his comments. He
 repeatedly describes the people coming up with these hyper-abstractions
 as "great thinkers", "clever thinkers", etc. who are seeing patterns in
 what people do. He's not saying that they're dummies. He's saying that
 they're seeing patterns that don't mean anything, not that the patterns
 aren't there.

He is basically saying they are too clever for their own good, as a
result of being fixated upon purely intellectual constructs.  If math
were a failed discipline I might be willing to entertain that notion,
but quite the opposite: it is certainly the most successful area of
study.


 Despite Joel's beliefs, new advances in science
 are in many ways the result of advances in mathematics brought on by
 very deep abstraction.  Just as an example, Von Neumann's treatment of
 quantum mechanics with linear operators in Hilbert spaces utilizes very
 abstract mathematics, and without it we wouldn't have modern
 electronics.

 I doubt that very much. The first patent for the transistor was made in
 1925, a year before von Neumann even *started* working on quantum
 mechanics.

The electronic properties of silicon (among other materials) are an
obvious example of what quantum theory provides for us.  We might
have basic circuits, but we wouldn't have semiconductors.

 In general, theory *follows* practice, not the other way around: parts of
 quantum mechanics theory followed discoveries made using the transistor:

You do need data points to identify an explanatory mathematical structure.

 The Romans had perfectly functioning concrete without any abstract
 understanding of chemistry. If we didn't have QM, we'd still have
 advanced electronics. Perhaps not *exactly* the electronics we have now,
 but we'd have something. We just wouldn't understand *why* it works, and
 so be less capable of *predicting* useful approaches and more dependent
 on trial-and-error. Medicine and pharmaceuticals continue to be
 discovered even when we can't predict the properties of molecules.

The stochastic method, while useful, is many orders of magnitude less
efficient than closed-form analytic solutions.  Not having access to
closed-form solutions would have set us back hundreds of years at
least.
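As a contrived illustration of that gap (the integrand and sample count are arbitrary choices for the example), compare a Monte Carlo estimate of the integral of x**2 over [0, 1] with the closed-form answer 1/3: the stochastic error shrinks only as O(1/sqrt(n)), while the analytic result is exact and instant.

```python
import random

def mc_integral_x_squared(samples=10_000):
    """Monte Carlo estimate of the integral of x**2 over [0, 1]."""
    random.seed(0)  # fixed seed so the sketch is reproducible
    total = sum(random.random() ** 2 for _ in range(samples))
    return total / samples

closed_form = 1 / 3  # the exact analytic answer

estimate = mc_integral_x_squared()
print(abs(estimate - closed_form))  # small, but nonzero even at n=10,000
```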

 My aunt makes the best damn lasagna you've ever tasted without any
 overarching abstract theory of human taste. And if you think that quantum
 mechanics is more difficult than understanding human perceptions of
 taste, you are badly mistaken.

Taste is subjective, and your aunt probably started from a good recipe
and tweaked it for local palates.  That recipe could easily be over a
hundred years old.  An overarching mathematical theory of human
taste/mouth perception, if such a silly thing were to exist, would be
able to generate new recipes that were perfect for a given person's
tastes very quickly.

Additionally, just to troll this point some more (fun times!), I would
argue that there is an implicit theory of human taste (chefs refer to
it indirectly as gastronomy) that is very poorly organized and lacks
any sort of scientific rigor.  Nonetheless, enough empirical
observations about pairings of flavors, aromas and textures have been
made to guide the creation of new recipes.  Gastronomy doesn't need to
be organized or rigorous because fundamentally it isn't very
important.

 In any case, Spolsky is not making a general attack on abstract science.
 Your hyperbole is completely unjustified.

The mathematics of the 20th century, (from the early 30s onward) tend
to get VERY abstract, in just the way Joel decries.  Category theory,
model theory, modern algebraic geometry, topos theory, algebraic graph
theory, abstract algebras and topological complexes are all very
difficult to understand because they seem so incredibly abstract, yet
most of them already have important applications.  I'm 100% positive
if you 

Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-29 Thread Nathan Rice
 Here's a thought experiment.  Imagine that you have a project tree on
 your file system which includes files written in many different
 programming languages.  Imagine that the files can be assumed to be
 contiguous for our purposes, so you could view all the files in the
 project as one long chunk of data.  The directory and file names could
 be interpreted as statements in this data, analogous to in the context
 of somedirectory or in the context of somefile with sometype.  Any
 project configuration files could be viewed as declarative statements
 about contexts, such as in xyz context, ignore those or in abc
 context, any that is actually a this.  Imagine the compiler or
 interpreter is actually part of your program (which is reasonable since
 it doesn't do anything by itself).  Imagine the build management tool is
 also part of your program in pretty much the same manner.  Imagine that
 your program actually generates another program that will generate the
 program the machine runs.  I hope you can follow me here, and further I
 hope you can see that this is a completely valid description of what is
 actually going on (from a different perspective).
 [...]
 What does pushing the abstraction point that far up provide?

 I see why you are so hostile towards Joel Spolsky's criticism of
 Architecture Astronauts: you are one of them. Sorry Nathan, I don't know
 how you breathe that high up.

 For what it's worth, your image of everything from the compiler on up is
 part of your program describes both Forth and Hypercard to some degree,
 both of which I have used and like very much. I still think you're
 sucking vacuum :(

We live in a world where the tools that are used are based on
tradition (read that as backwards compatibility if it makes you feel
better) and as a mechanism for deriving personal identity.  The world
is backwards in many, many ways; this problem is interesting to me
because it actually cuts across a much larger area than is immediately
obvious.

People throughout history have had the mistaken impression that the
world as it existed for them was the pinnacle of human development.
Clearly all of those people were tragically deluded, and I suspect
that is the case here as well.


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-28 Thread Tim Delaney
On 25 March 2012 11:03, Tim Chase python.l...@tim.thechases.com wrote:

 On 03/24/12 17:08, Tim Delaney wrote:

 Absolutely. 10 years ago (when I was just a young lad) I'd say that I'd
 *forgotten* at least 20 programming languages. That number has only
 increased.


 And in the case of COBOL for me, it wasn't just forgotten, but actively
 repressed ;-)


2 weeks on work experience in year 10 (16 years old) was enough for me.
Although I did have a functional book catalogue program by the end of it.
Apparently the feedback was that if I'd wanted a job there I could have had
one ...

Tim Delaney


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-28 Thread Rodrick Brown
At my current firm we hire people who are proficient in one of the following
and familiar with at least one other: C#, Java, C++, Perl, Python or Ruby.

We then expect developers to quickly pick up any of the languages we use in
house, which is a very broad set. In our source repository, not including the
languages I've already stated above, I've seen Fortran, Erlang, Groovy, HTML,
CSS, JavaScript, MATLAB, C, K, R, S, Q, Excel, PHP, Bash, Ksh, PowerShell,
Ruby, and CUDA.

We do heavy computational and statistical analysis type work, so developers need
to be able to use a vast army of programming tools to tackle the various
workloads we're faced with on a daily basis.

The best skill any developer can have is the ability to pick up languages very
quickly and know what tools work well for which task.

On Mar 22, 2012, at 3:14 PM, Chris Angelico ros...@gmail.com wrote:

 On Fri, Mar 23, 2012 at 4:44 AM, Steven D'Aprano
 steve+comp.lang.pyt...@pearwood.info wrote:
 The typical developer knows three, maybe four languages
 moderately well, if you include SQL and regexes as languages, and might
 have a nodding acquaintance with one or two more.
 
 I'm not entirely sure what you mean by moderately well, nor
 languages, but I'm of the opinion that a good developer should be
 able to learn a new language very efficiently. Do you count Python 2
 and 3 as the same language? What about all the versions of the C
 standard?
 
 In any case, though, I agree that there's a lot of people
 professionally writing code who would know about the 3-4 that you say.
 I'm just not sure that they're any good at coding, even in those few
 languages. All the best people I've ever known have had experience
 with quite a lot of languages.
 
 ChrisA


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-28 Thread Chris Angelico
On Thu, Mar 29, 2012 at 11:59 AM, Rodrick Brown rodrick.br...@gmail.com wrote:
 The best skill any developer can have is the ability to pick up languages very
 quickly and know what tools work well for which task.

Definitely. Not just languages but all tools. The larger your toolkit
and the better you know it, the more easily you'll be able to grasp
the tool you need.

ChrisA


Re: Number of languages known [was Re: Python is readable] - somewhat OT

2012-03-24 Thread Tim Delaney
On 23 March 2012 06:14, Chris Angelico ros...@gmail.com wrote:

 On Fri, Mar 23, 2012 at 4:44 AM, Steven D'Aprano
 steve+comp.lang.pyt...@pearwood.info wrote:
  The typical developer knows three, maybe four languages
  moderately well, if you include SQL and regexes as languages, and might
  have a nodding acquaintance with one or two more.

 I'm not entirely sure what you mean by moderately well, nor
 languages, but I'm of the opinion that a good developer should be
 able to learn a new language very efficiently. Do you count Python 2
 and 3 as the same language? What about all the versions of the C
 standard?


Absolutely. 10 years ago (when I was just a young lad) I'd say that I'd
*forgotten* at least 20 programming languages. That number has only
increased.

Being able to pick up a new language (skill, technology, methodology, etc)
is IMO the most important skill for a developer to have. Pick it up
quickly, become proficient with it, leave it alone for a couple of years,
pick up the new version when you need/want it.

Tim Delaney

