On Nov 26, 2009, at 12:40 AM, Lindsay Marshall wrote:
Ach, far too many posts to respond to in detail! So a summary.
1) I think that the idiosyncrasies of computer arithmetic are
irrelevant to the idea of "intuition".
I cannot understand why.
The original claim that I'm responding to is that "imperative programming
is more intuitive than [declarative]", which was explicated as people being
better able to understand HOW IT WORKS. My counter-claim is that it
appears that people DON'T really understand how computer memory and
computer arithmetic work.
Take a single simple C statement:
x++;
Unlike Lisp or Smalltalk, this works with *computer arithmetic*.
If you don't understand computer arithmetic, you don't understand
what this tiny statement does, and whatever "intuitive" means,
something you don't actually understand cannot be "intuitive".
On my laptop, the loop
for (int32_t i = 0; i >= 0; i++) {}
stops in 0.85 seconds.
It gets worse. The corresponding
integer :: i
i = 0
do while (i >= 0)
i = i+1
end do
print *, i
compiled with 'gfortran' 4.2 stops in a few seconds,
but compiled with 'gfortran -O4' it loops forever,
and from a reading of the standard, I think the compiler may be
within its rights.
You can look at programming languages either in the context of a
particular implementation on a given machine or just as a language
on its own.
*Some* languages you can. You can look at Scheme that way (provided you
forget about floating point arithmetic). You can look at Smalltalk that
way (provided you forget about floating point arithmetic). You can look
at Lisp and (some versions of) Prolog that way (provided you forget about
floating point arithmetic). You can look at Haskell (provided you remember
to always use Integer rather than Int, except for some built-in functions
where you can't) and SML (provided you use IntInf rather than Int, except
for some built-in functions where you can't) (provided you forget about
floating point arithmetic).
The programming languages we are actually concerned with are the ones
that run on SOME machine. Different languages let the underlying machine
show through to different degrees. They pretty much all let floating
point arithmetic show through. Most of them let the boundedness of
integer arithmetic show through.
Let's take Java as an example of a language which is *supposed* to be
platform-independent. It achieves this by demanding wrap-around 32-bit
and 64-bit integer arithmetic and IEEE floating point. That "language
on its own" has internalised the gotchas of current hardware and made
them PART OF THE LANGUAGE MODEL.
This has got to be relevant to the psychology of programming in
that it relates to what you have to understand to be confident of
writing a working program.
I think that statements about intuitiveness are made with respect to
the latter case and people are not assuming particular implementations.
I'll go along with that, provided it's clearly understood that this is
HARDER for people to cope with, because in order to write a working
program they have to consider a RANGE of actual or plausible particular
implementations. And frankly, most people's idea of a range is "what
brand of PC is it".
2) I don't believe that I have fallen foul of overflows or other
arithmetic traps frequently - I simply don't write programs that use
numbers that fall into the ranges where overflows are possible, nor
do most people in my experience.
So you never use multiplication? You never read numbers from a file?
I avoid using floating point when I can as there are far too many
issues with that, but equally the number ranges I deal with are not
a problem anyway. The last time I encountered a problem of this kind
that I can recall was many years ago (and in someone else's code)
and was to do with the fact that char was signed on a PDP-11 and
they were using extended character sets.
Char continues to be signed in many C compilers. Savour the irony: the
one integral data type that *can't* be safely used to hold a character
in C is the one called 'char'. (Yet you have to, because of all those
library functions that want char *.)
The existence of a small number of high profile errors that are all
very well known indicates to me that the problem is much less severe
in practice just because these are high profile. Array bounds are a
different matter.
Who said it was a small number?
Who said the high profile ones were the only ones that occur?
According to
http://www.cert.org/secure-coding/integralsecurity.html
"Integers represent a growing and underestimated source
of vulnerabilities in C and C++ programs."
The CERT secure coding guidelines have this to say about
integer overflow:
Risk Assessment
Integer wrap can lead to buffer overflows and
the execution of arbitrary code by an attacker.
Severity: high
Likelihood: likely
Remediation cost: high
Quite a few issues have been reported: XDR, IPP, CUPS, Wireshark,
Firefox, OpenSSH, the Linux kernel.
3) I don't think I could design an intuitive programming language!
You could surely design a LESS intuitive programming language.
Does anyone seriously contend that Intercal is less intuitive
than the majority of programming languages? For English-speakers,
a language in which all the keywords were in the Chinese script?
A language in which integer arithmetic was always correct unless
the result would have been a prime number? A language where
parentheses are reversed, as in )x+1(*)x-1(?
Now that we've established that "intuitive" is something that's
more or less rather than yes or no, surely it makes sense to ask
_what_ makes things more (or less) intuitive to _whom_?
A programming language is just another user interface and I don't
think that "intuition" makes sense in any user interface.
Familiarity, as someone else has said, yes, and the ability to
generalise, both of which look like intuition but are actually not.
Nobody (sensible) claims that, say, French is more intuitive than
Chinese (and let's not get into Loglan v Esperanto.....).
I will say flatly that French is FAR more intuitive >>TO ME<< than
Chinese, and I say that as someone who learned French at school and who
has tried and failed to learn Cantonese. I can't cope with Cantonese
because I can't pronounce it. Given suitable instruction, I dare say I'd
pick it up; the point is that I _didn't_ need to learn about tones with
French. I can also say that Esperanto is FAR more intuitive >>TO ME<<
than Lojban. We have a Lojban enthusiast here, and I did work through
several lessons. Just as English and French have a substantial common
vocabulary, so do Esperanto and English (and French). Esperanto _works_
the way I expect a natural language to work and uses a lot of words I
can understand.
I thought we'd got past this point some time back. I know I wrote that
"intuition" is culture-specific. That doesn't even begin to mean that
we can't talk about it. Just as I can very easily point to reasons why
French and Esperanto are much more intuitive TO ME than Chinese and
Lojban, so we should be able to find out what makes one language or
paradigm feel "intuitive" TO SOME PEOPLE.
I don't think even APL's admirers have ever claimed that it was
(initially) intuitive. One reason for that is an entire absence
of spelled out keywords. Another is its right-to-left execution
model. Another is the lack of operator precedence. What has been
claimed for APL is that it's a remarkably handy notation once you
have mastered it. I've looked at J. I find J really hard to get
my head around, and one reason for that is that it uses familiar
characters in unfamiliar ways. At least APL used unfamiliar symbols!
4) My point about passion with respect to languages is quite likely
close to the idea of intuition and I have never said that it is not
worth investigating these things - it's very important to investigate
them as it tells us a lot about how we should be designing things. But
I am extremely wary of the whole notion of "intuition". Let's call it
something else that is less contentious and then we will all
instantly agree. At least, that is my intuition.
One of the reasons I found it harder to locate CERTs about integer
overflow than it should have been is that they like to call it integer
wrap. In a way they are right: integer wrap is what you get when there
*should* have been an integer overflow exception but there wasn't; it's
NOT having the overflow exception that's the problem. But everyone else
calls it integer overflow.
The Wiktionary entry for "intuitive" says
automatic, without requiring conscious thought;
easily understood or grasped by intuition
and "intuition" sense 1 is
immediate cognition without the use of conscious
or rational processes.
This seems like EXACTLY the proper word for the concept.
People are talking about how well they understand something
without having to consciously puzzle over it.
You can call it nothinknowability if you like, but is it just
the word that's the problem?