On Nov 25, 2009, at 10:58 AM, Lindsay Marshall wrote:



e.g. a particular paradigm (e.g. imperative) makes sense to a given
individual, but then they get totally tripped up by details that are
horribly non-standard compared with mathematics (cf. limits on ranges
of values).


Which is what I said earlier about exception cases. But in reality most people hardly ever encounter them - yes, it is true that computer arithmetic is weird, but I honestly cannot remember the last time I was doing arithmetic with numbers that could have generated that kind of exception.

The numbers I was talking about are
 - ordinary 32-bit integers, and
 - ordinary 64-bit floats
such as you find in most programming languages.
"numbers that could have generated such exceptions" are found everywhere.

I'm old enough to have used Burroughs Algol, where the hardware always
reported integer overflows and array bounds errors.  I've also used
Pascal compilers that by default checked for such things.  And it
was astonishingly easy to run into trouble, but at least you found out
that you had trouble. (Dijkstra's "hopefully sufficiently large machine.")

I had terrible trouble doing quite routine arithmetic in C on a PDP-11
because ints were 16 bits and there were no runtime checks.  Just last
week, I ran into a bug in a C program on a 32-bit machine because a
multiply was overflowing quietly and giving crazy answers, and it wasn't
doing anything more exotic than integer adds and multiplies with a mod
here and there.
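
To make that failure mode concrete, here is a minimal C sketch of the
kind of quiet overflow I mean (the values are made up for illustration,
and the checked variant uses a GCC/Clang builtin, not anything from the
actual program):

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* Nothing exotic: adds and multiplies on ordinary 32-bit ints. */
        int32_t count  = 70000;          /* e.g. items processed */
        int32_t weight = 70000;          /* e.g. per-item cost   */

        /* 70000 * 70000 = 4,900,000,000, which does not fit in 32 bits.
           In C this is undefined behaviour; in practice the multiply
           usually wraps around quietly and gives a crazy answer.      */
        int32_t total = count * weight;
        printf("total = %d\n", total);

        /* With overflow actually reported (GCC/Clang builtin shown here),
           you at least find out that you have trouble, much as Burroughs
           Algol hardware or a checking Pascal compiler would tell you.  */
        int32_t checked;
        if (__builtin_mul_overflow(count, weight, &checked))
            fprintf(stderr, "overflow in count * weight\n");
        else
            printf("checked total = %d\n", checked);

        return 0;
    }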

I don't know what programming language Lindsay Marshall has been using,
but unless it guarantees to report arithmetic overflows, how would the
programmer *know* whether there were problems or not?  As for floating-
point arithmetic, I've known a student's program run nearly 100 times
slower than he expected because of a fact about floating point he
didn't know.  And in a standards-related discussion about a certain
programming language, I was able to point out that the definition of
"<" wasn't transitive, thanks to floating point, which explained why
a really simple sorting algorithm wasn't working.
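
The exact language and floating-point detail in that case don't matter
here; as one illustration of how IEEE 754 comparisons quietly break the
assumptions a really simple sort relies on, consider this small C sketch
of my own, with NaN standing in for the culprit:

    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    /* The comparator a really simple sort effectively uses: it assumes
       that exactly one of a < b, a == b, a > b holds for any a and b.  */
    static int cmp(const void *pa, const void *pb)
    {
        double a = *(const double *)pa, b = *(const double *)pb;
        if (a < b) return -1;
        if (a > b) return  1;
        return 0;
    }

    int main(void)
    {
        double x = 1.0, y = nan(""), z = 2.0;

        /* Every ordering comparison involving a NaN is false, so the
           comparator reports x "equal to" y and y "equal to" z, yet
           x and z are not equal to each other: the "neither less nor
           greater" relation is not transitive, and the sort's
           assumption fails.                                            */
        printf("x<y %d  y<x %d  x==y %d\n", x < y, y < x, x == y);
        printf("y<z %d  z<y %d  y==z %d\n", y < z, z < y, y == z);

        /* qsort with such a comparator is not guaranteed to leave the
           array in any sensible order when a NaN is present.           */
        double v[] = { 3.0, nan(""), 1.0, 2.0 };
        qsort(v, 4, sizeof v[0], cmp);
        for (int i = 0; i < 4; i++) printf("%g ", v[i]);
        printf("\n");
        return 0;
    }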

I think this is a complete red herring. But there again so is the whole idea of intuitiveness.

A "red herring" is "a deliberate attempt to divert attention".
Are you accusing me of deliberately trying to divert your or
anyone else's attention?  If so, from what?

Just a reminder:  *I* am making no claims of intuitiveness for anything
at all.  I think it is just as absurd to call "the whole idea of
intuitiveness" "a complete red herring" as it would be to call "the
whole idea of fairness" "a complete red herring".  FEELINGS of fairness
and intuitiveness exist, and CLAIMS of fairness and intuitiveness exist,
and while there may not be *complete* agreement between people, these
feelings and claims are shared.

For example, I'm one of many people who feel that a flat tax is
*obviously* unfair.  But there are many other people who I share
many political ideas with who believe that it is *obviously* the
fairest possible thing.  That doesn't mean that my feelings or
claims (or theirs!) are "red herrings", or that investigating
causes or reasons for such feelings or claims would be unscientific,
or that such feelings or claims are based on no objective facts at
all.

Question: do you think that using recursion is intuitive?

Thanks to learning about it from Barron's "Recursive Techniques in
Programming" and early exposure to Algol, and even earlier exposure
to definition by recurrence in mathematics, yes, very much so.
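
For anyone who hasn't met "definition by recurrence": it is exactly the
pattern in the usual mathematical definition of factorial (0! = 1,
n! = n * (n-1)!), transcribed directly into code.  A minimal C sketch of
my own:

    #include <stdio.h>

    /* Definition by recurrence, written out directly:
       factorial(0) = 1, factorial(n) = n * factorial(n - 1).
       (Older Fortran dialects did not let a procedure call itself,
       which is why attempts to write this there came to grief.)    */
    static unsigned long factorial(unsigned n)
    {
        return (n == 0) ? 1UL : n * factorial(n - 1);
    }

    int main(void)
    {
        for (unsigned n = 0; n <= 10; n++)
            printf("%u! = %lu\n", n, factorial(n));
        return 0;
    }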

I have seen people use it without having been introduced to it (and try to use it on old Fortran systems - ooops), but I have also seen people entirely perplexed by it. So which is it?

You don't trick me that easily.

SOME people find it intuitive.
SOME do not.
Presumably there are reasons for this,
and presumably we would be better off finding out why
than damning the whole enterprise.
