Maybe it is time to rewrite K-12 math books to be in line with computational
arithmetic. It would be great. Imagine dad at the dining room table trying to
explain to little Johnny... "100 Cheerios are here, and if we add another, we
have -100..."
Sorry, I couldn't help myself.

With respect to such problems, I spent the usual amount of time in college
studying various complexities in arithmetic on computers. Yet, I have only
seen problems crop up three times over 10 years of full-time programming
experience. For 'most' programmers, the complexities of computational
arithmetic don't impact us on a daily basis. And when these problems do
occur, they often produce obviously wrong results (as opposed to believable
but incorrect ones).

Thus, if one were to study an aspect of this problem, I would think it would
be better to focus on a higher-impact issue (not that arithmetic doesn't
have worth).

On Tue, Nov 24, 2009 at 4:35 PM, Richard O'Keefe <> wrote:

> On Nov 24, 2009, at 9:35 PM, Derek M Jones wrote:
>> Brad,
>>>> like i said, i'm not sure intuition exists....
>>> What's quite certain is that *claims* of intuitiveness exist.
>> But do they only exist as a reason for justifying the use of
>> one particular language?
> I don't think so.  For one thing, in the recent thread that got me
> started on this, other people were recommending a whole range of
> programming languages (Java, C#, Python, AWK, even PERL).  For
> another, when people try to justify one particular language, there
> are lots of other reasons they can and usually do offer.
> I believe that when people say things like "imperative programming
> is more intuitive than [whatever]" they mean _at least_ the
> following things:
>  1 I learned imperative programming with only a modest amount
>   of trouble or no trouble at all.
>  2 I was able to transfer what I learned to other imperative
>   languages with little or no trouble.
>  3 I find [whatever] much harder to understand.
>  4 I know a lot of other people who feel the same.
>  5 I do NOT know many people (or even any at all) who came from
>   [whatever] to imperative programming and found it hard to
>   understand.
>  6 The experienced difficulty of [whatever] is not a defect in
>   us or our education but a defect in [whatever].
> For the speakers, 1-5 are facts and 6 is felt to be justified by
> those facts.  The possibility of selection bias (people who would
> have been more comfortable learning Haskell or Miranda first
> very seldom get the chance, and leave the field, so we never get
> to hear their opinions about intuition and programming languages)
> is rarely considered.
>>  How could students tell the difference between having problems
>> programming and having problems using a particular kind of
>> language?  Perhaps this distinction is not important, they
>> could simply try another approach and see if it makes any
>> difference.
> That's indeed an operational way of telling the difference.
> That suggestion of mine was not just a half-baked idea, it was
> just set out in the sun for a minute or two.  It would not be
> easy to set up or administer.
> By the way, there's a service paper here for people who want to
> be surveyors.  It covers trigonometry, statistics, a couple of
> other topics, and some programming.  The surveying department
> insisted that the language taught be Visual Basic (more precisely,
> Visual Basic for Applications, inside the Excel spreadsheet).
> I was the only computer science lecturer willing to be involved
> with it.  I only have five one-hour lectures to teach the
> elements of programming.
> I *KNOW* the thing is impossible.
> I spend one lecture explaining that and why computer arithmetic
> does not behave the way they expect arithmetic to behave, for
> example that you can find a number X such that X + 1 < X
> and numbers X, Y, Z such that X+(Y+Z) differs from (X+Y)+Z
> and numbers X Y both different from zero such that X*Y = 0.
> I spend half of another lecture telling them that they need to
> write down what their functions are supposed to do and to TEST
> their functions to make sure that they do.
> When you stop and think about it, computer arithmetic is
> *stunningly* unintuitive, IF your intuition is based on the
> laws of whole numbers and fractions learned at school and the
> laws of the real and complex numbers learned in first year
> mathematics at university.
> I wonder if the question of "intuitiveness" could be studied
> at the level of arithmetic rather than programming as a whole.
> For example, Smalltalk counts as OO-imperative, but has
> bignum and ratio arithmetic built in and standard:  6/4 gives
> the answer 3/2, not 1.5.  Java _has_ bignum arithmetic, but
> doesn't let you use ordinary notation with it.  And so on.
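O'Keefe's three arithmetic surprises, plus the Smalltalk-style exact 6/4, are easy to reproduce. A minimal Python sketch follows; since Python's own integers are bignums and never overflow, a hand-rolled `wrap32` helper (my invention, not anything from the thread) stands in for 32-bit two's-complement integers:

```python
from fractions import Fraction

def wrap32(n):
    """Simulate 32-bit two's-complement wrap-around arithmetic."""
    n &= 0xFFFFFFFF
    return n - 0x100000000 if n >= 0x80000000 else n

# 1. A number X such that X + 1 < X: fixed-width integers wrap,
#    so INT_MAX + 1 becomes INT_MIN.
x = 2**31 - 1                          # INT_MAX in a 32-bit int
print(wrap32(x + 1) < x)               # True

# 2. X, Y, Z such that X+(Y+Z) differs from (X+Y)+Z: each 1.0 added
#    to 1e16 is absorbed by rounding, but 2.0 is not.
x, y, z = 1e16, 1.0, 1.0
print((x + y) + z == x + (y + z))      # False

# 3. X, Y both nonzero with X*Y = 0: the product underflows below
#    the smallest representable double.
print(1e-200 * 1e-200 == 0.0)          # True

# Smalltalk-style exact rational arithmetic: 6/4 is 3/2, not 1.5.
print(Fraction(6, 4))                  # 3/2
```

The same demonstrations work in almost any language with fixed-width integers and IEEE 754 doubles; only the exact-rational result needs library support (as in Java's BigInteger/BigDecimal, which, as noted above, can't use ordinary operator notation).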
