On Nov 23, 2007 5:56 AM, Waldemar Kornewald <[EMAIL PROTECTED]> wrote:
> On Nov 23, 2007 1:36 AM, Wm Annis <[EMAIL PROTECTED]> wrote:
> > *Everything* we do to communicate with computers is pretty unnatural.
> > I don't see how learning different precedence rules to program is any
> > different from learning that "if (a == 3 or 4 or 5) ..." doesn't mean the
> > same thing to a computer that it means in English.
>
> Is this an excuse to make computers suck? What are we talking about here?

I'm saying there's nothing at all simple or natural about math
precedence rules.  Only our familiarity with certain kinds of
programming languages makes it seem so.
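To make the "if (a == 3 or 4 or 5)" pitfall from the quoted message concrete, here is a quick sketch in Python (just an illustration on my part, not COLA code) of how the English reading and the parsed reading diverge:

```python
a = 7

# Parses as (a == 3) or 4 or 5: the bare literal 4 is truthy,
# so the whole expression is true no matter what a is.
naive = a == 3 or 4 or 5
print(bool(naive))    # True, even though a is none of 3, 4, 5

# What the English sentence actually means is a membership test:
correct = a in (3, 4, 5)
print(correct)        # False
```

So the "natural" reading silently becomes an always-true condition, which is exactly the kind of mismatch between human expectation and parser rules being argued about here.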

> Let's stop thinking in terms of implementation complexity (math
> precedence won't add much, anyway) and start thinking in terms of how
> to make computers easier, more natural, and less error-prone for
> end-users (in this case, programmers using COLA).

And I repeat my question from before — how often are precedence
confusions *really* a problem?  Is there any data on this at all, or
are we just arguing from theoretical stances about what is "easier,
more natural and less error-prone" in programming?

Code transformation seems a pretty important part of the COLA
system.  Now, as a Lisp fan I obviously have very warm feelings
about this sort of thing.  Introducing arbitrary precedence rules for
certain kinds of operators will not make code transformation easier
or less error-prone.  Complexity of use will simply be paired with
complexity of implementation.
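A toy sketch of that pairing (my own illustration in Python, nothing to do with the actual COLA implementation): evaluating a flat token list with Smalltalk-style uniform left-to-right binding is a short loop, while conventional precedence needs a precedence table and a climbing parser, and any code transformer has to carry that table around too:

```python
import operator

OPS = {'+': operator.add, '-': operator.sub,
       '*': operator.mul, '/': operator.truediv}

def parse_flat(tokens):
    """Uniform binding: every binary operator is equal, left to right."""
    value = tokens[0]
    for i in range(1, len(tokens), 2):
        value = OPS[tokens[i]](value, tokens[i + 1])
    return value

# Arbitrary precedence levels that every transformer must now know about.
PREC = {'+': 1, '-': 1, '*': 2, '/': 2}

def parse_prec(tokens):
    """Precedence climbing: correct, but strictly more machinery."""
    def climb(pos, min_prec):
        value = tokens[pos]
        pos += 1
        while pos < len(tokens) and PREC[tokens[pos]] >= min_prec:
            op = tokens[pos]
            value_rhs, pos = climb(pos + 1, PREC[op] + 1)
            value = OPS[op](value, value_rhs)
        return value, pos
    return climb(0, 1)[0]

tokens = [1, '+', 2, '*', 3]
print(parse_flat(tokens))   # 9  -- (1 + 2) * 3
print(parse_prec(tokens))   # 7  -- 1 + (2 * 3)
```

Same token stream, two answers; the extra answer costs a precedence table in the parser and in every tool that rewrites code.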

-- 
William Annis
www.aoidoi.org

_______________________________________________
fonc mailing list
[email protected]
http://vpri.org/mailman/listinfo/fonc