Christian Sievers replies to John Hughes:
> > Some suggest that it is enough for compilers to issue a warning when using
> > call-by-name. I disagree strongly. Such a warning may alert the programmer
> > at the time the overloaded definition is compiled. But programmers need to
> > understand programs at other times also. The person reading through the code
> > of a library, for example, trying to understand why a program using that
> > library is so slow or uses so much memory, will not be helped by warnings
> > issued when the library was compiled. The distinction between call-by-need
> > and call-by-name is vital for understanding programs operationally, and it
> > should be visible in the source.

> In a library I'd really expect to see a big comment when such a thing
> happens. 

I think John's point is that nothing _forces_ the library writer to
do this, whereas the MR does -- in a rather crude way, IMO.  Whilst
I'm not 100% convinced, I could accept some such compulsion, just so
long as it's possible to say either of the two possible "DWIM"s
reasonably concisely, and so long as I can anticipate and trap the
cases where I'm going to be bitten on the bum by the MR more readily
than at present.
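
For what it's worth, here's how the two readings have to be spelt in
current Haskell -- a minimal sketch of my own (the names are mine),
not anything from John's proposal:

    -- Monomorphic: computed once and shared.  (This is the reading
    -- the MR silently forces on the same binding written without a
    -- signature.)
    sharedPi :: Double
    sharedPi = 4 * atan 1

    -- Overloaded: the signature is needed to escape the MR; the value
    -- is then recomputed at every use, once per instance.
    anyPi :: Floating a => a
    anyPi = 4 * atan 1

The choice between the two is exactly what the MR makes silently
today.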


> >     pi := 4*arcsin 1
> > 
> > is a monomorphic definition at a particular instance which (visibly) does not.

> But which instance? In this case the default mechanism can give the
> answer, but in general, you would have to give a type unless `e'
> already has a monotype. So you could use `x:=e' without a signature
> exactly when you now could use `x=e' without one. 

That's the point, isn't it?
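
To make that concrete -- a small sketch of my own, using the
Prelude's asin in place of John's arcsin:

    myPi = 4 * asin 1     -- MR + defaulting: Double is chosen for us

    noDflt = read "5"     -- also monomorphised, but Read alone cannot
                          -- be defaulted, so this stays ambiguous
                          -- until we add, say,  noDflt :: Int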


> > * Monomorphism is decoupled from overloading. With this proposal, x := e is
> >   always a monomorphic definition, whether the type of e is
> >   overloaded or not.

> Again: how can this be?

Because the MR is applied to such definitions.  Hence, as with simple
bindings with no type declarations in Haskell 1.x (and 98), they're
forced to be monomorphic, on pain of a really confusing type error
someplace else in your program.  OK, I'm being facetious -- it should
be possible to reverse-chain type errors to determine whether
application of the MR is a possible cause, but that's working our
compiler writers pretty hard for a rather small 'nut'.
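
Here's the kind of thing I mean -- a hypothetical sketch (the names
are mine), where the complaint lands a long way from the binding
that caused it:

    len = fromIntegral (length "hello")
        -- the MR leaves len with a single, as yet undetermined, monotype

    asInteger :: Integer
    asInteger = len       -- this use pins len down to Integer ...

    asDouble :: Double
    asDouble = len        -- ... so the type error is reported here,
                          -- nowhere near len's definition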

Would an alternative be to require that ':='-definitions have some
(or all) type information added in explicitly?  One might require
that they have an explicit signature, but all that's really required
is to specify, in some way, to which monotype(s) one is specialising.
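
In today's syntax that could be either a full signature or just an
annotation on the right-hand side; a sketch (names mine) of both:

    area :: Double                       -- full explicit signature
    area = pi * 2.5 * 2.5

    area' = pi * 2.5 * 2.5 :: Double     -- the monotype given on the
                                         -- RHS, no separate signature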


> > * When converting Haskell 1.x to Haskell 2, many := would need to be inserted.
> >   Failure to do so could make programs much less efficient. An (optional)
> >   compiler warning could help here.

> I don't see this. Or do you want to always recalculate any value
> defined with `=' instead of `:=' ?

The '=' definitions would be fully polymorphic, and hence there's
the possibility that they might be more overloaded than the programmer
realises, and in a way that the compiler may not field as well as it
would a monomorphic definition.  (Or at all, really.)  But not
making them _illegal_ by coup de main leaves open the possibility
that the hideous inefficiency ain't so hideous after all (my usual
experience, frankly), or that an aggressive compiler can Make It Go
Away.
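
To put some flesh on 'hideous inefficiency': an overloaded binding
is compiled as a function of its class dictionary, so -- barring
specialisation -- every use re-runs the computation, whereas the
monomorphic one is an ordinary shared value.  A sketch of my own:

    bigSum :: (Num a, Enum a) => a   -- overloaded: re-evaluated at
    bigSum = sum [1 .. 1000000]      -- every use, once per dictionary

    bigSumD :: Double                -- monomorphic: evaluated once
    bigSumD = sum [1 .. 1000000]     -- and shared thereafter

Whether the recomputation actually hurts, or whether an aggressive
compiler specialises it away, is exactly the open question.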

Slan libh,
Alex.


