Re: fundeps question

2002-12-17 Thread Jeffrey R Lewis
On Monday 16 December 2002 18:18, Ashley Yakeley wrote:
 In article [EMAIL PROTECTED],

  Hal Daume III [EMAIL PROTECTED] wrote:
  I spent about a half hour toying around with this and came up with the
  following, which seems to work (in ghci, but not hugs -- question for
  smart people: which is correct, if either?)...

 Both are correct. Hugs fails (correctly) because it doesn't have
 anything like -fallow-overlapping-instances.

Urr... I think hugs invented overlapping instances - or at least Mark Jones did...  
;-)

Try `+o'.

--Jeff
___
Haskell mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell



Re: Hugs98 Oct2002 release candidate available

2002-10-22 Thread Jeffrey R Lewis
On Monday 21 October 2002 20:29, Sigbjorn Finne wrote:
 A new release of Hugs98 is just around the corner, introducing
 a number of worthwhile extensions and enhancements (see
 announcement blurb at the end of this message.)

 To help us flush out any remaining issues/bugs, a release candidate
 is now available for test use. The dev. team is very keen on
 having Hugs users try out this new release, and report back any
 issues they run into to [EMAIL PROTECTED]

 The release candidate is available in source form (and for the
 benefit of Win32 users, a binary distribution) from

 http://cvs.haskell.org/Hugs/downloads/Oct2002/

And of course, some tasty rpms are also available there.

--Jeff



Re: Newbie question, installing hugs on redhat 7.2

2002-08-19 Thread Jeffrey R Lewis

On Monday 19 August 2002 04:33 am, Alastair Reid wrote:
  Does anyone know a workaround for this, or please tell me if I'm
  just doing something stupid.

 It sounds like there's a problem in the RPM package since Hugs itself
 isn't that fussy.  You could install Hugs from source yourself.

Alternate approach, which works in general for RPM, assuming you have access 
to src packages:

Download the src RPM from the hugs web page 
http://cvs.haskell.org/Hugs/downloads/hugs98-Dec2001-1.src.rpm

Then, rebuild the package.  This will cause the dependencies to be calculated 
based on what you have on your machine.

# rpm --rebuild hugs98-Dec2001-1.src.rpm
# rpm -Uvh /usr/src/redhat/RPMS/i386/hugs98-Dec2001-1.i386.rpm

(Newly built packages get dumped in /usr/src/redhat/RPMS, you can change that 
directory, but it's probably not worth it for a one-off)

--Jeff
___
Haskell-Cafe mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/haskell-cafe



Re: core language external representation

2002-02-25 Thread Jeffrey R Lewis

On Monday 25 February 2002 02:55 am, Simon Peyton-Jones wrote:
 Not yet.  But Jeff Lewis is (I believe) planning to work actively
 on this.

Well put. I plan on working on this, but no sooner than mid-march.

--Jeff
___
Glasgow-haskell-users mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/glasgow-haskell-users



Re: syntax...(strings/interpolation/here docs)

2002-02-13 Thread Jeffrey R Lewis

On Wednesday 13 February 2002 06:36 am, C.Reinke wrote:
   Does anybody with their elbows in the
   code think variable interpolation and/or
   multi-line strings are good/doable ideas?
   Would this be the sort of change one could make
   without a lot of previous familiarity with
   the implementation of Hugs/Ghc?

 Unlike my rough proposal, one should aim for a combination of
 (mostly) in-Haskell implementation and (some) pre-processing.  As
 Thomas Nordin has pointed out to me in private email, Hugs (Dec
 2001) already supports this (module lib/hugs/Quote.hs and flag +H).

 The real trick is to have the same support in all implementations..

I use here docs quite a bit.  They are wonderful for writing error messages 
that are also readable in the code ;-)

The point about same support in all implementations is of course a good one.  
Thomas and I are the culprits who put here docs in hugs in the first place.  
However, it is just as easy to support here docs using a pre-processor.  I 
have a medium sized project that uses here docs, and can be used under both 
hugs and ghc.  With hugs, I use the builtin feature, of course.  With GHC, we 
just use a pre-processor.  This is a bit awkward with GHC 5.02 and earlier 
versions, but starting with 5.03, GHC now has a proper interface for hooking 
in a pre-processor (don't know the details, but Sigbjorn says it's in there). 
 A convenient feature of here docs that makes it easy to implement as a 
pre-processor is that you can do the unhere doc translation so that it 
preserves line numbers.  The only drawback to using a pre-processor is that 
it probably won't work with ghci (but then you probably don't need to write 
here docs at the command line either!).
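The line-number-preserving translation mentioned above can be sketched as follows (a hypothetical helper, not the actual Hugs or preprocessor code; the name `escapeHereDoc` is invented): the here doc body is collapsed into an ordinary one-line string literal, and the lines it spanned are replaced by blanks, so everything after it keeps its original line number.

```haskell
-- Hypothetical sketch: escape a multi-line here-doc body so it fits
-- in a single-line Haskell string literal.  The preprocessor can then
-- emit the literal on the here doc's opening line and pad with blank
-- lines, leaving later line numbers unchanged.
escapeHereDoc :: String -> String
escapeHereDoc = concatMap esc
  where
    esc '\n' = "\\n"   -- a newline becomes the two characters \ n
    esc '"'  = "\\\""  -- embedded quotes must be escaped too
    esc '\\' = "\\\\"
    esc c    = [c]

main :: IO ()
main = putStrLn (escapeHereDoc "error:\n  unexpected \"foo\"")
```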

--Jeff



Re: Simpler Fibonacci function

2002-02-05 Thread Jeffrey R Lewis

On Tuesday 05 February 2002 09:40 am, Brian Berns wrote:
 I am new to functional programming and teaching myself Haskell.  The
 canonical Haskell fib function (e.g. as presented in the Gentle
 tutorial) is:

fib = 1 : 1 : [ a+b | (a,b) <- zip fib (tail fib) ]

 This seems, to be polite, a bit overly complex.  By comparison, here
 is a simpler version:

As an aside, here's a nicer way of writing the stream version of fib:

fib = 1 : 1 : [ a + b | a <- fib | b <- tail fib ]

This gets rid of the distraction of the zip and the pair, letting you see 
the simple structure of the definition more clearly.

This, however, is not Haskell 98 (the use of multiple generators separated 
by `|').  But it is supported by both GHC and Hugs (using flag -98).  See 
the sections in the user manuals under `parallel list comprehensions'.
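For reference, both versions run under GHC when the extension is enabled; in later GHCs it can be switched on per-module with the `ParallelListComp` pragma (a sketch):

```haskell
{-# LANGUAGE ParallelListComp #-}

-- The Haskell 98 version, using zip:
fibZip :: [Integer]
fibZip = 1 : 1 : [ a + b | (a, b) <- zip fibZip (tail fibZip) ]

-- The parallel-comprehension version: the two generators advance in
-- lockstep, so the zip and the pair disappear.
fibPar :: [Integer]
fibPar = 1 : 1 : [ a + b | a <- fibPar | b <- tail fibPar ]

main :: IO ()
main = print (take 8 fibPar)
```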

--Jeff



Re: Implicit Parameters

2002-02-04 Thread Jeffrey R. Lewis

On Monday 04 February 2002 01:58 am, Koen Claessen wrote:
 Hi all,

 Now we are talking about implicit parameters, let us take up
 the following problem with them on the Haskell mailing list
 too.

 [implicit parameters are not propagated down recursive definitions without 
 a type signature]

 My questions are: Were the designers of the implicit
 parameters paper aware of this problem when they wrote the
 paper? If so, they probably did not think this was a big
 problem. Do people in general think this is a problem?

I think we overlooked it when the paper was written, but I reported this at 
the Haskell Implementers Meeting in Egmond.  At the time the only solution 
that occurred to me was to essentially do type inference twice - the first 
time to figure out what implicit parameters the definition depends on, and 
the second time with a weak signature provided that has those implicit 
parameters on board to get the effect of the user having provided the 
signature.  I believe you guys have looked at something like this as well.  
But I find that solution fairly unsatisfactory, and have been hoping 
something nicer will come along.

I consider this to be a problem, but not enough of one that I've managed to 
spend time finding a solution ;-)

--Jeff



Re: Implicit Parameters

2002-02-04 Thread Jeffrey R. Lewis

On Monday 04 February 2002 02:25 am, John Hughes wrote:
 Not so fast! Proposing a solution means this is regarded as a problem!
 But what is to say that the first behaviour is right in any general
 sense?

 The important thing is that the language semantics is clear, and this is a
 semantic question.  The static semantics of Haskell *is* clear: recursive
 calls are monomorphic unless a type signature is given; this is the basis
 for understanding the behaviour above. 

I think part of the problem is that we've confused implicit parameterisation 
with polymorphism.  Haskell has a two-level type system with monomorphic 
types at the bottom level, and polymorphic and qualified types at the second 
level.  It turned out to be very straightforward to add implicit parameters 
to Haskell by treating them as a special kind of qualified type, and thus 
they also play according to the rules of polymorphic types - i.e. you 
`capture' implicit parameters exactly when you generalize a polymorphic 
definition.

However, Koen's example suggests that maybe implicit parameters shouldn't 
necessarily play according to the rules of polymorphic types.  Perhaps 
implicit parameters should be a special kind of monomorphic type instead.  If 
this were the choice, then it's clear that they should be captured by 
recursive definitions.
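A minimal version of the issue under discussion can be sketched as follows (the names `len` and `?step` are invented for illustration): with the signature, the implicit parameter propagates through the recursive call; delete the signature and the recursive occurrence becomes monomorphic, which is exactly the behaviour Koen raised.

```haskell
{-# LANGUAGE ImplicitParams #-}

-- With the signature, ?step is visible at the recursive call and the
-- definition behaves as expected.  Removing the signature triggers the
-- problem: the recursive call no longer propagates ?step.
len :: (?step :: Int) => [a] -> Int
len []       = 0
len (_ : xs) = ?step + len xs

main :: IO ()
main = print (let ?step = 1 in len "abc")
```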

 When implicit parameters are used,
 it's very important to be aware whether a binding is monomorphic or not
 (can't resist plugging := again!). Will your solution make understanding
 when a binding is monomorphic simpler? If not, it could be worse than the
 problem -- and the fact that it makes this one example behave as you want
 is no justification.

I agree that we should tread carefully ;-)

--Jeff



Re: Incoherence

2001-10-24 Thread Jeffrey R Lewis

John Hughes wrote:

 What we need is different binding syntax for monomorphic and polymorphic
 bindings. Roll on := and = ...

I agree absolutely that we need such a distinction.  Although it's worth clarifying a 
point.  The monomorphism restriction doesn't exclude polymorphism, just overloading, 
e.g. I can bind `x = []' with no problem.  So is your proposal to have := bind 
non-overloaded (and non-implicitly parameterized terms), or do you really mean that it 
will only bind monomorphic terms? (thus `x := []' would be problematic, unless there's 
a type sig).
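The distinction between polymorphism and overloading here can be seen directly (a sketch; under the restriction, unconstrained type variables still generalize, while class-constrained ones are defaulted):

```haskell
-- x = [] is a restricted binding, but its inferred type [a] has no
-- class constraints, so it is still polymorphic and can be used at
-- several types below.
x = []

-- A restricted binding with a Num constraint is monomorphised; the
-- default defaulting rules give it the type Integer rather than
-- Num a => a.
n = fromInteger 3

main :: IO ()
main = print (length (x :: String), length (0 : x), n)
```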

--Jeff





Re: ghc-5.02/libreadline.so.3

2001-09-24 Thread Jeffrey R Lewis

Ch. A. Herrmann wrote:

  Simon == Simon Marlow [EMAIL PROTECTED] writes:

 Simon The right thing to do on such a box is to use the RPM or
 Simon debian package, if one is available.

 For the reason below, RPM is not a solution for us. Currently, I'm
 compiling GHC with a modified LD_LIBRARY_PATH and until now nothing
 seems to be wrong. Thanks for the help.

 As far as I know, RPM packages are installed into a particular
 directory, e.g., /usr/local. However, at our department, we have
 an installation concept that permits us to have different versions
 of the same software installed and therefore uses a particular
 directory. Other good reasons for that are (1) to avoid damage of our
 main system by software from outside the OS distribution and (2) to
 exclude software from public use which was not properly installed
 or doesn't harmonize with the installed system.

 Cheers
 Christoph

RPMs are normally installed into a specific location.  However, you can also construct 
a location-independent RPM that can be installed anywhere, but you have to write the 
RPM spec file in a certain way to be able to build a location-independent RPM.  It's 
probably worth someone taking the time to tweak the spec file, since I can imagine 
many users might be in the same situation.

--Jeff


___
Glasgow-haskell-bugs mailing list
[EMAIL PROTECTED]
http://www.haskell.org/mailman/listinfo/glasgow-haskell-bugs



Re: Application letters at the Haskell workshop: suggestion

2001-09-13 Thread Jeffrey R Lewis

Lennart Augustsson wrote:



 I have been writing substantial Haskell programs and I use *NO* experimental
 features.  What I'm currently working on is over 2 lines of Haskell 98.
 No extensions whatsoever.  (It even compiles and runs with all available
 Haskell implementations.)
 Granted, I write mostly compiler like programs (files in, files out), but there
 is still a lot you can do in Haskell just as it is.  Sometimes it might require
 bending slightly backwards to get it done, though.

Well, that's nothing!

I have been writing substantial ANSI C programs and I use *NO* experimental
features.  What I'm currently working on is over 2 lines of ANSI C.
No extensions whatsoever.  (It even compiles and runs with all available
C implementations.)
Granted, I write mostly compiler like programs (files in, files out), but there
is still a lot you can do in ANSI C just as it is.  Sometimes it might require
bending slightly backwards to get it done, though.

;-)

--Jeff





Re: Implict parameters and monomorphism

2001-04-26 Thread Jeffrey R. Lewis

Simon Peyton-Jones wrote:

 This is a long message about the design of implicit parameters.
 In particular, it is about the interaction of the monomorphism
 restriction with implicit parameters.  This issue was discussed
 in the original implicit-parameter paper, but I wanted to articulate
 it afresh and propose some design choices

 Simon


 Question 3: monomorphism
 
 There's a nasty corner case when the monomorphism restriction bites:

 z = (x::Int) + ?y

 The argument above suggests that we *must* generalise
 over the ?y parameter, to get
 z :: (?y::Int) => Int,
 but the monomorphism restriction says that we *must not*, giving
 z :: Int.
 Why does the monomorphism restriction say this?  Because if you have

 let z = x + ?y in z+z

 you might not expect the addition to be done twice --- but it will if
 we follow the argument of Question 2 and generalise over ?y.

It may help to clarify the question by giving a concrete semantics to the monomorphism 
restriction.  I'll focus on a simplification it that will suffice: a binding is 
restricted if it is of the form `x = E'.  The monomorphism restriction can then be 
understood by the following translation:

[[ let x = E1 in E2 ]] = (\x -> E2) E1
[[ let f p1 .. pn = E1 in E2 ]] = let f = \p1 .. pn -> E1 in E2

where `let' in the translation is a straight polymorphic let - no more funny business 
w/ mono restrictions, etc.

In other words, the monomorphism restriction converts certain let bindings to lambda 
bindings.  The typing and sharing consequences follow from this translation.  This is 
a bit naive, because a restricted binding can still be polymorphic over 
non-constrained variables (where constrained here means constrained by a type 
class predicate), but this just requires a more detailed translation to get right, it 
doesn't harm the main point.

The example above becomes:
(\z -> z + z) (x + ?y)

Given this semantics, which design choice makes the most sense?

(A) Doesn't seem to make sense.  Under the translation, there's no basis for rejecting 
the binding - you've simply got a lambda binding, and generalization doesn't come into 
play.

(B)  Follows naturally from the translation.  A restricted binding is a lambda 
binding, and thus isn't generalized.  An unrestricted binding is a polymorphic let 
binding, and thus *is* generalized.

(C) Doesn't immediately follow from the translation because the criterion for a 
restricted binding has become more complex.  Indeed, it's no longer syntactic - we 
have to do some inference before deciding whether a binding is restricted or not.  Not 
necessarily a bad thing, it just makes the whole thing more complex.  But once you've 
extended the definition of a restricted binding to exclude ones with implicit 
parameters, everything follows as a consequence of the translation, thus it is a 
coherent design choice.

My preference?  B is a straightforward consequence of the monomorphism restriction (as 
understood by my translation).  Hence, that's how hugs does it, and how GHC 4.08 does 
it.  GHC 5.00 adopted A, but I don't think that will last for long.  C might be nice.  
I don't have strong preferences.
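For concreteness, option B's capture behaviour can be demonstrated in today's GHC syntax (a sketch; the names are invented, and the monomorphism restriction is at its default setting):

```haskell
{-# LANGUAGE ImplicitParams #-}

-- Under the monomorphism restriction, the binding of z behaves like a
-- lambda binding: ?y is resolved where z is *bound*, so the later
-- rebinding of ?y is not seen by the two uses of z.
captured :: Int
captured =
  let ?y = (1 :: Int) in
  let z = 40 + ?y in           -- z :: Int; ?y captured here (z = 41)
  let ?y = (100 :: Int) in
  z + z                        -- 82, not 282 as generalization would give

main :: IO ()
main = print captured
```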

But what about Simon's point about inlining?  I believe it's a bit misleading in the 
following sense: if you understand the monomorphism restriction to restrict certain 
bindings to be lambda bindings, then inlining of restricted bindings is beta 
reduction.  Beta reduction in the face of implicit parameters holds, but involves a 
bit more work, and thus shouldn't be done naively (see the POPL paper).  Bottom line: 
inlining can be done in the case of B, but requires more care.  Incidentally, the 
compiler itself needn't worry about this, because it does all of its transformations 
after the dictionary translation (where all the implicit parameters have been 
translated to explicit parameters).

As a side note: interactions between the monomorphism restriction and defaulting can 
also break inlining.  You already have to be careful.

But I don't really want to oversell that point.  I think option C might be nice, 
because it would make it easier to do source code inlining.  But we could also achieve 
this by just abolishing the monomorphism restriction!  So I'm a little torn by C.  I 
see the benefits, but it complicates the already complicated monomorphism restriction, 
and I just can't help but think that if we're going to go that far, we'd be better off 
just getting rid of the monomorphism restriction, and making this whole thing a 
non-issue.  If we're not willing to get rid of the monomorphism restriction, I'd stick 
with B.

--Jeff






Re: implicit-parameters paper

2001-04-25 Thread Jeffrey R. Lewis

S.D.Mechveliani wrote:

 Simon P. Jones mentions some paper on implicit parameters
 in his recent letter on Implicit parameters and monomorphism.

 Please, where to find this paper?

You can slurp one up from here: http://www.cse.ogi.edu/~jlewis/

--Jeff





Re: Syntax for implicit parameters

2001-04-20 Thread Jeffrey R. Lewis

Simon Peyton-Jones wrote:

 I only added 'with' because I did not want to steal *two* new keywords.
 One is bad enough!   I proposed using 'let' (not dlet), with the '?' to
 distinguish dynamic from lexical bindings, but did not achieve
 consensus.


I only added `with' to GHC originally because `dlet' was essentially deprecated 
(although I never bothered to remove it from hugs).


 Lack of consensus = the status quo stays.

 My order of preference:

 1. [happy]. Use 'let'
 2. [consent].  Use 'dlet' or 'with'
 3. [hate]  Use both 'dlet' and 'with'

 Would the Hugs folk be willing to adopt (2)?

That would certainly be fine by me.

--Jeff





Re: Syntax for implicit parameters

2001-04-20 Thread Jeffrey R. Lewis

"Manuel M. T. Chakravarty" wrote:

 "Jeffrey R. Lewis" [EMAIL PROTECTED] wrote,

   Lack of consensus = the status quo stays.
  
   My order of preference:
  
   1. [happy]. Use 'let'
   2. [consent].  Use 'dlet' or 'with'
   3. [hate]  Use both 'dlet' and 'with'
  
   Would the Hugs folk be willing to adopt (2)?
 
  That would certainly be fine by me.

 What exactly does (2) imply?  Does it mean we get `with'
 back or not?

I'm afraid I misspoke.  I meant (2) with `with'.  Sorry ;-)  I'm happy to nuke `dlet'.

--Jeff





Re: GHC from CVS

2001-04-10 Thread Jeffrey R. Lewis

Marcus Shawcroft wrote:

 Ralf Hinze wrote:

  It compiles just fine except that ghci is not built:
   ghci
  ghc-5.00: not built for interactive use
 
  Do I have to specify this explicitly?
 
  Cheers, Ralf

 Hi

 ghci is not built unless you are compiling with ghc 4.11 or better. Try
 rebuilding ghc using the 5.00 version that you have just built.


And to get a usable system, I've found that you have to compile it with
itself again - i.e. compile it three times.  Unless I bootstrap to stage
3, recompilation is hopelessly confused: compile a program from scratch -
no problem, but make a simple modification, then recompile, and you're
likely to get all kinds of strange linker errors.

--Jeff






Re: Yet more on functional dependencies

2001-01-15 Thread Jeffrey R. Lewis

Mark P Jones wrote:

 | I am finding functional dependencies confusing.  (I suspect I am
 | not alone.)  Should the following code work?
 |
 | class HasConverter a b | a -> b where
 |convert :: a -> b
 |
 | instance (HasConverter a b,Show b) => Show a where
 |show value = show (convert value)

 It's a separate issue.  There's no reason why a system using
 functional dependencies *should* support this.  But it is an
 attractive and useful extension that such a system would
 probably *want* to include.  (i.e., it's a desirable feature,
 not a requirement.)

 I typed your example into Hugs on my machine and it seemed to
 accept the syntax (which suggests that the implementation is
 intended to allow this kind of thing).  But then it choked on
 the definition with a curious error message that I suspect is
 an indication of a bug in Hugs' treatment of functional
 dependencies.  And, for that reason, I'm crossposting this
 message to hugs-bugs.  [Let me take the opportunity to remind
 good readers of these lists that it is now more than a year
 since I retired from the joys of maintaining Hugs, so I'm not
 planning to try and track down the source of this bug myself!]
 Of course, it could be that my version of Hugs is out of date!

Could you share the error message please.  Boy, even old maintainers don't know how to 
submit a bug report...  ;-)

Actually, I suspect whatever the problem was has been fixed, because (with -98 +o), it 
compiles fine. (I'm using the bleeding edge, out-of-the-repository version.)

--Jeff





Re: Problem with functional dependencies

2000-12-21 Thread Jeffrey R. Lewis

Simon Peyton-Jones wrote:

 I think you can simplify the example.  Given

  class HasFoo a b | a -> b where
    foo :: a -> b

 instance HasFoo Int Bool where ...

 Is this legal?

  f :: HasFoo Int b => Int -> b
 f x = foo x

 You might think so, since
  HasFoo Int b => Int -> b
  is a substitution instance of
  HasFoo a b => a -> b

This is the step where the reasoning goes wrong.  The functional dependency tells you 
that `b' isn't really a free variable, since it is dependent on `a'.  If you 
substitute for `a', you can't expect `b' to remain unconstrained.

Hugs complains that the inferred type for `f' is not general enough.  It's right to 
complain, but the real problem is that the signature is too general.  A similar 
situation arises if you try to declare an instance `HasFoo Int b', but in this case, 
hugs complains that the instance is more general than the dependency allows.  A useful 
thing to do would be to check for this sort of thing in signatures as well, so that 
the more appropriate error message can be given.
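The point can be sketched with modern extension pragmas (the instance and the body of `foo` are invented for illustration): the dependency `a -> b` forces `b` to `Bool` once `a` is `Int`, so the only usable signature is the fully determined one.

```haskell
{-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies #-}

class HasFoo a b | a -> b where
  foo :: a -> b

instance HasFoo Int Bool where
  foo = (> 0)

-- The signature from the question,
--   f :: HasFoo Int b => Int -> b
-- is too general: the dependency a -> b means b cannot remain free
-- once a is Int.  The signature consistent with the instance is:
f :: Int -> Bool
f = foo

main :: IO ()
main = print (f 3, f (-1))
```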

--Jeff





Re: basicTypes/Var.lhs:194: Non-exhaustive patterns in function readMutTyVar

2000-11-28 Thread Jeffrey R. Lewis

Simon Peyton-Jones wrote:

 Thanks.  I've added both tests to the HEAD.
 Jeff: let us know the outcome re bug-hunt.

Attached is a much shorter demonstration of the first bug.  I don't have the HEAD 
checked out anywhere right now, so it isn't convenient to change it myself.

--Jeff

P.S.  the second bug isn't really a bug (in the sense that we need to hunt for it!) - 
the extension of functional dependencies over derived instances isn't implemented yet!


module Bug where

primDup :: Int -> IO Int
primDup = undefined

dup () = call primDup

class Call c h | c -> h where
    call :: c -> h

instance Call c h => Call (Int -> c) (Int -> h) where call f = call . f



Re: no non-typevariable in instance declarations

2000-11-14 Thread Jeffrey R. Lewis

José Romildo Malaquias wrote:

 On Tue, Nov 14, 2000 at 05:02:30PM +, Malcolm Wallace wrote:
   class C a where
   ty :: a - String
    instance (Num a) => C a where
   ty _ = "NUM"
   instance C Integer where
   ty _ = "Integer"
 
   Why GHC and NHC98 are more restrictive than Hugs?
 
   The instances for (Num a => a) and Integer overlap, and are therefore
  forbidden by Haskell'98.

 But this is not relevant to my question. Removing the instance
 declaration

   instance C Integer where
   ty _ = "Integer"

 from the program (so that there is no instance overlapping now)
 does not help. Both GHC and NHC98 still complains with the
 same diagnostics as before. They are not accepting the
 instance declaration

    instance (Num a) => C a where
   ty _ = "NUM"

 because there is no non-type-variable component in the
 instantiated type "a" above.

 Again, why they have this restrictions while Hugs has not?

GHC doesn't have this restriction either, but since it's not Haskell 98,
you don't get it without some effort ;-).  The following combination of
flags will convince GHC to like your program:

-fallow-overlapping-instances -fallow-undecidable-instances
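In later GHCs the same effect is available per-module with LANGUAGE pragmas; a sketch of the reduced program from the question:

```haskell
{-# LANGUAGE FlexibleInstances, UndecidableInstances #-}

class C a where
  ty :: a -> String

-- An instance headed by a bare type variable needs FlexibleInstances,
-- and a context no smaller than the head needs UndecidableInstances.
instance Num a => C a where
  ty _ = "NUM"

main :: IO ()
main = putStrLn (ty (1 :: Int))
```

Restoring the overlapping `instance C Integer` from the original program would additionally require the overlapping-instances machinery.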

--Jeff





Re: Hugs and Linux

2000-11-10 Thread Jeffrey R. Lewis

[EMAIL PROTECTED] wrote:

 I can't get hugs98 to work under my Linux platform. I have the Red
 Hat distribution 7.0.
 The problem is that hugs requires a file called "readline.so.3" and
 I have "readline.so.4" on my system. Does anyone know how to get
 around this problem??

There will be a release of hugs sometime around the end of the year, I
believe, that will fix all of this.  In the meantime, you can grab an
interim version from the following URLs:

http://www.galconn.com/haskell/hugs98-Jul2000-1.i386.rpm
http://www.galconn.com/haskell/hugs98-Jul2000-1.src.rpm

These rpms were built on a redhat-7.0 system.

--Jeff





Re: Passing an environment around

2000-11-08 Thread Jeffrey R. Lewis

Fergus Henderson wrote:

 On 27-Oct-2000, José Romildo Malaquias [EMAIL PROTECTED] wrote:
  On Fri, Oct 27, 2000 at 09:07:24AM -0700, Jeffrey R. Lewis wrote:
   Yes, as implemented using the dictionary
   translation, implicit parameterization can lead to loss of sharing, exactly in
   the same way that overloading (and HOF in general) can lead to loss of sharing.
  
   However, I can imagine that a compiler might chose to implement implicit
   parameters more like dynamic variables in lisp.   Each implicit param essentially
   becomes a global variable, implemented as a stack of values - the top of the
   stack is the value currently in scope.  This would avoid the sharing problem
   nicely.
 
  I suppose your implementation of implicit parameterization in GHC and Hugs
  uses the dictionary translation, right?

 I believe so.

Sorry - that wasn't clear from my reply - yes, implicit parameters in GHC are compiled
using the dictionary translation.
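The lisp-style scheme discussed above can be sketched with one global stack per implicit parameter (hypothetical code, using unsafePerformIO for the global; nothing like this was actually implemented):

```haskell
import Control.Exception (bracket_)
import Data.IORef
import System.IO.Unsafe (unsafePerformIO)

-- One global stack for a single implicit parameter; the top of the
-- stack is the binding currently in scope.
envStack :: IORef [Int]
envStack = unsafePerformIO (newIORef [])
{-# NOINLINE envStack #-}

-- Entering a binding pushes; leaving it pops, even on exceptions.
-- This preserves sharing: the bound value is computed once, not once
-- per use as the dictionary translation can entail.
withEnv :: Int -> IO a -> IO a
withEnv v = bracket_ (modifyIORef envStack (v :))
                     (modifyIORef envStack tail)

currentEnv :: IO Int
currentEnv = head <$> readIORef envStack

main :: IO ()
main = do
  r <- withEnv 1 (withEnv 2 currentEnv)  -- inner binding shadows outer
  print r
```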



  Would an alternative implementation
  based on a stack of values be viable

 Yes.

  and even done?

 Unlikely ;-)


That's a shame ;-)


 An alternative is to store the values of the implicit parameters in
 thread-local storage rather than global storage.  But this is more
 complicated.  It may also be less efficient on some targets (depending
 on how efficiently thread-local storage is implemented).

I don't know the costs associated w/ thread local storage - how is it likely to compare
w/ the dictionary passing implementation?

--Jeff





Re: False duplicate or overlapping instances message

2000-10-27 Thread Jeffrey R. Lewis

Simon Peyton-Jones wrote:

 | I think I've worked out what's going on now.  But I don't like it.
 | When I use -fallow-undecidable-instances and -fallow-overlapping-instances
 | (as I did) I was assuming (like Keith Wansbrough did) that
 | GHC would do a  Prolog-style backtracking search when it was time to
 resolve
 | an overloading, and would only complain if there were more or fewer than
 one
 | chain of inferences.  Instead Haskell eagerly tries to anticipate possible

 | conflicts, which is a nuisance  when it is obvious (as it is to me in this
 case) that such
 | conflicts are unlikely to arise.  For a simpler example, imagine that we
 have two classes
 | Integral a (things corresponding to integers) and String a
 | (things corresponding to strings).  It is a pity that we cannot write
 |
 | instance Integral a => Show a
 |and
 | instance String a => Show a
 |
 | just because someone may come along later on and try to show
 | something which is an instance of both Integral and String.  (Though
 obviously if
 | they do, we DO need an error message.)

 I agree with this.  Here's GHC's current policy:

 a) when trying to 'solve' a constraint (C T1 .. Tn) it finds an instance
 decl whose head matches, and commits to that.  If two match (and we're
 allowing
 overlap) then it picks the most specific.

 b) to make (a) make sense, GHC complains about overlap, at least
 where one head is not more specific than the other

 A better policy might be

 a') when trying to solve a constraint, search for all solutions, and check
 that in the end there is only one.

 b') never comlain of overlap when declaring instance declarations; instead
 only complain when solving constraints.

 This is no more expensive than (a) in the common no-overlap case,
 but it's more expressive when overlaps exist.

 All this makes perfect sense, I think.  I don't have time to implement
 it right now, but maybe someone else does?  It's a well-separated
 piece of GHC (one module).

This has already been implemented in hugs (option +m - for `multi-instance
resolution') as of about June 99.   See the hugs user manual for details.  It
worked pretty well at the time, but may have suffered some bit-rot - I just
haven't used that option in a long time, and since then there have been a
couple interesting changes to the type checker.

It was a fairly localized change in hugs, and I'd hope it would also be fairly
localized in GHC.

--Jeff





Re: Overloaded function and implicit parameter passing

2000-10-27 Thread Jeffrey R. Lewis

José Romildo Malaquias wrote:

 Hi.

 While experimenting with the implicit parameter
 extension to Haskell 98, implemented in GHC 4.08.1
 and latest Hugs, I came accross a difference among
 those implementations regarding overloading functions
 with implicit parameters.

 As a test consider the program

 - cut here
 module Main where

 class C a where
  f :: (?env :: Integer) => a -> Integer

 instance C Integer where
 f x = ?env + x

 main = putStrLn (show (f (45::Integer) with ?env = 100))
 - cut here

 Hugs accepts this program and outputs 145, as expected.
 But GHC 4.08.1 refuses to compile it, emitting the
 message

 $ ghc -fglasgow-exts Test1.hs -o test1

 Test1.hs:7:
 Unbound implicit parameter `env_rJX :: Integer'
 arising from use of `env_rJX' at Test1.hs:7
 In the first argument of `+', namely `env_rJX'
 In the right-hand side of an equation for `f': env_rJX + x

 Compilation had errors

 Would anybody comment on what is going on with GHC?

 I am willing to use implicit parameters in the
 software I am developing, but I have the need
 to overload functions with implicit parameters.
 While Hugs is good for development, its performance
 may rule it out when the final product is ready.
 So I will need a good Haskell compiler to compile
 my system.

 Any comments?

Certainly a bug.  I'll look at it when I get a chance.

--Jeff





Re: Results: poll: polymorphic let bindings in do

2000-06-06 Thread Jeffrey R. Lewis

John Launchbury wrote:

 Koen,

  If a language has the property that in one place, one can
  use a "let" block to define polymorphic bindings, and in
  another place one can only use it for monomorphic bindings,
  then I think that is bad language design.

 I don't think that's a very accurate description. The "let" in "do"
 is a bit of a different beast from the usual "let..in" construct.
 For a start, it doesn't have an "in" keyword. This is not a trivial
 point: "let"s are guaranteed to introduce polymorphism only
 within the block introduced by "in".

Isn't that a bit of a dodgy argument?  I don't know of any papers on `in'
polymorphism, but many about `let' polymorphism.  If I see `let', I expect
polymorphism, and I'm not going to go searching for an `in'.  (and yes, I
don't like the DMR, but that's another story).

It's been proposed before that we have a separate notation for monomorphic
bindings.  I'd like to see you adopt this approach, rather than having `let'
in `do' behave fundamentally differently.  This example also provides nice
example and motivation for introducing a monomorphic let notation.

 Uses of variables (functions)
 bound by a "let" block are a priori monomorphic within the rest of that
 block. In pactice, of course, the strongly-connected-components pass
 will reconfigure big let-blocks into many separate blocks, and so
 increase the amount of polymorphism. In a "do" block, however, it's
 a lot harder to do the reconfiguration, so we would be stuck with
 the by-default monomorphic behavior. (The current use of "let" in
 "do" does some simplistic reconfiguration -- introducing an "in"
 keyword to obtain polymorphism -- which does not work in the recursive
 case).

The ability to bind multiple things at once (and have the corresponding
increase in polymorphism) is a nice feature of a modern `let' - but it's not
the primary characteristic.  The let in do could easily bind multiple things
as well (by allowing `in').  The current syntax is just an aesthetic choice,
isn't it?
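The monomorphism of monadic bindings, as against `let` in `do`, can be seen directly in any Haskell; a small sketch (the names `f` and `g` are invented for illustration):

```haskell
main :: IO ()
main = do
  let f = id               -- let-bound in do: f is generalized
  print (f True, f 'x')    -- so it can be used at Bool and at Char
  -- g <- return id        -- a `<-` binding is never generalized;
  -- print (g True, g 'x') -- uncommented, this pair would not typecheck
```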

--Jeff





Re: Results: poll: polymorphic let bindings in do

2000-06-06 Thread Jeffrey R. Lewis

John Launchbury wrote:

 Jeff,

  Isn't that a bit of a dodgy argument?  I don't know of any papers on `in'
  polymorphism, but many about `let' polymorphism.  If I see `let', I expect
  polymorphism, and I'm not going to go searching for an `in'.

 Not true (or if true, misguided).

I say dodgy - you say misguided.  OK - a fair enough trade ;-)




  The ability to bind multiple things at once (and have the corresponding
  increase in polymorphism) is a nice feature of a modern `let' - but it's not
  the primary characteristic.  The let in do could easily bind multiple things
  as well (by allowing `in').  The currently syntax is just an aesthetic choice
  isn't it?

 The current "let" in "do" already can bind multiple things. The reason for omitting
 the "in" is explicitly to avoid introducing another scope level (requiring a new 
"do",
 a new indent, etc.). How one expects it to behave depends on the implicit
 assumptions made about all the variables being bound in the "do". Currently,
 even though "do" gives the appearance of introducing many variables all at the same
 declaration level, it actually intoduces a nested sequence of variable bindings,
 each possibly shadowing bindings earlier in the sequence. In this current setting
 it makes sense for "let" in "do" to be thought of having an implicit "in", and so
 its bound variable is polymorphic in the remainder of the "do". In a recursive "do"
 (as in a "let"), all the newly bound variables are actually considered to be in
 a concurrent scope -- as suggested by the layout. In this setting, any variables 
bound
 by "let" are also part of the same scope as the others -- we have one big binding
 group -- hence they are monomorphic when used by other declarations in the same
 binding group.

Aha - the source of confusion.  I hadn't made the leap that you were making do bindings
into one large binding group.  This is a potentially more significant change than just
the polymorphism-of-let issue.

So the problem is how to divvy up the binding group to get useful polymorphism.  What 
if,
for the purposes of doing the dependency analysis, you treated sequential do bindings
(the - type, not let-in-do) as monolithic, then otherwise proceeded normally with the
strongly connected components analysis (taking care, if necessary, to keep do-binding
subgroups otherwise in order).   For current uses of do and let, this should preserve 
the
current polymorphic behaviour, since there won't be any backward dependencies and you
can
just nest the bindings (unless someone uses shadowing, but then they're in trouble 
anyway
;-) .  OK, I'll just go and read the paper, since I'm sure you've already thought about
this...

--Jeff





negate and sections

2000-06-01 Thread Jeffrey R. Lewis

I just ran into this lovely little corner today, and wondering if anyone has
any nice suggestions for fixing it.

You can write the section (+ x) to specify a function to add `x' to
something.  That's great, then you need to specify a function for subtracting
`x' from something.   Great, you just type in: (- x), and you're done, right?

Not so, of course.  (- x) means `negate x'.  Bummer.  What an unpleasant bit of
asymmetry!

Any good ideas for fixing this for Haskell-2?  SML gets around the whole
ambiguity of negation issue by using `-' for subtraction, and `~' for
negation.  Not really ideal, but I must say I have a new-found respect for
that solution.
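For the record, the workaround that survives into current Haskell is `subtract`; a sketch:

```haskell
plus3, minus3 :: Integer -> Integer
plus3  = (+ 3)       -- right section: \y -> y + 3
minus3 = subtract 3  -- \y -> y - 3; (- 3) here would just mean negate 3

main :: IO ()
main = print (map plus3 [1, 2], map minus3 [1, 2])
```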

--Jeff





Re: negate and sections

2000-06-01 Thread Jeffrey R. Lewis

Jan Skibinski wrote:

 On Thu, 1 Jun 2000, Jeffrey R. Lewis wrote:

  No so, of course.  (- x) means `negate x'.  Bummer.  What an unpleasant bit of
  asymmetry!

 How about ((-) x) ?

That, regrettably, is the wrong function.  That function is \y -> x - y.  I wanted
\y -> y - x.

--Jeff





Re: negate and sections

2000-06-01 Thread Jeffrey R. Lewis

Zhanyong Wan wrote:

 "Jeffrey R. Lewis" wrote:

  Jan Skibinski wrote:
 
   On Thu, 1 Jun 2000, Jeffrey R. Lewis wrote:
  
No so, of course.  (- x) means `negate x'.  Bummer.  What an unpleasant bit of
asymmetry!
  
   How about ((-) x) ?
 
  That, regrettably, is the wrong function.  That function is \y -> x - y.  I wanted
  \y -> y - x.

 How about (+ -x) ?

Ah.. OK, that's reasonable enough.  I retract my complaint as being too trivial... I
should know better than to grumble too early in the morning ;-)

BTW, `subtract' doesn't really help, since it's also asymmetrical - there's no
corresponding `add'.  I had a piece of code where one case was adding x, and the other
was subtracting it.  I was simply grumbling that I couldn't write the two cases in a
nice way symmetrically.

--Jeff





Re: Status of Functional Dependencies?

2000-04-06 Thread Jeffrey R. Lewis

Simon Peyton-Jones wrote:

 Jeff Lewis has heroically put in the code for 95% of functional
 dependencies,
 but I don't think he's quite finished.

 Jeff, what's  your plan?


I'd like to have some time to finish this up in the next month - hopefully
before the HIM.  The missing areas are mostly to do with properly propogating
dependencies thru generic instances and superclasses.

--Jeff






Re: overlaps and deduced contexts

2000-02-29 Thread Jeffrey R. Lewis

"S.D.Mechveliani" wrote:

 overlaps with the standard  instance Eq a = Eq [a] ...
 and causes, in the existing implementations, many error reports -
 until the user satisfies everything by typing

   f :: (Eq a,Eq [a],Eq [[a]],Eq [[[a]]],Eq a,
 Eq [a],Eq [[a]],Eq [[[a]]]
)
 => Int -> a -> a -> Int
   f = ...
 And here, naturally,  main = 93, unlike in the single instance case.

 As it was said, this is the simplest way to preserve the correct
 treating of overlapping instances.
 Note:
 * adding only  Eq [[[a]]]  would not suffice
 * if the extra instance shows n
   brackets, the forced type context has to contain about  (n^2)/2
   of them.
 Such a program looks rather awkward.

Oddly enough, this sort of thing just hasn't come up when I use overlapping
instances ;-)
Yes, that's nicely pathological, but is it a real problem?

 --
 Also there is a curious point in the implementors terminology.
 They consider the contexts like
(Eq a, Eq (a,a), Eq [a], Eq [[a]]) =>

 as normal and regular for the *user* program!

The above signature would only be necessary if you had overlapping instances
on pairs *and* on lists *and* on lists of lists *and* that the function in
question used equality on all of those types.  That doesn't sound very normal
or regular ;-)  But seriously, such a program, unless contrived, would be
fairly non-trivial.  In that case, factoring all these things in, such a
context hardly seems such a problem, and I might even consider it good
documentation to the effect that there's a lot of overlapping going on here!


 And then, they discuss when and how such a context can be reduced by
 the compiler.

If such a context was necessary, because of overlapping instances, then the
context *would not* be reduced.  That's the point.  Overlapping instances can
prevent you from performing some context reductions.


 And according to the user's common sense,  Eq a =
 has to be considered as initial and regular, keeping in mind that
 there are *infinitely* many statements, like  Eq [[a]],  that can be
 derived logically and trivially from  Eq a,  according to the
 instances in scope.

But there can only be finitely many upon which you have an overlap.  Saying
that `Eq a' is initial and regular is fine, but it implies that you are going
to derive all other necessary instances from it - which directly contradicts
having overlapping instances.  You seem to want both.



 And then, the *implementor* (not the user) could think what the
 nasty reasons may force the compiler to extend explicitly the
 context with the deduced statements.

Let's take f again, as you've typed it:

f :: Eq a => a -> Bool
f x = Just x == Just x

Given an overlapping instance decl for Eq at type Maybe Bool (and the standard
instance Eq = Eq (Maybe a)), the compiler complains that you've left out `Eq
(Maybe a)' from the context.  Why is this?  Is it because the compiler can't
figure out how to derive `Eq (Maybe a)' from `Eq a'?

No.  It is because it realizes that it *shouldn't* derive `Eq (Maybe a)' from
`Eq a', given the presence of the overlapping instance.  Would it make sense
at this point for the compiler to say, "heck, I don't know why he put `Eq a'
in the context, what he really needs is `Eq (Maybe a)'", and just correct the
programmer's error for him?  If you agree to that, then I'm afraid
you're also stuck with this:

f :: Eq a => a -> a -> Bool
f = (<)

What should the compiler do now?  Should it say "heck, I don't know why he put
`Eq a' in the context, what he really needs is `Ord a'"?  In that case, the
context becomes meaningless in a signature.

--Jeff




Re: overlapping instances

2000-02-24 Thread Jeffrey R. Lewis

"S.D.Mechveliani" wrote:


 The philosophy should be:
 --
 seeing in the program f ["foo","bar","baz"]
 the compiler judges that  f  applies to certain  xs :: [String].
 According to the compiled type of  f,
 the instances  Eq String,  Eq (Maybe String)
 are required. The instance  Eq String  is standard and unique.
 Then, the compiler chooses the most special definition for the  Eq
 instance for  Maybe  among those which are in the scope and which
 match the type  Maybe String.
 So, if the second  Eq  instance for  Maybe  occurs in the scope,
 then, clearly, it is applied.
 And how the scope forms?
 In Haskell-98, any mentioning of a module name M under `import'
 brings to the scope all the instances defined in M.
 Is this how the existing implementations treat the overlaps?

Essentially, yes (assuming the `second instance' is `Eq (Maybe String)').



 Technique:
 --
 the compiler has to solve, what dictionary value to substitute for
 Eq (Maybe a)  declared by the compiled  f.
 According to `philosophy', the choice may be between the above
 dictionaries values from
eqMaybe   :: Eq a -> Eq (Maybe a)
 or eqMaybeString :: Eq (Maybe String).
 The second is chosen as the most special.
 What does this mean "chosen" ?
 The  *compiled  f  has two extra arguments* :
 for the Eq class dictionary values for `a' (corresponds to the `eq'
 local variable) and for `Maybe a'  (`eqMb').
 Now,
   f ["foo","bar","baz"]
 compiles to   f_compiled eqChar eqMaybeString ["foo","bar","baz"].

 Hence, ignoring so far various optimization possibilities, we see
 that  f  is compiled *once*. But each its application requires the
 appropriate dictionary value for the additional argument.
 The overlapping instances cause a non-trivial choice for this
 additional argument - which is resolved quite naturally.
 Is this how  GHC, Hugs  process the overlaps?

GHC and hugs resolve overlaps essentially as you indicated in your
`philosophy' above: you line up all the in-scope instances, and select the
most specific one that matches.  With in-scope instance set:
Eq String
Eq (Maybe String)
Eq a => Eq (Maybe a)
If we're requested to find an instance for `Eq (Maybe String)', the choice is
easy - the second instance wins.

But what if we're requested to find an instance for `Eq (Maybe a)'?  Well,
obviously, we can't choose the second instance because it's *too* specific.
But the third instance looks like a fine choice.  Unfortunately, if we choose
the third instance, then we preclude the possibility of, at a later point,
choosing the more specific second instance.  Recognizing this, the request is
denied, because no choice of instance is appropriate.
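In today's GHC the same most-specific-wins selection can be reproduced with per-instance pragmas rather than the global flag of the time; a sketch with an invented class `Size`:

```haskell
{-# LANGUAGE FlexibleInstances #-}

class Size a where
  size :: a -> Int

instance Size (Maybe a) where            -- generic instance
  size _ = 1

instance {-# OVERLAPPING #-} Size (Maybe String) where
  size _ = 99                            -- more specific; wins when it matches

main :: IO ()
main = print (size (Just "foo"), size (Just True))
```

At `Maybe String` the specific instance is selected; at `Maybe Bool` only the generic one matches.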

The gotcha with this arrangement is the bit about `in-scope instances'.  Since
the set of in-scope instances depends on how one happens to set up the import
list, the meaning of the program can depend on the import list.  In general,
this is a bad thing.  How to resolve this?  I see two approaches: either force
the *compiler* to figure out a way to not depend on the scoping of instances
when resolving overlapping instances, or force the *user* to gets his/her
imports right, so that overlapping is always resolved consistently.  I haven't
explored the first option, but it doesn't look promising.  The second option,
however, I think I know how to do, as I sketched in an earlier post, but it
requires a bit more bookkeeping on the compiler's part.  But I haven't looked
at this idea in detail.



  It's not enough for the compiler to just remember after compiling
  f that it needs an explicit dictionary of Eq (Maybe a) passed, and
  generate different calls to f than to other functions of the same type,
  because a function like
g :: Eq a => ([a] -> [a]) -> [[a]] -> Int
  does not know what kind of f will receive (the one that depends
  on Eq (Maybe a), or possibly Eq (a,Bool) etc.) - it is determined
  at runtime.

 Here, I am about stuck.

The higher-order case isn't any more interesting than the first-order case.
The designers of the type class system were clever enough to avoid this snafu
;-)  Instance resolution is part of static type inference,  it is not
determined at runtime.

In particular, it is not the responsibility of `g' to pass dictionaries to any
parameter `f' - this would be handled at the point where `g' is applied to
`f'.  Parameters, whether higher-order or not, are always monomorphic, and
non-overloaded.  Only `let'/`where' bound idents, or top-level bound idents
can be overloaded.  If we want to think in terms of the dictionary
translation, in the translation of `g f', `f' would be partially applied to
its dictionaries, and then passed to `g'.

g f   -->   \d -> g (f d)

(assuming that `g' itself wasn't overloaded, and that `f' was only overloaded
on one thing).
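The translation in the last line can be written out by hand; a sketch with invented dictionary names:

```haskell
-- A hand-written dictionary-passing translation (all names invented).
newtype EqD a = EqD { eqD :: a -> a -> Bool }

-- overloaded  f :: Eq a => a -> a -> Bool  translates to:
f :: EqD a -> a -> a -> Bool
f d = eqD d

-- g takes a plain, monomorphic function; it passes no dictionaries itself
g :: (a -> a -> Bool) -> a -> a -> Bool
g h x y = h x y

-- at the application site, f is partially applied to its dictionary
main :: IO ()
main = print (g (f (EqD ((==) :: Int -> Int -> Bool))) 1 1)
```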

As for your later question about module `E' - it would be helpful if you'd
provide the 

Re: overlapping instances

2000-02-24 Thread Jeffrey R. Lewis

Marcin 'Qrczak' Kowalczyk wrote:

 The other issue is efficiency if you want f to behave the same way even
 when the instance Eq (Maybe String) is not visible at f's definition.
 It would mean that every overloaded function must be extended to
 directly receive all dictionaries it needs. This can give not only
 very many, but even an infinite number of dictionaries needed (with
 polymorphic recursion). I cannot imagine a sensible implementation
 of the function:
h :: Eq a => a -> a -> Int
 h x y = if x == y then 0 else 1 + h [x] [y]
 which would make h "1" "2" return 7 if in some other module there is:
 instance Eq [[[String]]] where
 x == y = length x == length y


The example with polymorphic recursion is a nice example.  Now, if the `instance Eq
[[[String]]]' *were* in scope, I can imagine how to implement this.  One way would be 
to
unfold the recursion enough steps to accomodate the overlap.  It wouldn't be pretty, 
but
how often does this come up ;-)

Especially given the above example, I don't think that trying to make overlapping 
behave
consistently, regardless of instance scope, is the right approach.  I think it would be
fine if the compiler complained if my imports lead to inconsistent instance scopes.


  Let us put some concrete example. Suppose the module  G  contains
g :: Eq a => ([a] -> [a]) -> [[a]] -> Int
g h (xs:_) =
  let ... something complex
  l = length $ h xs
   
  ys = ...something complex
  in  l+(length ys)
  g  is compiled straightforward.

 So it passes just the Eq a dictionary to h.

Hmm  `h' here is not overloaded - no parameter of a function can be.  Any 
dictionary
passing necessary would happen where `g' was applied to something.  See previous note.


   Unless one accepts that subtle differences in contexts, ones
   depending on the implementation rather than the interface, change
   the meaning. And that definition like "g :: some signature; g = f"
   can cause g to be defined on exactly the same types as f, but with
   a different meaning. And that the same polymorphic function used on
   the same type has different meanings in various places.
 
  First, could you provide an example?
  Maybe, the one constructed from above `f' ...

 It depends on what rules do we have. For example if f was required
 to have the type signature:
f :: (Eq a, Eq (Maybe a)) => [a] -> [a]
 to see the more specific instances, nevertheless:
g :: Eq a => [a] -> [a]
 g = f
 was allowed, then this is an example. It's bad because it's easier to
 spot an error when it is catched at compile time (this is not allowed)
 than when it leads to a change in behavior (something is allowed but
 this is allowed in a different way and the compiler has picked one of
 them).

Urm... in any consistent set of rules, `g' above would result in a type error.  If `f'
(which, BTW, doesn't need the `Eq a' in its signature) is required to have `Eq (Maybe
a)' in its context, then `g' will as well.  This is independent of the instance scoping
issue, because in order to even see `f', `g' would also see the overlap which forced 
`f'
to have that context.

--Jeff




Re: overlapping instances

2000-02-24 Thread Jeffrey R. Lewis

"S.D.Mechveliani" wrote:

  That is, f receives a dictionary of Eq methods on the type a, as
  specified in its type. It builds a dictionary of Eq methods on the
  type Maybe a itself, but the fact that it uses instance Eq (Maybe a)
  is not visible outside.

 No. Probably, here how it should be.
 Seeing   `Just x1 == Just x2'
 the compiler extends *silently* the context for  f:


You are essentially suggesting that contexts should be optional when giving a
signature to a function.  This is an interesting idea, but essentially
orthogonal to overlapping instances.  I.e., making signature contexts optional
would in this case make overlapping instances more convenient, and that's all.

I'm not willing to have this be the default behavior for Haskell, but others
have already proposed that the programmer be allowed to use a wildcard
(perhaps `...') in signature contexts to mean `and whatever else is
inferred'.  If you didn't want to be bothered to figure out the correct
context, you'd just write:
f :: ... => [a] -> [a]

I'll respond to other bits of this long note separately.

--Jeff





Re: overlapping instances

2000-02-17 Thread Jeffrey R. Lewis

Fergus Henderson wrote:

 On 16-Feb-2000, Jeffrey R. Lewis [EMAIL PROTECTED] wrote:
  To my mind, the biggest flaw with overlapping instances is the separate
  compilation issue: to whit, if the `instance Eq (Maybe String)' was in
  a different module, not imported by the module defining `f', then
  Marcin's definition of `f' (with context `Maybe a') would be fine, and
  would not `see' the overlap.   But in a way, this is arguably correct:
  since the designer of `f' didn't forsee the overlap, it shouldn't
  affect the meaning of the function he defined.

 If Haskell had explicit imports and exports of instance declarations,
 then I could perhaps buy this argument.  But it doesn't.

Well, I'm not sure I buy my argument either

So, let's assume that overlaps should always apply consistently across modules,
regardless of the visibility of the overlap, so that the programmer doesn't get
surprises by changing an import list somewhere.  I think that it is possible to
enforce this, with some extra bookkeeping.  The compiler would need to keep
track of what context reductions were applied in each module.  Then, if any
program loaded two modules with inconsistent context reductions, a `link error'
would occur (or maybe just a warning).  I'd be more concrete, except that it is
late ;-)

--Jeff




Re: overlapping instances

2000-02-08 Thread Jeffrey R. Lewis

"Carl R. Witty" wrote:

 "Jeffrey R. Lewis" [EMAIL PROTECTED] writes:

  Marcin 'Qrczak' Kowalczyk wrote:
   Parts of context reduction must be deferred, contexts must be left
   more complex, which as I understand leads to worse code - only to
   make overlapping instances behave consistently, even where they are
   not actually used.
 ...
  When you say:  `even where they are not actually used', I'm not sure what you
  mean.  The deferred reduction only happens on classes with overlap.  Classes
  without overlap will be `eagerly' reduced.

 How can that work?  Given separate compilation, you don't know whether
 a class has overlapping instances or not, do you?

As with context reduction in general, it is always done with respect to what
instances are in scope.  Let's make it concrete:

module T where

class Ick a where ick :: a
instance Ick Int where ick = 17
instance Ick a => Ick [a] where ick = [ick]

module S where
import T

instance Ick [Int] where ick = [13]

If you've only got module T loaded, then ick :: [Int] has value [17].  If you've
got module S loaded, then ick :: [Int] has value [13].

This is why overlapping instances aren't in standard Haskell ;-)  But if you turn
on +o in hugs or -fallow-overlapping-instances in GHC, then you're saying that you
understand this (he asserts boldly...).  But this is manageable by making sure you
load your modules in the right order (and the compiler *could* issue a warning if
you didn't - if it was deemed worthy of the additional bookkeeping).  The problem
that I was describing as being recently fixed, caused overlapping instances to
behave inconsistently even within the same scope of instances.

--Jeff




Re: overlapping instances

2000-02-07 Thread Jeffrey R. Lewis

Marcin 'Qrczak' Kowalczyk wrote:

 Sun, 06 Feb 2000 23:21:38 -0800, Jeffrey R. Lewis [EMAIL PROTECTED] pisze:

  If context reduction choses a more generic instance when a more
  specific one exists, then I consider that a bug.

 http://research.microsoft.com/users/simonpj/Papers/multi.ps.gz
 Section 4.4

 Parts of context reduction must be deferred, contexts must be left
 more complex, which as I understand leads to worse code - only to
 make overlapping instances behave consistently, even where they are
 not actually used.

Parts of context reduction must be deferred, and contexts may be left more
complex.  Yes.  Overlapping instances come with that price.  But the price isn't
really that high - and it's just the usual tradeoff of flexibility verses
`optimality'.

When you say:  `even where they are not actually used', I'm not sure what you
mean.  The deferred reduction only happens on classes with overlap.  Classes
without overlap will be `eagerly' reduced.

--Jeff




Re: bug in ghc-4.06 ?

2000-02-04 Thread Jeffrey R. Lewis

"S.D.Mechveliani" wrote:

 Dear GHC,

 I fear, there is some hard bug in  ghc-4.06.
 On the program

   main = let  p= 5 :: Integer
   iI   = eucIdeal "be" p [] [] [(p,1)]
   r1   = Rse 1 iI dZ
   dK   = upGCDRing r1 eFM
  --upRing
   setK = snd $ baseSet r1 dK
  in   putStr $ shows (osetCard setK) "\n"

 ghc-4.04 behaves correct,
 while  ghc-4.06  prints  "UnknownV"  instead of  "Fin 5".

 upGCDRing r1 eFM   applies   upRing r1 eFM  and then adds to the
 result certain things.
 Replacing  upGCDRing  with  upRing  in the above `main' has to
 preserve the result. And  ghc-4.06  does not preserve it.

 If you want to debug this, tell me to what address to send this bug
 project - 100-300 Kbyte.

I'll pass, but I suspect I know what the bug is.

Consider this little program:


 class C a where c :: a
 class C a => D a where d :: a

 instance C Int where c = 17
 instance D Int where d = 13

 instance C a => C [a] where c = [c]
 instance ({- C [a], -} D a) => D [a] where d = c

 instance C [Int] where c = [37]

 main = print (d :: [Int])

What do you think `main' prints  (assuming we have overlapping instances, and
all that turned on)?  Well, the instance for `D' at type `[a]' is defined to
be `c' at the same type, and we've got an instance of `C' at `[Int]', so the
answer is `[37]', right? (the generic `C [a]' instance shouldn't apply because
the `C [Int]' instance is more specific).

Ghc-4.04 gives `[37]', while ghc-4.06 gives `[17]', so 4.06 is wrong.  That
was easy ;-)  Let's just consult hugs for good measure.  Wait - if I use old
hugs (pre-September99), I get `[17]', and stranger yet, if I use hugs98, it
doesn't even compile!  What's going on!?

What hugs complains about is the `D [a]' instance decl.


 ERROR "mj.hs" (line 10): Cannot build superclass instance
 *** Instance: D [a]
 *** Context supplied: D a
 *** Required superclass : C [a]

You might wonder what hugs is complaining about.  It's saying that you need to
add `C [a]' to the context of the `D [a]' instance (as appears in comments).
But there's that `C [a]' instance decl one line above that says that I can
reduce the need for a `C [a]' instance to the need for a `C a' instance, and
in this case, I already have the necessary `C a' instance (since we have `D a'
explicitly in the context, and `C' is a superclass of `D').

Unfortunately, the above reasoning indicates a premature commitment to the
generic `C [a]' instance.  I.e., it prematurely rules out the more specific
instance `C [Int]'.  This is the mistake that ghc-4.06 makes.  The fix is to
add the context that hugs suggests (uncomment the `C [a]'), effectively
deferring the decision about which instance to use.

Now, interestingly enough, 4.04 has this same bug, but it's covered up in this
case by a little known `optimization' that was disabled in 4.06.  Ghc-4.04
silently inserts any missing superclass context into an instance declaration.
In this case, it silently inserts the `C [a]', and everything happens to work
out.

So, what's the fix?  I think hugs has it right (of course I do ;-).  Here's
why.  Let's try something else out with ghc-4.04.  Let's add the following
line:

d' :: D a => [a]
d' = c

Everyone raise their hand who thinks that `d :: [Int]' should give a different
answer from `d' :: [Int]'.  Well, in ghc-4.04, it does.  The `optimization'
only applies to instance decls, not to regular bindings, giving inconsistent
behavior.

What hugs does is this: like GHC, the list of instances for a given class is
ordered, so that more specific instances come before more generic ones.  For
example, the list might contain:
..., C Int, ..., C a, ...
When we go to look for a `C Int' instance we'll get that one first.  But what
if we go looking for a `C b' (`b' is unconstrained)?  We'll pass the `C Int'
instance, and keep going.  But if `b' is unconstrained, then we don't know yet
if the more specific instance will eventually apply.  GHC keeps going, and
matches on the generic `C a'.  Hugs, on the other hand, at each step, checks
to see if there's a reverse match, and if so, aborts the search.  This
prevents hugs from prematurely choosing a generic instance when a more specific
one may apply at some later point.

If y'all agree that GHC should match hugs on this, it's only about a 4 line
fix - I've tried it out already.  On the other hand, I don't think that this
will make Sergey a happy camper.  Many instance declarations need to be
tweaked.  It's a tedious job, but straightforward.

--Jeff




Re: bug in ghc-4.06 ?

2000-02-04 Thread Jeffrey R. Lewis

Simon Peyton-Jones wrote:

 | If y'all agree that GHC should match hugs on this, it's only
 | about a 4 line
 | fix - I've tried it out already.  On the other hand, I don't
 | think that this
 | will make Sergey a happy camper.  Many instance declarations
 | need to be
 | tweaked.  It's a tedious job, but straightforward.

 I'd buy that, Jeff.  Unless anyone disagrees, why don't you go ahead
 and make the change?  It may require tweaking of not only instance
 decls but also ordinary decls with type signatures, right?  But only
 for people who use overlapping instance decls.

Yes, right on both counts.  And the compiler is kind enough to tell you
exactly what you need to add.



 When you make the fix, could you put the entire text of your message
 in as a comment?  (Edit it if you like, of course, but don't remove
 the examples.  I have slowly come to realise that whenever one makes
 a 2-line fix motivated by a strange case one should put the strange
 case in as a comment. I have often reversed such changes two years later
 because I though they were bogus, only to rediscover the strange case...)

Done.

--Jeff




Re: Functional dependencies

2000-01-30 Thread Jeffrey R. Lewis

Marcin 'Qrczak' Kowalczyk wrote:

 Thank you for ghc-4.06!

 The following code is accepted by Hugs, but ghc complains about type
 variable r being not in scope. Adding "forall m r." causes the error
 "each forall'd type variable mentioned by the constraint must appear
 after the =", where Hugs still accepts it. If I understand functional
 dependencies, they should be legal. With a concrete monad fundeps work.

 class Reference r m | m -> r where
 new :: a -> m (r a)
 put :: r a -> a -> m ()
 get :: r a -> m a

 test :: (Monad m, Reference r m) => m Int
 test = do
 x <- new 42
 get x

Functional dependencies fall under the category of "Not-quite-ready-yet, but
in there nonetheless".  But thanks for the report.  Functional dependencies and
implicit params won't be fully supported 'till a later release (hopefully the
next one).
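For readers with a later GHC, the quoted class does compile once functional dependencies were finished; a sketch against a concrete `IORef` instance:

```haskell
{-# LANGUAGE MultiParamTypeClasses, FunctionalDependencies #-}
import Data.IORef

class Reference r m | m -> r where
  new :: a -> m (r a)
  put :: r a -> a -> m ()
  get :: r a -> m a

instance Reference IORef IO where
  new = newIORef
  put = writeIORef
  get = readIORef

test :: (Monad m, Reference r m) => m Int
test = do
  x <- new (42 :: Int)
  get x

main :: IO ()
main = test >>= print  -- the dependency m -> r determines r = IORef from m = IO
```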

--Jeff




Re: FFI.lhs

2000-01-28 Thread Jeffrey R. Lewis

Sven Panne wrote:

 "Jeffrey R. Lewis" wrote:
  Currently (out of CVS), a compile of `hslib/lang' fails on FFI.lhs:
  FFI.lhs:119: Data constructor not in scope: `SIZEOF_CHAR'
  FFI.lhs:120: Data constructor not in scope: `ALIGNMENT_CHAR'
  [...]

 Some parts of the configuration mechanism changed. Did you invoke
 autoheader and autoconf before configure?

This fixes the problem.

--Jeff




CVS repository

1999-12-16 Thread Jeffrey R. Lewis

Greetings:

The Hugs/GHC CVS server (cvs.haskell.org) has just been moved to a
new machine.  Everything should be the same, except that  ssh may give a
dire warning message that starts off like this:

@@@
@   WARNING: HOST IDENTIFICATION HAS CHANGED! @
@@@
IT IS POSSIBLE THAT SOMEONE IS DOING SOMETHING NASTY!

Deleting the entry for cvs.haskell.org (or glass.cse.ogi.edu) from your
.ssh/known_hosts file should make ssh be more calm.

--Jeff




Re: Dynamic scopes in Haskell

1999-12-01 Thread Jeffrey R. Lewis

"Ch. A. Herrmann" wrote:

 I just had a quick look at the following, which I found at the
 page "http://www.cse.ogi.edu/PacSoft/projects/Hugs/hugsman/exts.html"
 for dynamic scoping:

    min :: [a] -> a
    min  = least with ?cmp = (<=)

 Actually, I'm not sure how referential transparency can be established
 with these implicit parameters. Assume min to be called at two places
 with a different value for cmp, but with the same input list. Or is it
 the case that the type a is bound to a particular cmp all over the program?

 Please note that referential transparency is one main advantage Haskell
 has in contrast to other languages.

A paper on implicit parameters will appear in POPL'00 that will answer this
sort of question.  I'll post a reference to it later when I get a chance to
put it on my web page.

But in short, for `min' to be called in two places, within the same lexical
scope, with different bindings for `?cmp' implies that there's an intervening
`with' binding in that same lexical scope.  To avoid the problem you allude
to, we extended substitution, particularly through `with' bindings, to preserve
dynamic scoping (i.e. to prevent dynamic capture).  Thus referential
transparency is retained, you just have to be careful when you substitute -
just as you have to be careful with regular static name capture.
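
[For readers coming to this later: the `with' syntax became ordinary `let' bindings for implicit parameters in GHC. A minimal sketch of the two-call-sites situation above, assuming GHC's `ImplicitParams` extension; the helper names are illustrative, with `least` as in the Hugs manual example:]

```haskell
{-# LANGUAGE ImplicitParams #-}

-- `least` picks the "smallest" element relative to whatever comparison
-- the dynamic scope supplies through the implicit parameter ?cmp.
least :: (?cmp :: a -> a -> Bool) => [a] -> a
least = foldr1 (\x y -> if ?cmp x y then x else y)

-- Two call sites binding ?cmp differently: each use of `least` sees only
-- its own dynamically scoped binding, so there is no dynamic capture
-- across the two scopes and referential transparency is preserved.
smallest, largest :: Ord a => [a] -> a
smallest xs = let ?cmp = (<=) in least xs
largest  xs = let ?cmp = (>=) in least xs
```

Here `smallest [3,1,2]` is 1 while `largest [3,1,2]` is 3, even though both go through the same `least`.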

--Jeff




Re: Dynamic scopes in Haskell

1999-12-01 Thread Jeffrey R. Lewis

Simon Peyton-Jones wrote:

 | - Does other Haskell implementations (ghc, nhc, hbc, ...)
 |   would provide this extension in next releases? (This way,
 |   even been an extension, my system would be portable)

 Jeff Lewis is well advanced with adding functional dependencies
 into GHC; I believe that he plans then to add dynamic scopes, which
 are elegantly expressed in terms of functional dependencies.


I'll quibble a little: implicit parameters can be implemented elegantly
by borrowing some of the mechanism of functional dependencies.  Thus,
I'm implementing FDs in GHC first, then adding implicit parameters.  But
it's not the case that once you have functional dependencies, you've got
implicit parameters for free.

--Jeff




Re: FW: Compiling GHC

1999-08-22 Thread Jeffrey R. Lewis

Mark Utting wrote:

 Simon wrote:
  Can anyone help with this?  Simon and Sigbjorn are both
  on holiday, and I am wonderfully ignorant about such things.

  John McCarten wrote:
  I recently emailed you concerning the installation of GHC,
  I have now managed to install and configure to some degree the
  system, however it 'compiles' a haskell script but fails when
  trying to import the library gmp, giving the message:...

 [I don't know if this will be much help, but...]
 I've had a lot of trouble trying to install GHC 4.04 under
 Linux too, both from the binary distribution and the source one.
 This gmp problem was fairly easy to fix, I just hunted
 around the (source) directory tree and found that gmp is actually
 included in the GHC distribution and that libgmp.a had been built,
 but was just not in the right path for the linker to find it.
 Copying that .a file into the directory where ghc was being linked
 fixed the problem.  So this looks like a problem with the Makefiles.


If you are using a redhat distribution, the solution is even easier:  install
the rpm `gmp' (you might also need `gmp-devel').  Both of these are in the
standard 6.0 distribution.


 But next I found that when you try to link the standard hello world
 Main.hs, the linker complains about a missing
 'data_start' that is called from the GHC libraries.  The only
 "data_start" we can find is an unsigned long in an Alpha/OSF a.out
 header file, but it is not obvious how/why it is in the ghc library
 for an i386 linux distribution.
 We conjecture that the distribution was either cross-compiled on an Alpha,
 or something else happened which mixed some Alpha stuff into the
 Linux binary distribution?


Hmm... FWIW, (on a redhat 6.0 system), I had zero problems either using the
binary dist, or compiling from source (I built using the 4.04 binary - sorry,
not much help, I guess!).  Checking my hsc binary, I have both symbols
`__data_start' and `data_start' defined.

--Jeff



Re: Partial Type Declarations

1999-01-16 Thread Jeffrey R. Lewis

Claus Reinke wrote:


 what I wanted to write was
 a partial specification, which I would want to write as

   foo := C a => a -> b

 Read e := t as "the type of e should be bounded by t". Choose a better
 symbol if you like -- the idea is to keep the syntax of types unchanged,
 and to introduce a language-level symbol for the existing partial
 ordering of constrained type schemes instead.

 Wouldn't this be simpler than introducing new concepts such as type
 "skeletons" or "metaquantifications"?


I really like the spirit of Claus' suggestion.  But it isn't as flexible as
the previous ones involving `metaquantifications'.  With these
`metaquantifications', you can construct tighter bounds.  In other words,
Claus' suggestion is roughly equivalent to making all type variables be
`meta' (the right-hand side below doesn't express that the context may have
more elements in it):

foo := C a => a -> b  roughly equiv to foo :: C _a => _a -> _b

I can easily imagine that you might want some variables to be a bound, and
others to be exact, as in

foo :: C a => a -> _b

I don't think the above can be expressed with Claus' proposal.

--Jeff




Re: Partial Type Declarations

1999-01-16 Thread Jeffrey R. Lewis

Wolfram Kahl wrote:

 "Jeffrey R. Lewis" [EMAIL PROTECTED] writes:
  
   foo := C a => a -> b  roughly equiv to foo :: C _a => _a -> _b
  
   I can easily imagine that you might want some variables to be a bound, and
   others to be exact, as in
  
   foo :: C a => a -> _b
  
   I don't think the above can be expressed with Claus' proposal.
  
  

 You could, by being more explicit than usual:

  foo := forall b. C a => a -> b

Glad to be shown wrong ;-)  This is quite reasonable.  With ::, the default for a
type variable is an implicit `forall' on the outside.  With :=, the default for
a type variable is that it is a bound (a `metavariable' in earlier posts).  A
side comment.  What Claus' post made clear to me is that we're not talking about
any `new' concepts here (a `metavariable' is just the usual notion of an
unquantified type variable), we're really just looking for a nice notation for
old concepts that our language doesn't allow us to write down.

Anyway, the only thing missing now in the above proposal is a similar flexibility
with contexts.  Say, you want `b' to be a bound, and thus use :=, but you want
the context to be exact (i.e. you don't want extra context elements to be
allowed).  I can't think of any particularly compelling notation for "the context
is `C a' and no more".  In this case, the combination of allowing `...' to extend
contexts, and _a (or similar annotation) for unquantified type variables seems
more natural.

--Jeff




Re: H/Direct 0.16

1999-01-03 Thread Jeffrey R. Lewis

"Sigbjorne Finne (Intl Vendor)" wrote:

 Hi,

 if you compile the contents of lib/ with FOR_HUGS
 set to YES, you shouldn't run into either of these, e.g.,

   sof$ make FOR_HUGS=YES AddrBits.hs
   ../src/ihc -fno-qualified-names  --hugs  -fno-imports\
  -fint-is-int  -c AddrBits.idl -o   AddrBits.hs
   sof$ grep HDirect AddrBits.hs
   sof$

 That said, there are no Makefile rules for building
 the dynamic libraries containing the Hugs primitives
 other than in the Win32 case. If someone could
 contribute these, I'd be more than happy to
 integrate them ;)

For what it's worth, here are the patches that I applied to a recent
(couple of month old, out of CVS) version of H/Direct to get it to build
on linux.  I haven't tried these on the current release.

--Jeff


*** fptools.orig/hdirect/lib/Makefile   Fri Jul 23 03:55:05 1999
--- fptools/hdirect/lib/MakefileMon Aug  2 10:35:10 1999
***
*** 5,10 
--- 5,11 
  
  # 
  FOR_HUGS = NO
+ dll = so
  
  all ::
  
***
*** 68,75 
--- 69,81 
  HS_SRCS  += $(COMLIB_HS_SRCS)
  endif
  
+ ifeq "$(BUILD_COMLIBS)" "NO"
+ IDL_SRCS = $(filter-out AutoPrim.idl ComPrim.idl Connection.idl SafeArray.idl TypeLib.idl WideString.idl, $(wildcard *.idl))
+ else
  IDL_SRCS = $(wildcard *.idl)
+ endif
  #IDL_SRCS = $(filter-out TypeLib.idl, $(wildcard *.idl))
+ IDL_DLLS = $(filter-out StdTypes.idl, $(IDL_SRCS))
  
  #
  # Only support building of server-modules under mingw32 at the moment,
***
*** 125,131 
  
  INSTALL_DATAS = $(HS_IFACES) $(SRCS)
  
! OBJS += $(patsubst %.idl, %.o, $(IDL_SRCS))
  
  LIBCOM_OBJS += $(filter-out PointerSrc.o AddrBits.o, $(OBJS)) PointerSrcCom.o 
  LIBHD_OBJS   = HDirect.o PointerSrc.o PointerPrim.o Pointer.o
--- 131,137 
  
  INSTALL_DATAS = $(HS_IFACES) $(SRCS)
  
! OBJS += $(patsubst %.idl, %.o, $(IDL_DLLS))
  
  LIBCOM_OBJS += $(filter-out PointerSrc.o AddrBits.o, $(OBJS)) PointerSrcCom.o 
  LIBHD_OBJS   = HDirect.o PointerSrc.o PointerPrim.o Pointer.o
***
*** 141,147 
  endif
  
  ifeq "$(FOR_HUGS)" "YES"
! all :: $(patsubst %.idl, %.hs, $(IDL_SRCS))
  
  all :: dlls
  
--- 147,153 
  endif
  
  ifeq "$(FOR_HUGS)" "YES"
! all :: $(patsubst %.idl, %.hhs, $(IDL_SRCS))
  
  all :: dlls
  
***
*** 156,169 
  %.hi : %.lhs
@:
  
  AddrBits.c : AddrBits.hs
  PointerPrim.c : PointerPrim.hs
  
! AddrBits.dll : AddrBits.dll_o
! AddrBits.dll : AddrBitsPrim.dll_o
  WideString.c : WideString.hs
  
! C_STUBS = $(patsubst %.idl, %.c, $(IDL_SRCS))
  
  HS_SRCS:=
  
--- 162,178 
  %.hi : %.lhs
@:
  
+ %.hhs : %.hs
+   mv $< $@
+ 
  AddrBits.c : AddrBits.hs
  PointerPrim.c : PointerPrim.hs
  
! AddrBits.${dll} : AddrBits.dll_o
! AddrBits.${dll} : AddrBitsPrim.dll_o
  WideString.c : WideString.hs
  
! C_STUBS = $(patsubst %.idl, %.c, $(IDL_DLLS))
  
  HS_SRCS:=
  
***
*** 175,187 
@$(RM) $@
	$(CC) $(CC_OPTS) -DCOM -c $< -o $@
  
! AddrBits.dll: AddrBits.dll_o AddrBitsPrim.dll_o
! AutoPrim.dll: AutoPrimSrc.dll_o AutoPrim.dll_o PointerSrcCom.dll_o ComPrimSrc.dll_o
! ComPrim.dll : ComPrim.dll_o ComPrimSrc.dll_o PointerSrcCom.dll_o
! PointerPrim.dll : PointerPrim.dll_o PointerSrcCom.dll_o
! SafeArray.dll   : SafeArray.dll_o
! StdTypes.dll: StdTypes.dll_o
! WideString.dll  : WideString.dll_o WideStringSrc.dll_o
  
  .PRECIOUS: %.dll_o
  
--- 184,206 
@$(RM) $@
	$(CC) $(CC_OPTS) -DCOM -c $< -o $@
  
! AddrBits.${dll}: AddrBits.dll_o AddrBitsPrim.dll_o
! AutoPrim.${dll}: AutoPrimSrc.dll_o AutoPrim.dll_o
! ifeq "$(BUILD_COMLIBS)" "NO"
! AutoPrim.${dll}: PointerSrc.dll_o
! else
! AutoPrim.${dll}: PointerSrcCom.dll_o ComPrimSrc.dll_o
! endif
! ComPrim.${dll} : ComPrim.dll_o ComPrimSrc.dll_o PointerSrcCom.dll_o
! PointerPrim.${dll} : PointerPrim.dll_o
! ifeq "$(BUILD_COMLIBS)" "NO"
! PointerPrim.${dll} : PointerSrc.dll_o
! else
! PointerPrim.${dll} : PointerSrcCom.dll_o
! endif
! SafeArray.${dll}   : SafeArray.dll_o
! StdTypes.${dll}: StdTypes.dll_o
! WideString.${dll}  : WideString.dll_o WideStringSrc.dll_o
  
  .PRECIOUS: %.dll_o
  
***
*** 189,198 
@echo EXPORTS  $@
@echo initModule  $@
  
! %.dll : %.dll_o
$(CCDLL) $(CCDLL_OPTS) -o $@ $^ $(CCDLL_LIBS)
  
! dlls :: $(patsubst %.idl, %.dll, $(IDL_SRCS))
  
  endif
  # End of Hugs specific bit
--- 208,217 
@echo EXPORTS  $@
@echo initModule  $@
  
! %.${dll} : %.dll_o
$(CCDLL) $(CCDLL_OPTS) -o $@ $^ $(CCDLL_LIBS)
  
! dlls :: $(patsubst %.idl, %.${dll}, $(IDL_DLLS))
  
  endif
  # End of Hugs specific bit



hugs patches

1998-12-08 Thread Jeffrey R. Lewis

The following patches are for ghc/interpreter, and fix some problems due
to the recent change in Weak/Foreign stuff.

--Jeff

In ghc/includes:

*** Assembler.h 1998/12/07 21:33:20 1.1
--- Assembler.h 1998/12/07 21:33:47
***
*** 118,123 
--- 118,124 
PTR_REP  = 'P',
ALPHA_REP= 'a',  /* a*/
BETA_REP = 'b',  /* b  */
+   GAMMA_REP= 'c',  /* c  */
BOOL_REP = 'B',  /* Bool   */
IO_REP   = 'i',  /* IO a   */
	HANDLER_REP  = 'H',  /* Exception -> IO a  */

In ghc/interpreter:

===
RCS file: RCS/type.c,v
retrieving revision 1.1
diff -c -r1.1 type.c
*** type.c  1998/12/08 17:28:42 1.1
--- type.c  1998/12/08 17:30:23
***
*** 2347,2352 
--- 2347,2353 
  static Type stateVar = NIL;
  static Type alphaVar = NIL;
  static Type betaVar  = NIL;
+ static Type gammaVar = NIL;
  static Int  nextVar  = 0;

  static Void clearTyVars( void )
***
*** 2381,2386 
--- 2382,2395 
  return betaVar;
  }

+ static Type mkGammaVar( void )
+ {
+ if (isNull(gammaVar)) {
+ gammaVar = mkOffset(nextVar++);
+ }
+ return gammaVar;
+ }
+
  static Type local basicType(k)
  Char k; {
  switch (k) {
***
*** 2445,2450 
--- 2454,2461 
  return mkAlphaVar();  /* polymorphic */
  case BETA_REP:
  return mkBetaVar();   /* polymorphic */
+ case GAMMA_REP:
+ return mkGammaVar();  /* polymorphic */
  default:
  printf("Kind: '%c'\n",k);
  internal("basicType");

In ghc/lib/std:

*** PrelHandle.lhs  1998/12/07 22:11:22 1.1
--- PrelHandle.lhs  1998/12/08 17:45:51
***
*** 47,58 
  #define CCALL(fun) _ccall_ fun
  #define const_BUFSIZ ``BUFSIZ''
  #define primPackString
  #ifndef __PARALLEL_HASKELL__
  #define FILE_OBJECT   ForeignObj
  #else
  #define FILE_OBJECT   Addr
  #endif
- #endif

  \end{code}

--- 47,59 
  #define CCALL(fun) _ccall_ fun
  #define const_BUFSIZ ``BUFSIZ''
  #define primPackString
+ #endif
+
  #ifndef __PARALLEL_HASKELL__
  #define FILE_OBJECT   ForeignObj
  #else
  #define FILE_OBJECT   Addr
  #endif

  \end{code}

***
*** 151,158 
  freeFileObject :: ForeignObj -> IO ()
  freeFileObject fo = CCALL(freeFileObject) fo
  #else
! foreign import stdcall "./libHS_cbits.dll" "freeStdFileObject"
   freeStdFileObject :: ForeignObj -> IO ()
! foreign import stdcall "./libHS_cbits.dll" "freeFileObject"
   freeFileObject :: ForeignObj -> IO ()
  #endif
  \end{code}

--- 152,159 
  freeFileObject :: ForeignObj -> IO ()
  freeFileObject fo = CCALL(freeFileObject) fo
  #else
! foreign import stdcall "libHS_cbits.so" "freeStdFileObject"
   freeStdFileObject :: ForeignObj -> IO ()
! foreign import stdcall "libHS_cbits.so" "freeFileObject"
   freeFileObject :: ForeignObj -> IO ()
  #endif
  \end{code}






Re: Haskell 98 progress...

1998-11-13 Thread Jeffrey R. Lewis

Hans Aberg wrote:

 At 10:40 -0800 1998/11/13, Jeffrey R. Lewis wrote:
   Say you've got some code that wasn't originally
 monadic, and you now need to re-express your code in monadic form.  You
 apply the monad translation. Using the `kleisli' functions makes this
 process simpler.  Consider:
 

 It is a fact that every such monadic modification also have such a monadic
 lifting. Could you not implement this fact in such a way that people need
 not bother writing out the details for every specific case?

Do you mean: wouldn't it be nice if haskell had a way to do the monad
translation automatically?  Yes - I think it would be nice.  The `do' notation
provides a partial solution in that it is a monad translation of let
expressions.  Something more ambitious that I've often longed for would be full
monadic reflection (some kind of brackets to enclose monadic computations and a
quoting mechanism to escape them).
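
[A small concrete instance of the translation under discussion, in plain Haskell; the names are illustrative, not from the original thread. A pure let-pipeline and its monadic image, with Kleisli composition `(>=>)` doing the plumbing that `do` spells out by hand:]

```haskell
import Control.Monad ((>=>))

-- Pure code written as a chain of lets.
pipeline :: Int -> Int
pipeline x = let y = x + 1
                 z = y * 2
             in z

-- Its image under the monad translation: each step becomes a Kleisli
-- arrow a -> m b, and the let chain becomes (>=>) / do notation.
stepA, stepB :: Monad m => Int -> m Int
stepA x = return (x + 1)
stepB y = return (y * 2)

pipelineM :: Monad m => Int -> m Int
pipelineM = stepA >=> stepB
```

For any monad, `pipelineM` agrees with `pipeline`; e.g. `pipelineM 5 :: Maybe Int` is `Just 12`, matching `pipeline 5 = 12`.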

--Jeff





Re: Default Default

1998-11-05 Thread Jeffrey R. Lewis

Philip Wadler wrote:

 You are right, beginners may stub their toe on factorial.  On the other
 hand, with the default set to Integer, beginners may stub their toe
 on bad performance.  Anyone familiar with, say, C, will be unsurprised
 by the need to switch Int to Integer to get factorial to work, but will
 be very surprised by the need to switch Integer to Int to get decent
 performance.  -- P

What beginner to haskell writes programs that depend heavily upon integer
performance?  I suspect few.  Who here would argue that most of the
performance hits of using functional programming have to do with integer
performance?  A beginning haskell programmer driven away by poor
performance probably "stubbed his/her toe" on one, or several, of the many
ways that Haskell makes available to the programmer for writing slow
programs, not integer performance.

Despite this, however, I  agree that unless we essentially remove Int from
the prelude, it's better to be consistent (i.e. not have a bunch of common
functions return Int, but have Integer be the default).  I agree with
Simon that now is not the time to completely shake up the prelude, so I
guess I'm with keeping the default as Int.

--Jeff






undocumented feature in GHC-4.00?

1998-10-13 Thread Jeffrey R. Lewis

When attempting to reconstruct the syntax for existential
quantification, I tried:

newtype Groo a = Groo (Either a b)

To my surprise, using ghc-4.00, this worked - without even using
`-fglasgow-exts'.  (it doesn't work, with or without `-fglasgow-exts'
under 3.02)

Then I read the release notes ;-)  These told me about using `forall' on
`data' declarations.  But the above works, and yields the type I was
expecting, i.e. the .hi file sez:

newtype Groo $r3r = Groo (_forall_ [$r3u] => PrelEither.Either $r3r $r3u) ;

Is this a feature or a bug?

--Jeff




Re: undocumented feature in GHC-4.00?

1998-10-13 Thread Jeffrey R. Lewis

Simon Peyton-Jones wrote:

  When attempting to reconstruct the syntax for existential
  quantification, I tried:
 
  newtype Groo a = Groo (Either a b)
 
  To my surprise, using ghc-4.00, this worked - without even using
  `-fglasgow-exts'.  (it doesn't work, with or without `-fglasgow-exts'
  under 3.02)

 Nothing about existentials here.   GHC is universally quantifying
 over the 'b'.  It's just as if you'd written

 newtype Groo a = Groo (forall b. Either a b)


Indeed - it wasn't the type I was casting about for - I was looking for
how to express:
newtype Groo a = forall b. Groo (Either a b)

This form of quantification seems to only be supported for `data' decls -
is there a reason we can't also do it with `newtype'?


 Perhaps we shouldn't do implicit universal quantification here?

  Is this a feature or a bug?



The confusion on my part was that this form of implicit universal
quantification seems to be an undocumented feature.  In the release notes,
the only comment about implicit quantification is this:


 Notice that you don't need to use a forall if there's an
 explicit context. For example in the first argument of the
 constructor MkSwizzle, an implicit "forall a." is prefixed to
 the argument type. The implicit forall quantifies all type
 variables that are not already in scope, and are mentioned in
 the type quantified over

--Jeff




Multi-param instancing bug

1998-09-18 Thread Jeffrey R. Lewis

GHC 3.02, outa-tha-box.  In the program given below, compiled by:

% ghc  -fglasgow-exts  -c tt.hs -o tt.o

I get the complaint:

tt.hs:22:
No instance for `Conditional Bool [Bool]'
(arising from use of `ifc' at tt.hs:22)

I say there is an instance, given by the last instance decl in the
file.  In fact, if I try to add the following, unconditional instance
decl:

instance Conditional Bool [Bool] where
    ifc c x y = map (uncurry (ifb c)) (dist2 x y)

then it gives me the additional complaint:

tt.hs:1:
Duplicate or overlapping instance declarations
for `Conditional Bool [Bool]' at tt.hs:23 and tt.hs:20

Oi!  There's either no instances, or there's too many -- what's it going
to be! ;-)

--Jeff

--

module Buggo where

class Boolean b where
    ifb :: b -> b -> b -> b

instance Boolean Bool where
    ifb x y z = if x then y else z

class Conditional c a where
    ifc :: c -> a -> a -> a
    ifc_bogus :: c -> a

class Dist2 f where
    dist2 :: f a -> f b -> f (a, b)

instance Dist2 [] where
    dist2 = zip

instance (Boolean b, Dist2 f, Functor f) => Conditional b (f b) where
    ifc c x y = map (uncurry (ifb c)) (dist2 x y)

t x y z = ifc (x :: Bool) (y :: [Bool]) z




H/Direct buggle

1998-08-13 Thread Jeffrey R. Lewis

Using an H/Direct CVS snapshot from several days ago:

Using ihc -c:

The generated greed-card code imports the prelude qualified, but
references Prelude entities unqualified in %fun decls.  Example fragment
(see the last line):

module BDD
   ( Bdd
   ...
   ) where
import StdDIS
import qualified Prelude
import qualified Addr (Addr)
import qualified HDirect (marshallptr, unmarshallptr)

type Bdd = Addr.Addr
type Bdd_manager = Addr.Addr
bdd_one :: Bdd_manager -> Prelude.IO (Bdd)
bdd_one bddm =
  do
    bddm <- HDirect.marshallptr bddm
    o_bdd_one <- primbdd_one bddm
    o_bdd_one <- HDirect.unmarshallptr o_bdd_one
    Prelude.return (o_bdd_one)
%fun primbdd_one :: Addr -> IO Addr
...

--Jeff




compiling h/direct

1998-08-10 Thread Jeffrey R. Lewis

Compilation fails when entering the lib subdirectory because:

make[1]: Entering directory `/home/src/hdirect-230698/lib'
Makefile:4: ../mk/boilerplate.mk: No such file or directory

And indeed, there is no mk directory anywhere in the distribution.

--Jeff




Re: some Standard Haskell issues

1998-08-07 Thread Jeffrey R. Lewis

[EMAIL PROTECTED] wrote:

 * import and infix declarations anywhere in a module?

   I am against this proposal.  Collecting all such declarations
   at the head of the module is better for human readers.  Allowing
   them anywhere would also complicate and slow down program analysis
   that only requires module dependencies (eg. Haskell-specific `make'
   tools).


It would seem that if we allow infix decls anywhere, shouldn't we be
loosening up the location of import decls also?

Simon's proposal doesn't mention such key issues as what would be the
scope of an infix decl in the middle of a file.

 * layout rules

   A precise specification of the layout rules is clearly desirable.
   But the proposed change `relaxing' the rule for if-then-else seems a
   bit funny.  Isn't it at odds with the original ISWIM motivation for
   having layout rules at all?


I thought the if-then-else proposal seemed odd until I followed the link
and read the exact proposal.  Simon: your if-then-else example on the
Standard Haskell page seems at odds with the actual proposal (e.g. isn't
the point that the `else' itself needn't be indented?)

 * monomorphism restriction

   (Last but hardly least!)   Surely MR qualifies as a trap that
   it would be nice to clean up.  It takes three pages to explain in
   the 1.4 report, and there is plenty of evidence that programmers
   still fall over it frequently.

   Would it be too much/little to require all declaration groups in an
   exporting module to be unrestricted -- a straightforward syntactic
   condition?

  Personally, I'd like to junk the MR, though I don't quite follow your suggestion.

--Jeff





Re: Felleisen on Standard Haskell

1998-08-04 Thread Jeffrey R. Lewis



   That's just what I intend to do.  I don't see Std Haskell as a big
   deal, but even little deals are worth completing rather than
   leaving as loose ends... and I'm more optimistic than Paul about
   the usefulness of Std Haskell.  I would be happy to find a name
   that was less grand and final-sounding than 'Standard Haskell' though;
   but more final sounding than 'Haskell 1.5'.


Let's drop the name Standard Haskell for now.  I think Haskell 1.5 sounds
just fine, and is much truer to what's actually going on, given Haskell 2
on the horizon.  The name doesn't even need to sound definitive.  The point
is that *we* consider 1.5 a standard, and pledge to support it in our
implementations.  The users of it aren't going to care what the name is, as
long as the textbook examples work ;-)

--Jeff





Re: Scoped typed variables.

1998-07-22 Thread Jeffrey R. Lewis

Alex Ferguson wrote:

  I think the way that Hugs 1.3c handles it would meet your goals.  All that
  it requires is a strict extension to the syntax for patterns to allow type
  annotations.  These can be useful in their own right, but also can be
  applied to problems like the one that you gave:
 
   f :: [a] -> a -> [a]
   f ((x::a):xs) y = g y
      where
        g :: a -> [a]
        g q = [x,q]

 AKA "Proposal A" in SPJ's recent message on this topic:

 http://www.cs.chalmers.se/~rjmh/Haskell/Messages/Display.cgi?id=274

 I think "A" is fine, it's "B" (and hence, SPJ's Composite Motion, A+B)
 that worries me, for the reasons I alluded to.  If "beefed up A"
 does the job, I'm equally happy as with a more conservation syntax for
 "B".

Just as a sanity check, following an augmented proposal "A" where we can also
annotate the return type as well, consider these:

f :: a -> (a -> a) -> a
f x = \g -> (g :: a -> a) x

f (x :: a) :: (a -> a) -> a = \g -> (g :: a -> a) x

Which of these two is correct, and why?  Why not both?

Next check.  Consider these:

f g = ... (g :: a -> a) ...
f (g :: a -> a) = ... g ...

Which of these is correct, and why?  Why not both?

--Jeff





Re: Scoped typed variables.

1998-07-22 Thread Jeffrey R. Lewis

Ralf Hinze wrote:

 One could also argue that the culprit is Haskell's interpretation of
 type variables of which Report (p. 34) says: `[...] the type variables
 in a Haskell expression are all assumed to be universally quantified
 [..]'. Here is an even more irritating list of possibilities ...

 and so forth ... A solution could be to consider the type variables
 universally quantified at the _outermost_ possible level (currently
 it's the innermost). So `f e1 ... en = e' means `forall a1 .. am.f e1
 ... en = e' where the `ai's are the type variables occuring free in the
 definition. If we had explicit syntax for universal quantification
 (which I consider absolutely necessary) the former interpretation could
 be recovered using explicit quantifiers: ... (f :: forall a. a -> a)
 ...


This sounds great, but it could break old code, of course.  This would have a
different type under Ralf's proposal than Haskell 1.4:
(id :: a -> a, id :: a -> a)
However, I think something like it is the only sane way to go.  Whatever we do, type
variables should scope consistently.  With proposal "A" as is (such that it wouldn't
break old code, and just like in hugs 1.3c), a type variable would scope differently if
it was in a pattern versus being in an expression.  Ralf's proposal fixes that nicely,
and I don't think the cost in old code here would be very high.

--Jeff





Re: Monomorphism

1998-07-15 Thread Jeffrey R. Lewis

John C. Peterson wrote:

 While some may argue that avoiding re-evaluation is the justification
 for the monomorphism restriction, the real issue is ambiguity.
 Without some form of monomorphism (or scoped type variables??  Someone
 needs to convince me that scoped type variables fix everything ...)
 you quickly run into some serious problems.  For example, consider:

 read :: Read a => String -> a
 read s = let [(r,s')] = reads s in r

 This *won't compile* if you don't treat the let binding definition
 monomorphically.  Without monomorphism, the types of r and s' are

 r  :: Read a => a
 s' :: Read a => String

 This leads to an ambiguity error for s'.

I'm not buying it (the DMR being responsible for resolving overloading
ambiguity).

There's nothing ambiguous in that definition, because s' is not used.
There's nothing even ambiguous about the meaning of s' (we can apply the
dictionary translation just fine).  The problem is with uses of things
like s'.  Haskell takes the stance that you can't declare something you
can't use (something with a type that would lead to ambiguity in use),
but I say this example shows us how that stance may well be shortsighted.

--Jeff

P.S.  C'mon, John.  Don't you want to write your example as:

read s = let f () = reads s in fst (head (f ()))

;-)

(gratuitous () param to f thrown in to show we're not bothered by the
DMR)
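
[Spelling out the P.S. as compilable code, renamed `read'` here to avoid the Prelude clash: the gratuitous `()` parameter turns the let binding into a function binding, which is exempt from the DMR, so `f` stays overloaded.]

```haskell
-- A monomorphism-restriction-dodging `read`: `f` takes a (dummy)
-- argument, so the DMR does not apply and f keeps the type
-- f :: Read a => () -> [(a, String)].
read' :: Read a => String -> a
read' s = let f () = reads s
          in fst (head (f ()))
```

For example, `read' "42" :: Int` is 42, exactly as with the Prelude `read`.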





Re: let succ be an Enum class member

1998-05-12 Thread Jeffrey R. Lewis

Christian Sievers wrote:

 Hello, this is about a change in the prelude I'd like to suggest.

 The following is obviously not what one would expect:

   Prelude> succ 100::Integer

   Program error: {primIntegerToInt 100}

 However, with the prelude defining succ to be

   succ :: Enum a => a -> a
   succ =  toEnum . (+1) . fromEnum

 we can't hope for anything better.

 My suggestion is to make succ a member function of class Enum, with
 the old definition as default, so that the instance Enum Integer
 can define it to be simply (+1).

 Another example is
   data Nat = Zero | Succ Nat
 where succ=Succ is not only more natural, but also much more
 efficient.

 Of course the same holds for pred.

 What do you think?
 Are there any drawbacks?
 Could it be like this in standard haskell?

I agree, in fact, I'd go one stronger.  I'll propose that `fromEnum' and
`toEnum' be taken out, since clearly not all enumerable types are
subtypes of Int (as you point out, Integer leaps immediately to mind).

As an experiment, a little while back, I modified the Prelude to use an
Enum class that included `succ' and `pred', but eliminated `fromEnum'
and `toEnum'.   I then had to eliminate all uses of `fromEnum' and
`toEnum'.  I found that, in every case, `fromEnum' and `toEnum' were
used at specific types (mostly Char), and popped in the suitable
type-specific replacement.  I also had to change hugs to `derive' `succ'
and `pred' instead of `fromEnum' and `toEnum', but this wasn't
difficult.

This certainly makes the Enum class cleaner conceptually, and I can't
think of any drawbacks, except for concern about legacy code (I love
that concept for a young language like Haskell ;-).
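
[A sketch of the suggested class, using the hypothetical names `Enum'`, `succ'`, `pred'` to avoid clashing with the Prelude. `Integer` and `Nat` get direct instances with no round trip through `Int`:]

```haskell
class Enum' a where
  succ' :: a -> a
  pred' :: a -> a

-- Integer: simply (+1), so no overflow through fromEnum/toEnum.
instance Enum' Integer where
  succ' = (+ 1)
  pred' = subtract 1

-- Peano naturals: succ' is just the constructor, constant time.
data Nat = Zero | Succ Nat deriving (Eq, Show)

instance Enum' Nat where
  succ' = Succ
  pred' (Succ n) = n
  pred' Zero     = Zero
```

With this, `succ' (100 :: Integer)` is just 101, where the `toEnum . (+1) . fromEnum` definition fails on large Integers.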

--Jeff





Re: Binary, Conversions, and CGI

1998-05-01 Thread Jeffrey R. Lewis

Alastair Reid wrote:

  3. CGI startup and HUGS
  Hugs scripts seem to take a very long time to start up.
  Testing with this:
   main = putStr "content-type: text/html\n\nhello world\n"
  Hugs scripts have a noticeable response delay.
  The equivalent Perl, C, and GHC executables all respond with negligible
  delay. Is there any way to speed up Hugs scripts other than waiting for a
  new version?

 Compile them with GHC :-)

 If you're really keen (aka masochistic) you could modify Hugs to dump a
 snapshot of itself and its memory contents after loading a file - the
 way that TeX and emacs do.  Details on how to do this for various
 architectures can be found in the TeX and emacs source code.

To add a little voice of experience here: I have tried that very experiment.
I was able to get hugs to snapshot itself by imitating how emacs does it.
This much was fairly "straightforward" (in a grungy Unix hacker's sort of
sense).  However, runhugs was a different beast.  I could never get it to
snapshot correctly.  I never determined what the problem was, but then, when I
did this, runhugs had a number of problems (it was very new) which may be
resolved by now.  I could dig up the code if anyone is interested.

--Jeff





Re: Indicting Haskell

1998-04-02 Thread Jeffrey R. Lewis

S. Alexander Jacobson wrote:

 On Thu, 2 Apr 1998, Fergus Henderson wrote:
  On 01-Apr-1998, S. Alexander Jacobson [EMAIL PROTECTED] wrote:
  
   Not having written a compiler before, this may be an ignorant question,
   but how come you write these Haskell compilers/interpreters in C/C++
   rather than bootstrapping from prior versions of Haskell itself?
 
  Gofer (and hence Hugs) was written in C for efficiency.
 
  ghc was written in Haskell, and hence is big, fat, and slow.
  Please correct me if I am wrong ;-)

 I realize this was written in good humor, but doesn't it serve as an
 indictment of Haskell as a general purpose language?  I believe the Gnu
 compilers are bootstrapped.

Only if you wish to judge the language by the very small handful of initial
implementations.  The Gnu compilers are big and fat, but not so slow because
the language being compiled is comparatively close to the underlying machine
architecture.  Also, the art of compiling languages like C is very well
developed.  And let's not forget the hordes of people who contribute to the
Gnu compilers, as opposed to the very small number working on Haskell
compilers.

I don't think there's any need to lose faith based on the evidence of hugs and
GHC.   I suspect when hugs was written that GHC was simply not a viable
implementation platform.  However, I further suspect that GHC has come a long
way since then.  Perhaps the exercise of implementing the hugs interpreter in
Haskell should be tried now.  I'm game for the exercise - anyone else?

--Jeff





Re: strictness of List.transpose

1998-03-31 Thread Jeffrey R. Lewis

Jonas Holmerin wrote:

 The other day, I tried to transpose an infinite list of finite list:
 Simplified example:

 transpose (repeat [1..5])

 This won't terminate, since transpose is defined as

 transpose   :: [[a]] -> [[a]]
 transpose   =  foldr
  (\xs xss -> zipWith (:) xs (xss ++ repeat []))
  []

 The evalutation goes something like this:

 foldr (\xs xss -> zipWith (:) xs (xss ++ repeat [])) [] (repeat [1..5]) =

 zipWith (:) ([1..5])
 (foldr (\xs xss -> zipWith (:) xs (xss ++ repeat [])) [] (repeat [1..5]))

 Which will loop forever since zipWith is strict in its third argument.

Hmm... indeed.   I wonder if there's any reason why zipWith can't just be fully lazy
so that we don't need to twiddle with transpose.  I.e., define it as:

zipWith   :: (a -> b -> c) -> [a] -> [b] -> [c]
zipWith z ~(a:as) ~(b:bs) = z a b : zipWith z as bs
zipWith _ _ _ = []

Perhaps this is an oft visited topic.
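For concreteness, here is a self-contained sketch of that fully lazy zipWith applied to the transpose example (lazyZipWith and transposeLazy are names invented here for illustration). Note that once both patterns are irrefutable, the [] equation becomes unreachable, so the result list is productive but never ends; the consumer decides how much to force:

```haskell
-- A fully lazy zipWith: both list arguments are matched irrefutably,
-- so neither list is forced until an element of the result is demanded.
lazyZipWith :: (a -> b -> c) -> [a] -> [b] -> [c]
lazyZipWith z ~(a:as) ~(b:bs) = z a b : lazyZipWith z as bs

-- The foldr-based transpose from the thread, with lazyZipWith swapped in.
transposeLazy :: [[a]] -> [[a]]
transposeLazy = foldr (\xs xss -> lazyZipWith (:) xs (xss ++ repeat [])) []

main :: IO ()
main = print (map (take 3) (take 5 (transposeLazy (repeat [1..5]))))
-- prints [[1,1,1],[2,2,2],[3,3,3],[4,4,4],[5,5,5]]
```

With the standard strict zipWith the same expression loops; here each row is produced lazily, so the infinite transpose can be consumed piecemeal.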

--Jeff






Re: Phil's Pretty Printing Library

1998-03-27 Thread Jeffrey R. Lewis


Tommy Thorn wrote:

 In an attempt to gather experience with Philip Wadler's pretty printing
 combinators, I converted a semi-big pretty printer to his pretty
 printing.  My initial experiences are positive: while Phil's
 combinators are clearly less expressive, they were expressive enough
 (at least with the improvements below) and the results were subjectively
 clearer.

 For the utility of others which might wish to experiment I include my
 library, apologizing up front from the lack of clean-up,
 documentation, and optimization.

 On an unrelated note, I like to add that I consider it a feature, not
 a bug, that combinators build a Doc structure, as this enables a later
 unforseen interpretation of the document.  Such an interpretation
 could be for proportional fonts, colors, etc.

 /Tommy

I've also been working with Phil's combinators.  The limitations are easily
overcome by the addition of an operator for setting a new left margin.  The
`tab' operator (which I'll admit is probably not the best name) sets the left
margin to the current position, over the scope of its argument.  We can then
easily express Hughes' horizontal concatenation combinator as
x <> y  =  x <:> tab y
This I think gives us the best of both Hughes' and Wadler's pretty printers:
simplicity, expressiveness, and efficiency.  Wadler's pretty printer has the
interesting property that layout of the current line does not affect layout of
subsequent lines.  Thus, pretty printing is optimal using the greedy strategy
of choosing the best layout one line at a time.  The addition of `tab' breaks
this.  However, that just puts us on the same footing as Hughes' combinators,
while gaining the expressiveness of Hughes' combinators, and keeping the
additional expressiveness of Wadler's combinators.

In addition, I've added an `indent' operator.  This indents out n spaces from
the current left margin, introducing a newline if necessary (i.e. if the
current position is already past the requested indentation point).  Using
`indent', we can easily express Hughes' `nest':
nest i x  =  indent i <:> tab x
`indent' makes it easy to lay out the following style:
x  = y
   + z
verylongidentifier
   = q
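To make the `tab' semantics concrete, here is a minimal, self-contained model (an illustrative reconstruction, not the implementation attached below: the Doc type, render, and the operator spellings are invented here, and Hughes-style horizontal composition is written <+> to avoid clashing with the Prelude's <>):

```haskell
-- Minimal model of `tab': rendering threads a left margin l and a
-- current column k; Tab x sets the margin to the current column for x.
data Doc = Text String | Line | Doc :<>: Doc | Tab Doc
infixr 6 :<>:

render :: Doc -> String
render d = fst (go 0 0 d)
  where
    -- go margin column doc = (output, column after rendering)
    go :: Int -> Int -> Doc -> (String, Int)
    go _ k (Text s)   = (s, k + length s)
    go l _ Line       = ('\n' : replicate l ' ', l)
    go l k (x :<>: y) = let (sx, k')  = go l k x
                            (sy, k'') = go l k' y
                        in (sx ++ sy, k'')
    go _ k (Tab x)    = go k k x   -- new margin = current position

-- Hughes-style horizontal composition: line breaks inside y come back
-- to the column where y started.
(<+>) :: Doc -> Doc -> Doc
x <+> y = x :<>: Tab y

main :: IO ()
main = putStrLn (render (Text "x = " <+> (Text "y" :<>: Line :<>: Text "+ z")))
-- x = y
--     + z
```

The example renders "x = y" with "+ z" aligned under the column where "y" began, which is exactly the aligned-continuation layout described above.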

I'm currently implementing a pretty printer for Haskell using these
combinators.  I've included a current draft of my combinators for your
amusement ;-)

--Jeff

Content-Disposition: inline; filename="PrettyGo"

module Pretty where

infixr 6 :<>

-- the simple concatenation operator
-- chosen not to conflict with Hughes
infixr 6 <:>

-- for emulating Hughes
infixr 6 <>
infixr 6 <+>
-- I like this much better than $$
-- and it doesn't give hugs fits (hugs uses $$ as a top-level macro)
infixr 5 //

data Doc =
  Empty
| Line
| Indent Int
| Text String
| Nest Int Doc -- not used at present
| Tab Doc
| Doc :<> Doc
| Group Doc
deriving Show

empty = Empty
line = Line
indent i = Indent i
text s = Text s
tab x = Tab x
x <:> y = x :<> y
group x = Group x

-- `line <:> line', or `line <:> indent i' should flatten
-- to a single space.  In the following, the flag q (`quiet')
-- is used to indicate when additional whitespace should be
-- suppressed

-- layout simple forms
prSimple :: (Doc,Int,Bool) -> Bool -> Int -> (String,Bool,Int)
prSimple (Empty, l, f) q k = ("", False, k)

prSimple (Line, l, True) True k =   ("", True, k)
prSimple (Line, l, True) False k =  (" ", True, k + 1)
prSimple (Line, l, f) q k = ('\n' : copy l ' ', False, l)

prSimple (Indent i, l, True) True k =   ("", True, k)
prSimple (Indent i, l, True) False k =  (" ", True, k + 1)
prSimple (Indent i, l, f) q k =
    if k > l + i then
        ('\n' : copy (l + i) ' ', False, l + i)
    else
        (copy (l + i - k) ' ', False, l + i)

prSimple (Text s, l, f) q k =   (s, False, k + length s)

pr :: Int -> Doc -> (String,Bool,Int)
pr w x = prComb (x, 0, False) [] True 0
where
-- layout combining forms
-- the second arg is an accumulator for flattening out compositions
-- w is in scope here, so this isn't a top-level def
prComb :: (Doc,Int,Bool) -> [(Doc,Int,Bool)] ->
  Bool -> Int -> ([Char],Bool,Int)
prComb (Nest i x, l, f) ys q k =
prComb (x, l + i, f) ys q k
prComb (Tab x, l, f) ys q k =
prComb (x, k, f) ys q k
prComb (x :<> y, l, f) zs q k =
prComb (x, l, f) ((y, l, f) : zs) q k
prComb (Group x, l, True) ys q k =
prComb (x, l, True) ys q k
prComb (Group x, l, False) ys q k =
let (s, q', k') = prComb (x, l, True) ys q k in
--if fits s (w - k) then
if fits s (w - k) (w - l) 5 then
(s, q', k')
else
prComb (x, l, False) ys q k
prComb x [] q k = prSimple x q k
prComb x (y : ys) q k =
   

GHC 3.01: glitch in install

1998-02-20 Thread Jeffrey R. Lewis

On a sparc, Solaris 2.5.1, during make boot, I encounter the following
minor glitch:



==fptools== gmake  boot --no-print-directory -r;
 in /amd/church/projects/pacsoft/E/haskell/ghc-3.01/ghc/driver


../utils/unlit/unlit ghc-asm.lprl ghc-asm.prl
../utils/unlit/unlit ghc-iface.lprl ghc-iface.prl
../utils/unlit/unlit ghc-consist.lprl ghc-consist.prl
../utils/unlit/unlit ghc-split.lprl ghc-split.prl
../utils/unlit/unlit ghc.lprl ghc.prl
rm -f ghc-3.01
Creating ghc-3.01...
echo "#! "/usr/local/bin/perl > ghc-3.01
Done.
ln -s ghc-3.01 ghc
ln: cannot create ghc: File exists
gmake[2]: *** [ghc] Error 2
gmake[1]: *** [boot] Error 2
gmake: *** [boot] Error 2
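For what it's worth, the failure is just `ln -s` refusing to overwrite a `ghc` link left over from a previous build. A plausible workaround (my assumption, not something stated in the report) is to remove the stale link first, or force its replacement with `ln -f`; the pattern can be reproduced in a scratch directory:

```shell
# Reproduce the glitch and the workaround in a scratch directory (illustrative).
mkdir -p /tmp/ghc-link-demo
cd /tmp/ghc-link-demo
rm -f ghc ghc-3.01
touch ghc-3.01                  # stand-in for the generated driver script
ln -s ghc-3.01 ghc              # first link succeeds
ln -s ghc-3.01 ghc 2>/dev/null || echo "ln: cannot create ghc: File exists"
ln -f -s ghc-3.01 ghc           # -f replaces the existing link
```

The build's Makefile would want the `rm -f ghc` (or `ln -f -s`) before the `ln -s` so that re-running `make boot` doesn't trip over its own output.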