Re: some Standard Haskell issues

1998-08-19 Thread Simon L Peyton Jones

 Yes, I think it's a fine idea to loosen up the syntax and allow import and
 infix anywhere.  But could someone clarify what the intent is with regards to
 the scoping of liberally sprinkled imports/infixes?

I've added a clarification; my intent was that all import and fixity
declarations would scope over the whole module, just like any other
declaration.
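
A tiny sketch of what that liberalisation would allow (hypothetical syntax
and module names; Haskell 1.4 still insists that imports come first):

    module M where

    three :: Int
    three = x `plus` y         -- uses names and fixity declared below

    import SomeModule (x, y, plus)
    infixl 6 `plus`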

Simon






Re: Felleisen on Standard Haskell

1998-08-04 Thread Simon L Peyton Jones

 In any case, I hope that Simon will follow his urge to get Standard
 Haskell done with Real Soon Now, even if there is no overwhelming
 consensus on certain issues, so that we can then concentrate on Haskell
 2.

That's just what I intend to do.  I don't see Std Haskell as a big
deal, but even little deals are worth completing rather than
leaving as loose ends... and I'm more optimistic than Paul about
the usefulness of Std Haskell.  I would be happy to find a name
that was less grand and final-sounding than 'Standard Haskell', though
more final-sounding than 'Haskell 1.5'.

Matthias's points are good ones, but premature for Haskell I think,
which is still moving pretty fast.  Indeed, for some reason, faster
of late.

Simon





Re: Rambling on numbers in Haskell

1998-08-04 Thread Simon L Peyton Jones


I think all this discussion about numerics in Haskell is great. I'm convinced
that designing good libraries is a major creative act, not just an add-on to a
language; and that the existence of good libraries has a big effect on how much
use a language gets.  ('Good' means both having a well-designed signature, and
at least one efficient implementation. Sometimes these two conflict.)  Library
design is sadly under-valued: people who design and implement good libraries
should get at least as much credit as people who design good languages.

The trouble is that it's easier to identify flaws in current libraries
than to figure out better ones. Certainly in the numeric area there seems
to be a big tension between completeness and simplicity.

Bottom line: Haskell 2 is pretty wide open.  If some group of you
out there get excited about this and design and implement a coherent
set of numeric libraries then I think you'd be welcomed with open
arms, even if it wasn't compatible with Haskell's current numeric
class structure.   Sergey's work is moving in that direction, which
is great. 

Above all don't leave it to us compiler hackers!  We aren't 
numeric experts like some of you are, and are most unlikely to do a
good job of revising Haskell's numeric libraries.

Simon







Re: instances in Haskell-2

1998-07-29 Thread Simon L Peyton Jones

 I cannot find the subject there. Could you quote it?

Sorry, it turns out I missed your point entirely.  

  class (Ring r, AddGroup (m r)) => RightModule m r where
    cMul :: m r -> r -> m r

So here, m :: * -> *

What you really want is to say

  instance Ring r => RightModule (\t -> t) r where
     cMul = mul

so that now you can use cMul at type

cMul :: Foo -> Foo -> Foo

(provided Foo is an instance of Ring).



The '(\t -> t) :: * -> *' is your 'transparent newtype' function.  It makes
perfect sense to have lambda abstractions at the type level;
indeed type synonyms define such things.  I've seen quite a few
other occasions in which I wanted a (\t -> t) type constructor
in particular.
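
For comparison, a type synonym already gives a named, fully-applied-only
version of that function:

    type Id t = t            -- essentially the type-level function \t -> t

    f :: Id Int -> Int       -- fine: the synonym is fully applied
    f x = x

    -- but an unapplied use, e.g.   instance Ring r => RightModule Id r,
    -- is rejected: allowing it would put lambdas into type inference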

What I don't know how to do is to perform type inference on such systems.  (The
restriction that type synonyms must be fully applied is to avoid lambdas in
type inference.)  There is some work on it, but I believe the story is "It's
complicated".


So, yes it's a reasonable question, but I for one don't know
a tractable design.  And it's known swampy territory.

Sorry for missing the point the first time.

Simon





Re: suggestions for Haskell-2

1998-07-28 Thread Simon L Peyton Jones

   class (Ring r, AddGroup (m r)) => RightModule m r
     where
     cMul :: m r -> r -> m r
       -- "vector" (m r) multiplied by "coefficient" r

 Haskell rejects this (m r) in the context.  Could Haskell-2 allow it?

Yes.  See http://www.dcs.gla.ac.uk/~simonpj/multi-param.html

   instance Ring r => RightModule r r  where  cMul = mul

 Haskell rejects this  `=> RightModule r r'

This is fine too (for Haskell 2). See the same URL.


 `newtype' cannot derive all the instances automatically.
 
 Thus, in our case,  add (Id 1) (Id 2)  is illegal.
 
 This may sound like a stupid question, but
   why does Haskell allow the `newtype' derivation only for the standard
   classes?
 Why not support declarations like
 
  newtype N a b =  N (T a b)  deriving(Eq,Ord,AddGroup,Ring)
 or   newtype N a b =  N (T a b)  deriving( all )

That would indeed be possible for newtype; but my guess is
that if you want 'all' then you ought to be able to get away without
a newtype at all.  But I might well be wrong about this.
Anyway, it's not an unreasonable suggestion.
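
For the record, the instances such a 'deriving' would generate can already
be written by hand; a sketch, using your T, and an AddGroup class assumed to
have an 'add' method (the instance context needs GHC's relaxed instance
rules):

    newtype N a b = N (T a b)

    instance AddGroup (T a b) => AddGroup (N a b) where
      add (N x) (N y) = N (add x y)     -- unwrap, apply, rewrap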

Simon





Re: avoiding repeated use of show

1998-07-22 Thread Simon L Peyton Jones

 I would like to avoid using show all the time for printing strings e.g.
 
  val = "the sum of 2 and 2 is "++(show $ 2 + 2)++" whenever."
 
 I would prefer to type something like:
 
  val = "the sum of 2 and 2 is "./(2+2)./" whenever." 
  -- I can't find a better Haskell-compatible operator

Let me advertise Olivier Danvy's very cunning idea to implement
printf in Haskell/ML.

http://www.brics.dk/RS/98/5/index.html


 So I tried creating my own Stringable class:
  class Stringable a where
   toString :: a -> String
 
  (./) :: (Stringable a, Stringable b) => a -> b -> String
  x ./ y = toString x ++ toString y
 
 The trouble is that when I try doing things like:
 
  res = (2+2) ./ " hello"
 
 I get an "Unresolved top-level overloading" error.

 Is there any way to convince Haskell to just resolve these numbers to
 SOMETHING by default?  Then I can just declare that type an instance of
 Stringable.

What's going on is this.  You've got the context

(Stringable a, Num a)

arising from the RHS of res, and no further info about what "a" is.
Under extremely restricted circumstances Haskell will choose a
particular type for you, namely

when the classes involved are all standard prelude classes
and at least one is numeric.

Why so restrictive?  Because it *might* make a massive difference
which type is chosen.  Suppose you had

instance Stringable Int where
   toString n = "Urk!"

instance Stringable Integer where
   toString n = ".Ha ha."

Now whether your program yields "Urk!" or ".Ha ha." depends on
which type Haskell chooses.

There's no technical issue here. One could relax the defaulting
restriction, at the cost of (perhaps) sometimes unexpected behaviour.
Or, as you show, you can just tell it which type to use:

res = ((2+2) :: Int) ./ "hello"
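
Spelled out, a complete version with the annotation (the Int and [Char]
instances are mine for illustration; the [Char] one needs GHC's relaxed
instance heads, since it is not of the form 'T a1 .. an'):

    class Stringable a where
      toString :: a -> String

    (./) :: (Stringable a, Stringable b) => a -> b -> String
    x ./ y = toString x ++ toString y

    instance Stringable Int    where toString = show
    instance Stringable [Char] where toString = id

    res :: String
    res = ((2+2) :: Int) ./ "hello"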

Simon 






Re: Scoped typed variables.

1998-07-22 Thread Simon L Peyton Jones

 I think the way that Hugs 1.3c handles it would meet your goals.  All that
 it requires is a strict extension to the syntax for patterns to allow type
 annotations.  These can be useful in their own right, but also can be
 applied
 to problems like the one that you gave:
 
   f :: [a] -> a -> [a]
   f ((x::a):xs) y = g y
     where
       g :: a -> [a]
       g q = [x,q]
 
 The only change I've made here is to replace "x" on the left hand side of
 the definition for f with "(x::a)".  As a result, the type variable "a"
 will be in scope when the signature of g is encountered, and so will not
 be subjected to the usual, implicit universal quantification.

The monomorphism discussion highlighted a disadvantage with
the pattern notation for scoped type variables that I hadn't realised
before.  Michael suggested

  
 f :: [a] -> c
 f xs = if len < fromInteger 3 then len else 0
   where
     len :: c
     len = length xs

This relies on the 'c' from the type signature scoping over
the definition, which is an alternative notation for scoped
type variables.  On the whole I think the 'put signatures in patterns'
approach is nicer, but I don't think it can express this example,
because the relevant type is (only) in the result.  Maybe it's
Just Too Bad, but it is a pity.


Simon





Re: Monomorphism

1998-07-21 Thread Simon L Peyton Jones


Olaf suggests

 Hence I suggest that part (b) of rule 1 of the MR should
 be deleted, i.e. simple
 pattern bindings are just treated as function bindings. As I have said in a
 previous email, the recomputation issue could be handled by warnings from the
 compiler.

That would indeed not fall foul of the ambiguity problem, but
it doesn't deal with the unexpected efficiency problem.

Consider:

f xs = if len < 3 then len else 0
  where
    len = length xs

You are a first year undergraduate.  Quick, how many times is the
length of xs computed?

Correct!  Twice!  I'm assuming here that we've made the change
to length to give it the sensible type

length :: Num b => [a] -> b

(And it's not length's fault; it's easy to write functions with
similar types.)

Here the effect is arguably minor: taking the length of a list
twice.  But it's not hard to come up with examples that are
exponentially worse.
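
A sketch of how (assuming the MR is lifted, so these pattern bindings keep
their overloaded types): every use of an overloaded binding re-runs its
right-hand side, so the work doubles at each layer of bindings.

    g :: Num b => [a] -> b
    g xs = a + a                          -- a is demanded twice ...
      where
        a = b + b                         -- ... so b is computed four times ...
        b = c + c                         -- ... and c eight times
        c = fromIntegral (length xs)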

Some people say "of course len is computed twice", but many find
it puzzling.  We have debated this point endlessly.  Frankly, I'm
very reluctant to make incremental changes to the MR.  Let's either
fix it or leave it alone.  Since we don't know how to fix it, let's
leave it alone.

Simon





Re: GHC licence (was Could Haskell be taken over by Microsoft?)

1998-07-21 Thread Simon L Peyton Jones

 Simon L Peyton Jones wrote:
  So far as GHC is concerned, I wrote on this list a month ago:
  "More specifically, I plan to continue beavering away on GHC.
  GHC is public domain software, and Microsoft are happy for it to 
  remain so, source code and all.  If anything, I'll have quite a bit
  more time to work on it than before."
 
 Do you mean "public domain" literally, i.e. are you renouncing all
 copyright?  (The source code contains copyright notices, but no
 licence, as far as I can see.)

No I am not renouncing all copyright.  By "public domain" I mean freely
available for anyone to use for any purpose other than making money
by selling the compiler itself.  That isn't a formal definition,
but I'm sure you see the intent.

I have carefully avoided getting tangled up in legal red tape, which
is why there is no formal license.  It may be that my move to Microsoft
will force me to spend time sorting this out.  But it's never been
a problem so far, and I doubt it will in the future, so I'm reluctant
to invest the time until pressed to do so.

Simon





Re: Monomorphism

1998-07-21 Thread Simon L Peyton Jones

 
 I'm going to ask a very stupid question.
 
 Why on earth is len computed twice in this example?  I really don't   
 understand this!

I have to confess that I mischievously hoped that someone
would say this: it demonstrates the point nicely that
lifting the monomorphism restriction would cause at least
some people to be surprised.

Let me add the type signatures [I'm a bit puzzled why length used
to have type Integral b => ... but that's a side issue.]


   length :: forall a b. Num b => [a] -> b


   f :: forall a c. Num c => [a] -> c
   f xs = if len < (3::Integer) then len else 0
     where
       len :: forall d. Num d => d
       len = length xs


The first use of len returns an Integer, so we must compute
the length at type Integer.

The second use of len returns a value of type c, so we must
compute length at type c.  (Someone *might* call f wanting an Integer
back, they might not.)

And there you have it.
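
Conversely, a monomorphic signature on len restores the sharing (a sketch,
using the ordinary length :: [a] -> Int):

    f :: Num c => [a] -> c
    f xs = if len < 3 then fromIntegral len else 0
      where
        len :: Int                 -- one type, hence one computation
        len = length xs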

Simon





Re: GHC/Hugs Status (was Re: simple interface to web?)

1998-07-17 Thread Simon L Peyton Jones


 But if there are too many things missing, no one will use Standard
 Haskell - it already seems as if most of the people on this list are
 going to go straight to Haskell 2, which would mean that Standard
 Haskell might only be used for teaching.

Indeed, I do expect that most of the people on this list will
go straight to (the moving target of) Haskell 2.  The purpose of
Std Haskell is to address the needs of people who don't need the
latest greatest, but do need something stable.  For example,
the fact that Haskell keeps moving (which in many ways is good)
discourages people from writing books, which in turn makes it
less attractive for people who aren't already converted.

 thought this could be specified pretty easily), but I really think
 that the tidying up of the relationship between the prelude and the
 standard libraries is vital. For example, I can see no reason why
 PreludeIO and IO should be separate.  Surely it wouldn't be much work
 to put it all in the IO library? As it stands, the prelude has to
 refer to the library, which, I think, underlines the inconsistency:

I agree that the Prelude could do with tidying up, but I think
it is no small task, and (worse) it is a task with no clear finish.

I do propose (unless people yell) to move back to the generic
take/drop/length operators, but if it's left to me not much else
will happen. On the other hand, if someone is keen to work on the
Prelude and standard libraries, in the next two or three months
(no longer) then I'd be delighted to hear from them.

Simon "desparately trying to keep the lid on the can of worms" PJ






Re: Monomorphism

1998-07-16 Thread Simon L Peyton Jones


  read :: Read a => String -> a
  read s = let [(r,s')] = reads s in r
 
  This *won't compile* if you don't treat the let binding definition
  monomorphicly.  Without monomorphism, the types of r and s' are
 
  r  :: Read a => a
  s' :: Read a => String
 
  This leads to an ambiguity error for s'.
 
 I'm not buying it (the DMR being responsible for resolving overloading
 ambiguity).
 
 There's nothing ambiguous in that definition, because s' is not used.
 There's nothing even ambiguous about the meaning of s' (we can apply the
 dictionary translation just fine).  The problem is with uses of things
 like s'.  Haskell takes the stance that you can't declare something you
 can't use (something with a type that would lead to ambiguity in use),
 but I say this example shows us how that stance may well be shortsighted.

What worries me is that a programmer might write this:

read2 :: (Read a, Read b) => String -> (a,b)
read2 s = let [(r1,s1)] = reads s
              [(r2,s2)] = reads s1
          in
          (r1,r2)

Here, s1 is indeed used, so the ambiguity problem does bite.
The troubling thing is that "read" works (at least if the MR
is lifted) but "read2" does not.

I am concerned too about the efficiency problem.  Suppose that
the ambiguity problem was resolved by choosing some arbitrary
type.  (A bad idea, but still.)  And suppose that we continued
"read2" so that it read, say, 10 things.  Reading the 10th would
re-parse all previous 9, which is surely not what a programmer
might expect.  The redexes (reads sn) certainly look like thunks,
which we are accustomed to being shared.  But actually each will
be evaluated once for each used variable in the LHS pattern.

Simon





Re: GHC/Hugs Status (was Re: simple interface to web?)

1998-07-15 Thread Simon L Peyton Jones

 etc... all seem to be things that are waiting 'till Haskell 2.  My
 point was that _something_ should be in Standard Haskell.  The features
 you mention are likely to help when writing a better network library,
 but let's not get distracted from the option of including something
 straightforward in the standard.

I hadn't realised that your suggestion was apropos of Standard
Haskell.

I'm pretty leery about trying to agree any new libraries at this
stage, unless someone comes up with a worked-out, and implemented,
specification pretty quickly.  The name of the Std Haskell game
is rapid closure.  There are just tons of things that 'ought'
to be in it that aren't going to be.

Simon





Re: GHC/Hugs Status (was Re: simple interface to web?)

1998-07-15 Thread Simon L Peyton Jones

 More generally, regardless of the standards process, it feels like
 GHC and Hugs define the de facto Haskell standard (it doesn't look like HBC
 is still in progress but I could be wrong).  As such, it seems tough to
 write libraries right now as the upcoming GHC/Hugs release will contain
 features that strongly affect library design:
 
 So I guess my question is how close is this new release? 

For these

 * multi-parameter type classes
 * existential types
 * exceptions
 * mutually recursive import [GHC does this ok]

we'll have a shiny new GHC beta release before ICFP (end Sept).
(I.e. it's pretty much working now.)  Don't know about Hugs.

For these

 * require/ensure/invariant assertions (my optimism)

We need an agreed design first.  Your point about putting
assertions in type signatures makes me think that maybe assertions
would be premature for Std Haskell.

 * module signatures/dynamic linking
 * dynamic linking (serialization/persistence?)

I'm hazy about what you have in mind here.

Simon






GHC 3.03

1998-07-10 Thread Simon L Peyton Jones


Folks,

We've fixed a few bugs in GHC 3.02, thanks to useful bug reports
from several of you.  I'd rather not release an 'official' 3.03 because
it takes half a day to do a full release, and meanwhile we have
a substantial new compiler in the works (new RTS, CAF space leaks
squashed, new GC, new Core data type, existential types...).
This compiler is our first big step towards the integrated GHC/Hugs
system.  

Meanwhile, if you need the bug fixes you can get them easily either
from the nightly build or from the repository.  Both are now
under 'Getting GHC' on the GHC home page.

Let us know if you are stuck.

Meanwhile, back to those existentials...

Simon





Re: type synonyms

1998-07-09 Thread Simon L Peyton Jones


 Why does ghc allow instances to be defined for type synonyms? I did not find any
 remarks about that in the ghc docs.
 According to the report this isn't allowed, and hugs correctly rejects it.

GHC allows arbitrary non-overlapping types in instance decls.
Thus:

instance C [(Int,Bool)] where ...
instance C [(Bool,Int)] where ...
instance C (Tree [[a]]) where ...
etc

I suppose the generalisation of Haskell 1.4's rule would be
to disallow type synonyms anywhere in the type.  But, instead
I simply allowed arbitrary types *including* 
type synonyms with the usual meaning: writing
the type synonym is absolutely identical with writing the type.
For example 

type MyDict = [(Int,Bool)]

instance C MyDict where ...

is exactly the same as

instance C [(Int,Bool)] where ...
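
A small illustration (with a hypothetical one-method class C): because the
synonym is identical to its expansion, giving instances for both is a
duplicate-instance error, not an overlap between two different types:

    class C a where
      op :: a -> Int

    type MyDict = [(Int, Bool)]

    instance C MyDict where
      op = length
    -- instance C [(Int, Bool)] where ...   -- rejected: it is the very same
    --                                      -- instance written a second time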


There's a stylistic point: since they 'look' different,
should Haskell prevent you writing the synonym one?  Well
we don't in type signatures, so I decided not to in instance
decls either.

I'd better add this to my MPTC notes.

Simon




Re: type synonyms

1998-07-09 Thread Simon L Peyton Jones


 That's basically newtype with the data constructor omitted (I would
 prefer data to record). Unfortunately, this seems to be incompatible
 with the class system. (There was a long discussion on the Standard
 Haskell discussion list, unfortunately the entry vanished).

No, it just moved over to decisions:

http://www.cs.chalmers.se/~rjmh/Haskell/Messages/Decisions.cgi

the particular one is this:

http://www.cs.chalmers.se/~rjmh/Haskell/Messages/Decision.cgi?id=443

Simon





Re: 3.02,multipar,overlaps

1998-07-08 Thread Simon L Peyton Jones

 ghc-3.02-linux-i386-unknown...  from  ftp.dcs.gla.ac.uk.../3.02/
 
 cannot link the program enclosed:
 
 ghc -c -fglasgow-exts -optC-fallow-overlapping-instances -v Main.hs > log 
 ghc -o run Main.o
 
 Main.o(.text+0x3a3): undefined reference to `Main_ZcMBConvertible_inregs_info'
 Main.o(.data+0x3c): undefined reference to `Main_ZcMBConvertible_static_info'

Thanks for a fine bug report, Sergey.  I am embarrassed to say
that it's a definite bug, a consequence of a late 'fix' (ha!).

It will show up for any class with just one member, I'm afraid.
Work around it by adding a second unused class operation to the
class.
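
Concretely, the workaround looks like this (a sketch; Convertible stands in
for whichever single-method class triggers it):

    class Convertible a b where
      convert      :: a -> b
      convertExtra :: a -> b     -- dummy second member, never used;
      convertExtra = convert     -- its only job is to dodge the 3.02 bug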

We'd better release 3.03, I can see.

Simon




Standard Haskell

1998-07-08 Thread Simon L Peyton Jones


Folks

This message is to update you on the state of play so far
as Standard Haskell is concerned.  I'm circulating to three
Haskell-related mailing lists; in future I'll mail only
the "haskell" list, so pls subscribe to it if you want to
see anything more.

You may remember that John Hughes has been running the
Standard Haskell process, but he's been unable to devote much
time to it lately, so I've agreed to take on that role.

I've summarised the current state of play at

http://www.dcs.gla.ac.uk/~simonpj/std-haskell.html

The introductory paragraphs from that page appear below,
but the details don't.   As you'll see, the idea is that 
Standard Haskell should be a rather small increment (or decrement)
from Haskell 1.4, and I'm keen to get the thing done fairly
quickly.  (I'd be more precise, only my summer is a bit uncertain
because of my move to Cambridge.)

The summary page says what's decided, what's about to be decided,
and what isn't decided.  I'd be interested in your opinions about
any of these things.  Please send them to me or any other member
of the Std Haskell committee (below).  You're also free to send
messages to the Haskell mailing list, of course.

Simon


==

Standard Haskell: State of Play

Following the 1997 Haskell Workshop, much intervening work by the
Standard Haskell committee, and then an informal meeting at
Working Group 2.8 (Oregon, April 1998), 
there are now two separate Haskell language-design efforts:
Standard Haskell, and Haskell 2.

* Standard Haskell is, as was originally agreed at the 1997 Haskell
Workshop, a minor revision of Haskell 1.4, cleaning up traps but NOT adding
major new functionality, thereby giving Haskell the stability that has so far
been lacking.

* Over the past year it has become clear that there are a large number
of interesting proposed developments to Haskell: multi-parameter type
classes, pattern guards, scoped type variables, local universal and
existential quantification, and so on.  John Launchbury is
leading the development of Haskell 2, which will embody
these extensions.  Much of the discussion among the Standard Haskell committee
will transfer into the Haskell 2 effort.



Standard Haskell will therefore by no means be the last revision of Haskell. On
the contrary, we design it knowing that it will be superseded within one or two
years. But it will have a special status: the intention is that Haskell
compilers will continue to support Standard Haskell (given an appropriate flag)
even after later versions of the language have been defined, and in that
sense the name `Standard Haskell' won't refer to a moving target.

It's clear that those of us who want to keep up with the latest developments
will soon move to Haskell 2, but it also seems clear that there is a
substantial user community who will not. This strategy serves both. I think
another advantage is that the `dusty deck' problem will be much less of an
issue in the design of Haskell 2.0, because the dusty decks will remain in
Standard Haskell. So this strategy also gives a once-only opportunity to make
language changes (in the step to Haskell 2) which *do* break a lot of old
code. Strict pattern matching may yet raise its ugly head again...!



==
John Hughes, [EMAIL PROTECTED]
Lennart Augustsson , [EMAIL PROTECTED] 
David Barton , [EMAIL PROTECTED] 
Richard Bird , [EMAIL PROTECTED] 
Ralf Hinze , [EMAIL PROTECTED] 
Paul Hudak , [EMAIL PROTECTED] 
John Launchbury, [EMAIL PROTECTED] 
David Lester , [EMAIL PROTECTED] 
Jeff Lewis, [EMAIL PROTECTED] 
Erik Meijer , [EMAIL PROTECTED] 
John Peterson , [EMAIL PROTECTED] 
Simon Peyton-Jones , [EMAIL PROTECTED] 
Colin Runciman , [EMAIL PROTECTED] 
Simon Thompson , [EMAIL PROTECTED] 
Phil Wadler , [EMAIL PROTECTED] 






Re: type errors

1998-07-01 Thread Simon L Peyton Jones

  Actually I think you would be better off with a class like
  this:
  
    class (Eq key, Ord key) => Dictionary dict key where
       delete :: key -> dict dat -> dict dat
       search :: key -> dict dat -> (key, SearchResult dat, dict dat)
       searchList :: [key] -> dict dat -> ([(key,SearchResult dat)],dict dat)
 
 Ok, I did not recognize this solution; it seems to me the (nearly) proper one.
 But why not write:
 
    class Dictionary dict where
        delete :: (Eq key, Ord key) => key -> dict key dat -> dict key dat
        ...
 
 So one could avoid multi-parameter classes altogether. The two types key and dat
 should be inferred from the type of dict (which is expressed by 'dict key dat'). I
 can't think of a dictionary where key or dat are not associated with dict.

Yes, that's ok too.  You might find my paper "Bulk types with class"
somewhat relevant.

http://www.dcs.gla.ac.uk/~simonpj/papers.html

Simon





Re: type errors

1998-07-01 Thread Simon L Peyton Jones

 |   class (Eq key, Ord key) => Dictionary dict key dat where
 |      delete :: key -> dict -> dict
 | ...
 |  the first error:
 |  
 |  Class type variable `dat' does not appear in method signature
 |  delete :: key -> dict -> dict
 |  
 |  Why does ghc expect that I use all of the type variables?
 |  Obviously I only need
 |  the key to remove an entry of a dictionary.
 | 
 | You're right.  The restriction is excessive.  Thanks for pointing
 | this out.  Probably we should only require that at least one
 | of the class variables is constrained.
 
 I don't see this.  The fact is that, if any of the variables in the
 class header don't appear in the type of a method, then that method
 will always have an ambiguous type.  In this case, that would be:
 
    delete :: (Dictionary dict key dat) => key -> dict -> dict

It would not *always* result in ambiguity. For example, consider

instance Dictionary (MyDict dat) Int dat where ...

f :: MyDict dat -> MyDict dat
f d = delete (3::Int) d

Here, the polymorphism in 'dat' doesn't affect which instance
is chosen.

However, on further reflection, I now think:

in any case where ambiguity would not result
the original class declaration should be re-formulated

So, I now think that the existing rule (all class variables
must appear in each class-operation type signature) is probably
the right one, but on stylistic rather than technical grounds.

 I don't see how Simon's suggestion of replacing "all" with "at least one"
 could be made to work in general.  Such an approach is however possible
 if you are working with multiple parameter classes where the variables in
 one parameter are determined directly by the parameters in another.
 For example, consider the following class:
 
    class Collection elem col where
        empty :: col
        add   :: elem -> col -> col
        del   :: elem -> col -> col
        enum  :: col -> [elem]
 
 By the argument above, one should expect empty to be rejected as having
 an ambiguous type:  Collection elem col => col.   However, we could also
 imagine modifying the definition of ambiguity to take account of the fact
 that, in practice, the value of elem would probably be uniquely determined
 by the value of col, and hence the context in which empty is used could
 still provide enough information to allow the overloading to be resolved.

Indeed. Consider

instance Collection a MyColl where
   empty = MyEmpty
   ...

But this would be nearly useless, since we then know nothing
about the type elem.

So, again, nothing seems to be gained by relaxing the restriction
(unlike what I said yesterday).

Simon





Re: Multi-parameter type classes

1998-07-01 Thread Simon L Peyton Jones

  |5. In the signature of a class operation, every constraint must
  |   mention at least one type variable that is not a class type
  |   variable. Thus:
  ...
  |class C a where
  |   op :: Eq a => (a,b) -> (a,b)
  |
  |   is not OK because the constraint (Eq a) mentions only the class type
  |   variable a, and no others.
 
 What's the rationale for this rule?

It's hard to explain (which may well mean my head is on backwards).
In the 5 mins I have today here's what I added to the document


The reason it's awkward to implement is this.  Suppose you
are type checking the RHS of op in an instance declaration for C [c],
and suppose you have found that you need the constraint (C (S c)).
Should we reduce the constraint, in the hope of being able to
"use" the Eq [c] we have available?  Or should we just
postpone this reduction to the top level of the class decl?
Usually I make this decision based on whether the constraint mentions any
of the universally quantified type variables (b in this case),
which it doesn't.  Without this convenient test I have either
to do eager context reduction (which I don't want to do), or some sort
of search strategy.

Simon





Re: type errors

1998-06-30 Thread Simon L Peyton Jones

 The ghc compiler complains about 2 type errors in the following code:
 
  data SearchResult a = Found a | Fail
 
  class (Eq key, Ord key) => Dictionary dict key dat where
   delete :: key -> dict -> dict
   search :: key -> dict -> (key,SearchResult dat,dict)
   searchList :: [key] -> dict -> ([(key,SearchResult dat)],dict)

   searchList [] d = ([],d)
   searchList (x:xs) d = let (sresults,d')     = searchList xs d
                             (x',sresult,d'')  = search x d'
                             new_sres          = (x',sresult):sresults
                         in (new_sres,d'')
 
 the first error:
 
 Class type variable `dat' does not appear in method signature
 delete :: key -> dict -> dict
 
 Why does ghc expect that I use all of the type variables?
 Obviously I only need
 the key to remove an entry of a dictionary.

You're right.  The restriction is excessive.  Thanks for pointing
this out.  Probably we should only require that at least one
of the class variables is constrained.

Actually I think you would be better off with a class like 
this:

  class (Eq key, Ord key) => Dictionary dict key where
     delete :: key -> dict dat -> dict dat
     search :: key -> dict dat -> (key, SearchResult dat, dict dat)
     searchList :: [key] -> dict dat -> ([(key,SearchResult dat)],dict dat)

That is, you don't need the 'dat' type variable in the class at all.
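
Your default definition of searchList carries over unchanged; spelled out,
the whole class would read (a sketch):

    class (Eq key, Ord key) => Dictionary dict key where
      delete     :: key -> dict dat -> dict dat
      search     :: key -> dict dat -> (key, SearchResult dat, dict dat)
      searchList :: [key] -> dict dat -> ([(key, SearchResult dat)], dict dat)

      searchList []     d = ([], d)
      searchList (x:xs) d = let (sresults, d')     = searchList xs d
                                (x', sresult, d'') = search x d'
                            in ((x', sresult) : sresults, d'')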

 Ambiguous type variable(s)
 `key', `dict'
 in the constraint `Dictionary dict key a10v'
 arising from use of `searchList' at Dtest2.hs:11
 In an equation for function `searchList':
 searchList (x : xs) d
= let
(sresults, d') = searchList xs d
(x', sresult, d'') = search x d'
new_sres = (x', (sresult)) : sresults
  in (new_sres, (d''))
 In the definition for method `searchList'

You don't say which version of the compiler you are using,
but I think this is a palpable bug in 3.01 that is fixed in 3.02.
So you are right to be confused by it!

Simon





Re: laziness and functional middleware

1998-06-19 Thread Simon L Peyton Jones


 The paper says:
 "We are working on a distributed implementation of Concurrent Haskell.
 One nice property of MVars is that they seem relatively easy to implement
 in a distributed setting..."
 I assume that they are not referring to GPH here. 
 (I was surprised at this statement given what I presume are the
 substantial difficulties of defining a wire format for lazy
 datastructures).
 
 I would assume that the distributed implementation gives the programmer
 some access to whether a machine is "up" or I suppose that you could use
 distributed mvars to design your own ping (cool!)

OK, here's the current story.

* Concurrent Haskell works on uniprocessors only.  It's intended for
  programs that need explicitly-forked, concurrent, I/O-performing
  threads.  Processes are explicitly forked with forkIO, which is
  in the IO monad.

  Results are necessarily non-deterministic -- that's part
  of the point!  There is no attempt to use parallelism to gain
  performance.

* GpH (Glasgow Parallel Haskell) works on a variety of multiprocessors.
  It's specifically intended to harness parallelism to gain performance.

  Parallelism is explicitly sparked with `par`, but the RTS is free to
  ignore such sparks.  `par` is not in the IO monad: it has type
par :: a -> b -> b

  Results remain completely deterministic even on a multiprocessor.
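
Two tiny sketches of the difference (module names as in today's libraries;
the 1998 module names differed):

    import Control.Concurrent          -- forkIO, MVar
    import Control.Parallel (par)      -- GpH's par

    -- Concurrent Haskell: explicit, I/O-performing threads
    chatty :: IO ()
    chatty = do
      v <- newEmptyMVar
      _ <- forkIO (putMVar v "hello from the other thread")
      msg <- takeMVar v
      putStrLn msg

    -- GpH: advisory parallelism; the result stays deterministic
    pSum :: [Int] -> Int
    pSum xs = left `par` (right `seq` (left + right))
      where
        (as, bs) = splitAt (length xs `div` 2) xs
        left     = sum as
        right    = sum bs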


I'd like to implement a sort of integrated system.  Certainly it
would be nice to be able to make Concurrent Haskell into Distributed
Haskell. To do this MVars would need to work across processors
(not hard), and forkIO would need a variant that said "fork this
process on this other host".  The goal would still not be performance-through-
parallelism; rather, it is part of the *specification* of a distributed
program that its different parts should execute in different places.

Perhaps one could also have some combination of DH and GpH; after
all they use similar underlying mechanisms.

I believe that Kevin Hammond, Phil Trinder and Rita Loogen are planning
to do something like this, but I'm afraid it's not directly in my 
sights!

Einar Karlsson ([EMAIL PROTECTED]) has been doing a 
huge application in Concurrent Haskell so he may have other comments.


I hope that helps to clarify where things are at from the implementation
point of view.  One way to get us hackers to do things is to 
say loud and clear what would actually be useful to you.

Best wishes

Simon





Re: laziness and functional middleware

1998-06-17 Thread Simon L Peyton Jones


Alex,

  main = do  input <- getContents
             putStr $ addTwo $ makeLines input
 
  addTwo lines = ask1 ++ (ask2 (Strict x)) ++ (result (Strict y))
      where   x:y:xs    = map read lines
              ask1      = "Enter an Integer: "
              ask2 _    = "Enter another Integer: "
              result _  = "The sum is "++show (x+y)++"\n"
 
  data (Eval a) => Strict a = Strict !a
  makeLines text =
   (takeWhile ('\n'/=) text): (makeLines $ tail (dropWhile ('\n'/=) text))
 
 But this code doesn't work.  It prints all the text and then waits for
 input.  Shouldn't laziness guarantee that addTwo doesn't print "Enter
 another integer" until the user enters the first integer?  (that is what
 the file copy example in the tutorial implies) 

What is "Strict".  If you said (x `seq` ask2) instead 
of (ask2 (Strict x)) then you should get the behaviour you expect.
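
Spelled out, the suggested change to addTwo looks like this (a sketch;
I've also given it a type signature):

    addTwo :: [String] -> String
    addTwo ls = ask1 ++ (x `seq` ask2) ++ (y `seq` result)
      where
        x : y : _ = map read ls
        ask1      = "Enter an Integer: "
        ask2      = "Enter another Integer: "
        result    = "The sum is " ++ show (x + y) ++ "\n"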
There's a critique of various I/O models in 

%A P Hudak
%A RS Sundaresh
%T On the expressiveness of purely-functional I/O systems
%R YALEU/DCS/RR-665, |DCS|, Yale University
%D March 1989

I don't know whether it got published anywhere.


 For example I am writing an application that handles HTTP transactions 
 and uses a database backend.
 Ideally, I would like to write cgifunctions of type:
 
 myCGIFunction :: [HTTPRequest] -> [DatabaseVersion] ->
                      ([HTTPResponse],[DatabaseChanges])
 
 HTTPRequests come from _middleware_ that receives http requests from the
 httpd and appends them to a list. Attempts to get the next item in the
 HTTPRequest list block until requests come in to fill the list.
 
 DatabaseVersions come from a driver that appends a new txnBegin object
 to a list.
 
 The program would look something like:
 main = do 
   (dbSource,dbSink) <- dbConnect "ODBC: someodbcurl"
   (httpSource,httpSink) <- apacheConnect "urlToListenOn"
   (httpResponses,dbUpdates) <- return $ myCGIFunction httpSource dbSource
   dbSink dbUpdates
   httpSink httpResponses

I gather that HTTPRequests and DataBaseVersions arrive 
asynchronously and unpredictably.  So how does myCGIFunction know
which list to probe first?  Whichever it chooses, it may block on
one while there is data on the other. Unless I'm misunderstanding.

You need to be able to say "wait until there's input available
on one of these two ports, and tell me which".  Something like
the Unix select call.  GHC's Posix library has such a thing,
but you can't use it in the way you describe because myCGIFunction
is a pure function on lists.


So far as your dbSink is concerned, presumably what you have in mind is that
(dbSink dbUpdates) spins off a concurrent process that pulls on dbUpdates and
sends them to the database.  Concurrently, httpSink is doing the same.


In short, lots of concurrency and non-determinism.  Fine!  That's
what Concurrent Haskell is for.  You can find a paper saying what
Concurrent Haskell is intended for.
http://www.dcs.gla.ac.uk/~simonpj/papers.html
(under "monads, state, and concurrency").

I hope this helps somewhat.

Simon





Re: Garbage Collection in GreenCard/RedCard/HaskellCOM

1998-06-17 Thread Simon L Peyton Jones

 I just reread Dima's answer to my query about the database access in
 particular and am confused.  Dima says that he can't allow queries
 outside the IOMonad because he has to worry about freeing memory (query
 output).  
 
 However, Haskell/Com (built on top of Greencard?) seems to be able to
 propagate garbage collection information from Haskell to C so that
 when a Haskell/COM Object is no longer in use, there is some functionality
 that decrements its reference counter automatically.
 
 How does this work?  And can dima use this mechanism to free queries
 when they are no longer needed? ..allowing database queries outside
 the IOMonad?  

I think all you need is something like this:

query :: String       -- The query
      -> [Result]     -- The result

query q = unsafePerformIO (do {
            server <- connect "foo.bax.com";
            result <- sendQuery server q;
            disconnect server;
            return result
          })

A use of unsafePerformIO :: IO a -> a   is a promise that the
enclosed action is indeed a function.  I.e. the proof obligation
is yours.  That's what you say you want here.

And, yes, the garbage collection should be OK.

Simon







Re: UnHappy 55 tuple [was: Re: PrelTup]

1998-06-16 Thread Simon L Peyton Jones

 Hi Sigbjorn,
 
 I had another look at the 55 tuple problem.  And this time,
 I could find a short program that produces the error.

...

 I had a look at `PrelTup.lhs' and it seems to define tuples
 up to 37 tuples or so.  Maybe this is the problem.
 
 What shall I do?  Hack `PrelTup.lhs'? (Doesn't seem very
 attractive...)

Yes, that's the problem.  I've often thought of making tuples
more generic, but it never seems worth the bother (which is,
believe me, non-trivial).  I think the simplest solution would
be to transform big tuples into nested smaller tuples... but
I haven't gotten around to doing it.  You could do it yourself
of course, or hack PrelTup...

Sorry about this.  I hate it when compilers embody arbitrary limits.

Simon




Re: FW: Exceptions are too return values!

1998-06-16 Thread Simon L Peyton Jones


 I thought about this problem some more, and I have realized that the
 problem of nondeterminacy for Haskell exceptions would in fact be
 considerably worse than I had previously considered.  The trouble is
 that in the general case the problem is not just that the choice of
 which exception is raised is nondeterministic -- instead, it would be
 much worse: the choice of whether you raise an exception or loop
 forever can be also be nondeterministic.  This occurs because of
 expressions such as `0/0 + loop'.  Or, to take a more realistic (and
 nasty) example, `f 0' where `f x = 1/x + g x' where `g x' happens to
 loop if `x' is zero.

I don't agree that this is a problem.  If (g x) loops when x is zero
then you should jolly well test for that:

f x | x == 0    = raise "x is zero"
    | otherwise = 1/x + g x


I simply don't think it's reasonable to completely prescribe
the evaluation order of a lazy functional program.

At the moment, Haskell has the fiction that a divide-by-zero
exception and non-termination are the same value, i.e. bottom.
That allows us to say that the behaviour of

f x = 1/x + g x

is identical regardless of whether "+" evaluates its first
argument first or second.  But we all know that the behaviour
in these two cases is quite different: one prints a message and
halts, and the other fails to terminate.  So in this sense
the behaviour of Haskell programs is already non-deterministic.

The nice thing about the NDSet story is that it makes clear
precisely where the non-determinism occurs.  Equational reasoning
is not impaired, nor is the implementation penalised.  I think
it's a great idea.

So I appear to be in disagreement here with Alex, Amr, and Fergus about
the importance of being able to say precisely which exception is raised.
I'm quite content with knowing which *set* of exceptions can be raised.
Ha!

Simon






Re: FW: Exceptions are too return values!

1998-06-11 Thread Simon L Peyton Jones

 I was keeping quiet myself, because I am planning to write
 a paper touching on this topic.  But the cat seems to be
 mostly out of the bag now, so I might as well pipe up.

I'm glad you did.  That's a neat idea.  I'm familiar
with the NDSet idea -- that's in the Hughes/O'Donnell
paper that Kevin cited.  The new thing you add is
using the NDSet for the *exceptions*, rather than
for the "main value".  (It would be hopeless for every function
that could raise an exception to get an NDSet in its result
type, and hence require NDSet ops to manipulate it.)

I'll need to think more about this.  Have you got a paper on
the way?

Simon





Re: FW: Exceptions are too return values!

1998-06-11 Thread Simon L Peyton Jones


 Just to reiterate.  I strongly urge you to ensure consistent exception
 behavior.  As a matter of course, two different compiles should not result
 in two different programs. 

One of the wonderful things about functional languages is that they
do not prescribe the order of evaluation.  To achieve the effect you
want would require us to completely prescribe that order, with
very bad effects on efficiency.  For example, consider

f :: [Int] -> Int

Suppose that an analyser figures out that f evaluates every
element of its argument list to produce its result.  Then it
is OK for the producer of the list to evaluate those thunks
right away, rather than building thunks for f to evaluate.

But if we are required to ensure consistent choice of exception
values then we can't do that any more, because the producer
might evaluate the thunks in a different order to f.

This is a big issue for a lazy language.

I really think that the thing to do is leave it unspecified
which exception is chosen.  In practice, only changing the
compiler's optimisation level is likely to change the program's
exception behaviour.

Simon


 As a matter of course, should we assume that these extensions
 (exceptions, existentials) will become part of Haskell or are they just
 part of GHC?  Will they be part of Hugs?

Hugs and GHC will be consistent.  Whether it's a feature deemed
worthy of being Officially Incorporated into Haskell is not
something we'll know for a while.  It's much more likely
to be so incorporated if it's implemented and found useful, though.








Re: FW: Exceptions are too return values!

1998-06-10 Thread Simon L Peyton Jones


Alastair Reid has been very quiet, so I'll pipe up for him.

Here's a reasonable design for exceptions in Haskell:

* A value of Haskell type T can be
    EITHER one of the values we know and love
           (bottom, or constructor, or function, depending on T),
    OR     it can be a set of exceptional values.

* raise :: String -> a
  (raise s) returns a single exceptional value, named by string s

* All strict operations (case, +, etc) return the union of
  the exceptional values returned by their strict arguments.
  For example, if both arguments to "+" return an exceptional value
  then "+" returns both. Similarly, any strict context.  

* handle :: (String -> IO a) -> IO a -> IO a
  (handle h a) tries to perform the action a.
  If doing so delivers a set of exceptional values then
  apply the exception handler h to the string that names
  one of them.  It is not defined which of the exceptional 
  values is picked.


The neat thing about this is that the exceptions can
be *raised* in arbitrary purely functional code, without
violating referential transparency.  The question of
which exception is chosen is done in the IO monad, where
of course it is allowed to be non-deterministic.
The implementation does not keep sets of exceptional values,
of course.  It simply propagates the first one it trips
over to the nearest enclosing handler.

(It is likely that successive runs will actually give
the same behaviour, but recompiling the program with
(say) different optimisation levels might change the order
of evaluation, and hence change which exception is tripped
over first.)
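
A small usage sketch of the proposed interface (not something you can
compile today):

    safeDiv :: Int -> Int -> Int
    safeDiv _ 0 = raise "divide by zero"
    safeDiv x y = x `div` y

    main :: IO ()
    main = handle (\msg -> putStrLn ("caught: " ++ msg))
                  (print (safeDiv 1 0))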

We're implementing an experimental version of this
in GHC, integrated with the IO monad exceptions, so that

handle :: (IOError -> IO a) -> IO a -> IO a

and we add an extra constructor (UserError String) to the
IOError type for exceptions raised by raise.

Calls to "error" also show up as an exceptional value, of
course.

One merit of the system is that it chops out a tremendous
number of run-time error checks in the IO monad, since
we are now free to implement the mechanism with standard
stack-unwinding techniques.  Result: much better I/O performance.


I'd be interested to know what people think of this.

Simon






Re: circular module imports

1998-06-09 Thread Simon L Peyton Jones


Alex,

If I were you I'd dispense with "deriving(Read,Show)" in module Publisher,
and add an explicit instance for Read/Show on Publisher in PublisherDB.
That would solve your circularity problem.
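
A sketch of what that might look like (the instance bodies and the
lower-case yahoo/doubleClick names are mine, just to show the shape):

    module PublisherDB where

    import Publisher
    import Yahoo
    import DoubleClick

    publishers :: [(String, Publisher)]
    publishers = [("Yahoo", yahoo), ("DoubleClick", doubleClick)]

    instance Show Publisher where
      showsPrec _ p = showString (name p)

    instance Read Publisher where
      readsPrec _ s = [ (p, rest) | (nm, rest) <- lex s
                                  , Just p <- [lookup nm publishers] ]
    -- Read/Show for Registration (no longer derived) would live here too.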

Haskell does permit mutually recursive modules, but Hugs does not support
them, and GHC requires some help (described in the user manual; basically
you need to give it an "interface file" to get started).

Simon

 I wanted to represent a reasonably small list of Publishers.
 So I created:
 
  module Publisher where
  data Publisher = Publisher { name::String, func::Registration->Bool}
  data Registration =
      Registration { data::String, pub::Publisher} deriving (Read,Show)
 
 Notice that the Publisher data type includes functions so I can't just
 store these publishers as tuples in a database.  Instead I store
 information on each publisher in a file e.g. Yahoo.lhs CNN.lhs
 DoubleClick.lhs
 
  module Yahoo where
  import Publisher
  Yahoo = Publisher "Yahoo" (\x - x)
 
 Another file, PublisherDB.lhs, imports each of these 
 
  module PublisherDB where
  import Yahoo
  import DoubleClick
  publishers = [("Yahoo",Yahoo),("DoubleClick",DoubleClick)]
  getPublisher :: String -> Publisher
  getPublisher name = lookup publishers name
 
 The trouble I have is that Registration requires me to define
 read/show for Publisher and read/show requires access to getPublisher
 which leads to a circular import problem:
 * getPublisher needs to be defined in PublisherDB because it requires a
 complete list of Publishers.  
 * PublisherDB imports each individual publisher (e.g. Yahoo, DoubleClick)
 Individual publishers import Publisher
 
 For literate programming reasons, as well as general program modularity
 reasons, I would prefer not to merge all of these into a single large
 file.
 
 Is there another way?
 
 
 -Alex-
 
 ___
 S. Alexander Jacobson i2x Media  
 1-212-697-0184 voice  1-212-697-1427 fax






Re: classes and instances

1998-06-08 Thread Simon L Peyton Jones


 data Weirder a b = Weirdest a b

 class Weird c where
     f1 :: c -> c
     f2 :: Weirder a b -> c -> Weirder a b
     f3 :: Weirder a c -> Weirder c a
     f4 :: c -> c -> Bool

 instance Weird (d,e) where
     f1 (x,y) = (x,y)
     f2 w (x,y) = Weirdest x y
     f3 (Weirdest x y) = Weirdest y x
     f4 (x,y) z = True
 
 The complaint is:
 
 ERROR "x.hs" (line 11): Declared type too general
 *** Expression: f2
 *** Declared type : Weirder a b -> (c,d) -> Weirder a b
 *** Inferred type : Weirder a b -> (a,b) -> Weirder a b
 
 It is complaining that in the class, the type of the second
 parameter to function f2 has a simple type "(c,d)", which is unrelated
 to the type of the arguments to the constructor Weirdest, namely a and b.

That's exactly the complaint.  

 In the instance, the type of the second argument to f2 must be the same
 types as the arguments to the constructor Weirdest. But this is exactly what
 I want, I want the function in the instance to be valid over a more specific
 type than in the class. This seems to have worked for f1, f3 and f4, but
 not for f2. What am I missing here? Is there a way to define this?

In f1, f3, f4, the way in which the instance is "more specific" is that a type
variable (c, in the class decl) is instantiated by a type.  So for f1, the
class decl said
        f1 :: c -> c
and the instance decl instantiated c to (d,e), giving the type
        f1 :: (d,e) -> (d,e)
And that is indeed the type of the f1 you give in the instance decl.  Ditto
f3, f4.  But for f2, if we instantiate "c" we get

        f2 :: Weirder a b -> (d,e) -> Weirder a b

You correctly say that you want a still more specific type.   Without
understanding your application better I am nervous about recommending
this, but you *can* get what you say you want by using a multi-parameter class

    class Weird a b c where
       ...
       f2 :: Weirder a b -> c -> Weirder a b
       ...

and now use the instance

instance Weird d e  (d,e) where
 f2 w (x,y) = Weirdest x y

But multi-parameter type classes aren't a panacea.  It may be that there's a
more direct way to achieve what you really want.

Simon







Re: data or class inheritance

1998-06-05 Thread Simon L Peyton Jones


 I have a base class,Organization, with name and address functions.
 I want classes Buyer and Seller from Organization.
 
 Now if I want to create 2 instances of Seller
  data Yahoo = Yahoo
  instance Organization Yahoo where
   name _  = "Yahoo"
   address = ...
 
  data DoubleClick = DoubleClick
  instance Organization DoubleClick where
   name _ = "DoubleClick"
   ...
 
 Why is [Yahoo,DoubleClick] illegal?  

It's illegal because we can't give it a type.  It hasn't got
type [Yahoo], nor [DoubleClick].  What you want is to say 
"it's a list of things in class Organization, and that's all I will
ever ask of them".  In hbc (and soon in GHC) you can say:

data OrganizationThing = forall a. Organization a => MkOrg a

(Some people would write "exists" instead of "forall", and that's
reasonable.  But the type of MkOrg really is

MkOrg :: forall a. Organization a => a -> OrganizationThing
)

Notice that OrganizationThing isn't parameterised.

Now you can say

foo :: [OrganizationThing]
foo = [MkOrg Yahoo, MkOrg DoubleClick]

Further, you can say, for example

toName :: OrganizationThing -> String
toName (MkOrg o) = name o

In effect, an OrganizationThing is anything that is in class 
Organization. The MkOrg constructor is a bit annoying (like newtype)
but no more.


Does that help?  I'm about to put existentials into GHC.  It's
been delayed while I ripped its entrails out; adding existentials
and then keeping types throughout the compiler caused a non-trivial
change that I took advantage of to perform a heart-liver-and-lung
transplant.

Simon






Re: cycles in class definitions

1998-05-29 Thread Simon L Peyton Jones


  class Fallible m where
    fail_   :: String -> m a
    rethrow :: Fallible n => m a -> n b
 
 GHC 2.10 (solaris) complains about this:
 
  Cycle.lhs:2: Cycle in class declarations ... `Fallible' Cycle.lhs:4
 
 
 Whereas Hugs (jan98) is fine with it.
 
 I didn't see anything in the 1.4 report which forbids such declarations,
 and haven't found anything in the ghc docs. 
 
 So, which system is doing the right thing? 

I think Hugs is.  I had better fix this.

Simon





Re: order of evalutation of ||

1998-05-29 Thread Simon L Peyton Jones

 
 If you have a statement like:
 
 result = a || b || c 
 
 does Haskell guarantee that a gets evaluated before b?
 If it does then I only have to protect against pattern match failure in
 one place, a.

Yes; if a is true, b and c won't be evaluated.  That's part of
the defn of ||
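
For reference, the Prelude definition makes the left-to-right, short-circuit
behaviour explicit:

    (||) :: Bool -> Bool -> Bool
    True  || _ = True
    False || x = x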

Simon





Re: C to Haskell

1998-05-12 Thread Simon L Peyton Jones

 Greencard allows Haskell to call C (or Corba).  Is there a way to give C
 code access to Haskell functions?

GHC does not yet allow this, but we are working hard on H/Direct,
a successor to Greencard, that will.  It'll also allow you to
seal up Haskell programs inside COM objects.  Timescale: a month
or two rather than a day or two.

Simon





Re: doubles-troubles

1998-05-12 Thread Simon L Peyton Jones


 rigid and I belong to the small legion of amateurs who implemented their
 own math. domain system, Rings, Fields, Modules, etc. This apparently
 has no chance to be included into the Haskell standard, nobody cares.

Standards develop because people who care about particular aspects
of them push them forward.  It is not true that nobody cares.  It may well
be true that nobody has time to make a well-documented and well-implemented
library.  You could change that.  [I mean you plural; readers of this list.]

To be specific, why don't you package it all up as a documented Haskell
library? We'd be happy to distribute it along with GHC, and I bet the 
Hugs team would too.  

After a while, if people liked it, it would become so popular it
would become part of the standard.

I'm serious.  We are busily developing libraries shared between GHC
and Hugs that aren't part of the official standard, but are at least
common between our two impls, and available to all other impls too.
ftp://haskell.org/pub/reid/libs980219/libs.html

Send us your code!

Simon





Re: Pattern Match Success Changes Types

1998-05-11 Thread Simon L Peyton Jones


Yes, GHC does some CSE stuff, but not very much. I don't think it has a large
performance impact, but (as luck would have it) I plan to work on it a bit
in the next few months.

My advice would be: write clear code, and let the compiler do the
CSE.  If it doesn't, complain to the compiler writers.   You have
good reason to believe that it should.

The exception is the case you've been discussing.  The type of Right
is not

   Right :: b -> Either _ b

(I don't know what "_" is) but rather

   Right :: forall a b. b -> Either a b

Since GHC keeps the types right through the compiler, it
really can't do CSE on two terms of type

Either Int  Int
Either Bool Int

even if they are both applications of Right.
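
A sketch of why (the @-applications are just notation here for the type
arguments the compiler carries around):

    e1 :: Either Int  Int
    e1 = Right 3               -- internally:  Right @Int  @Int  3

    e2 :: Either Bool Int
    e2 = Right 3               -- internally:  Right @Bool @Int  3
    -- The type arguments differ, so the two right-hand sides are not
    -- the same term, and there is nothing for CSE to share.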

Actually, GHC does finally discard type information right at the
end, so we could do an extra bit of CSE there, but frankly I doubt
it would buy very much.  But I'm willing to stand corrected.

Incidentally, I don't think it would be sensible to change
the type system to allow the 

 demo1 :: (a -> b) -> Either a c -> Either b c
 demo1 f (Left  a)   = Left (f a)
 demo1 _ r@(Right c) = r

What type does r have?  Either a c.
What type does the result of the fn have?  Either b c.
Different types.  It would be hard (I believe) to specify crisply
when it was legitimate for two terms with different types to
be as'd together.

Simon





Re: Binary, Conversions, and CGI

1998-05-06 Thread Simon L Peyton Jones


 To the newcomer who is not part of the FP academic community, this all
 makes life sort of difficult.   These differences seem larger than the
 differences among C compilers and are MUCH larger than the differences
 among Java compilers.   I have been trying to learn Haskell and have been
 impressed with both its elegance and the way it allows me to write code
 that works on the first try (or two).

These are fair comments.  In my view the things that are holding back
Haskell are not language issues at all.  They are

- foreign language interfaces
- libraries
- programming environments

There are a bunch of people working on all of these, but the results
are not yet stable.  In particular

* On foreign language interfaces, we have now been through several iterations
  of Green Card, and are poised for what I hope is our final "take",
  currently called H/Direct.  There's a paper on my Web page, and an
  implementation on the way.  http://www.dcs.gla.ac.uk/~simonpj
  [I say "we" meaning "Glasgow plus many
  colleagues", notably Yale, York, Utrecht.]

  Foreign language interface implementations have certainly been a 
  moving target.  It would be better to get it "right" first time,
  but worse to have "standardised" prematurely.

* On libraries, there is quite a lot going on.  GHC and Hugs have some
  more or less standardised libraries going well beyond the Haskell 1.4
  set.  The Binary library is a hot topic.  E.g. Glasgow and York had
  a full-day meeting on this very topic two weeks ago; that should lead
  to a library proposal (based on the current York story) 
http://www.cs.york.ac.uk/fp/nhc13/binary.html
  for the world to discuss/critique, and thence to implementations
  for GHC and Hugs as well as NHC.

  As ever, these things take time.

* On programming environments, we are working hard on integrating GHC
  and Hugs.  It's coming, promise.


Of course, all this just describes stuff I know about.  There are probably
other people beavering away on good stuff; please sing out!


 However, I am not a researcher.  I do commercial software development and
 need some documentation and stability.  I understand that everything here
 is fairly new, but it would be really useful if someone would post a
 summary of the FP community politics for those of us outside the research
 community:
 * What versions of Haskell should someone outside the research community be
 using, e.g. is GHC ver x particularly experimental?  Is Haskell as a
 whole really a research language and not ready for commercial
   (e.g. high intensity server based) applications?

Hmm.  GHC 2.10 is rather solid.  GHC 3.01 is more experimental.
http://www.dcs.gla.ac.uk/fp/software/ghc/

I would love for Haskell to be used for high-intensity applications,
but before betting my company on it I'd protype my application.
Our research priorities these days are heavily influenced by
what people say they need; so talk to us.

 * Any particular language features that we should avoid or that are
 likely to change?

In Haskell 1.4 the main proposed simplification is that [x | x <- xs]
will mean a *list* comprehension rather than a *monad* comprehension.
I think the libraries are much more volatile than the language.

 * Who "owns" haskell?  How are decisions made about what goes in/out of
 the language? Who controls the content of Haskell.org?

The whole thing is pretty informal.  There is no "official" standardisation
body etc. Instead, there is a working group chaired by John Hughes,
which is trying to develop "Standard Haskell".  More info at
http://www.cs.chalmers.se/~rjmh/Haskell

The discussion is readable by all, but only writable by the working
group.  This is admittedly only semi-democratic; but it is a way 
of making progress.

As Will says, I think the whole scene is pretty exciting. The Haskell community
works together well; and the focus is increasingly on making a *usable*
language. The main difficulty is simple lack of available effort: anyone who
wants to contribute to any of these things, speak up!

All of this reflects my own personal views; I don't claim to speak
for anyone else.

I hope this helps.

Simon






Re: Instance declaration superclasses

1998-04-27 Thread Simon L Peyton Jones

  GHC complains about:
 
   class (Monad m, {-, Monad (t m)-}) => MonadT t m where
 lift :: m a -> (t m) a
  
   instance (Monad m) => Monad (EnvT env m) where
 ...
  instance Monad m => MonadT (EnvT env) m where ...
 
 
 Even when I use the "decent" definition you suggest, GHC duplicates the
 context:
 
 8 $d2 _:_ _forall_ [a b :: (* -> *)] {PrelBase.Monad b, PrelBase.Monad b} =>
 (MonadT (EnvT a) b) ;;

Aha!  A palpable bug!

GHC implements an optimisation suggested by Mark. Consider

class S a => D a where ...

instance D a => D [a] where ...

The instance declaration gives rise to a function

$d :: D a -> D [a]

that takes a dictionary for (D a) into one for (D [a]).  The optimisation
is to make $d do this:

$d :: D a -> S [a] -> D [a]

that is, to pass a dictionary for the superclass too.  That can increase
sharing of dictionaries.
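
To make the optimisation concrete, here is a rough sketch of the dictionary
translation, writing dictionaries as ordinary records; the names SDict, DDict,
dD_list and dD_list' are invented for the illustration and are not what GHC
actually generates:

    data SDict a = SDict { sOp :: a -> a }            -- dictionary for S
    data DDict a = DDict { superS :: SDict a          -- superclass dictionary
                         , dOp    :: a -> Int }       -- a method of D

    -- plain translation of  instance D a => D [a]
    dD_list :: DDict a -> DDict [a]
    dD_list d = DDict { superS = SDict (map (sOp (superS d))), dOp = length }

    -- with the optimisation, the superclass dictionary for [a] is passed in,
    -- so it can be shared rather than rebuilt
    dD_list' :: DDict a -> SDict [a] -> DDict [a]
    dD_list' _d s = DDict { superS = s, dOp = length }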

In the single-parameter case, it's impossible for D a and S [a] to be
the same.  But in this multi-parameter case they can be, and the duplicates
aren't eliminated.  Hence the error.


I don't see a workaround.  I'll just have to fix it...
How urgent is it?

Simon




Re: Instance declaration superclasses

1998-04-27 Thread Simon L Peyton Jones

 (This is related to my previous post concerning monad transformers.)
 
 GHC complains about:
 
  class (Monad m, {-, Monad (t m)-}) => MonadT t m where
lift :: m a -> (t m) a
 
  instance (Monad m) => Monad (EnvT env m) where
...
 
  instance (Monad (EnvT env m)) => MonadT (EnvT env) m where
...
 
 saying that it cannot deduce (Monad m) to satisfy the second instance
 declaration.  But it should be able to pick up that fact from the first
 instance declaration, shouldn't it?
 
 Indeed, if I add (Monad m) to the context explicitly:
 
  instance (Monad m, Monad (EnvT env m)) => MonadT (EnvT env) m where
...
 
 then GHC gives a warning when I import this module because the assumption
 appears twice in a class context in the .hi file:
 
   Duplicated class assertion `Monad b' in the context:
 (Monad b, Monad Env.EnvT a b, Monad b)


OK, there are two things going on.  First, GHC is meant to implement
the restriction that the context in an instance decl can constrain
only type variables (Choice 6b).  So this should be illegal:

instance (Monad (EnvT env m)) => MonadT (EnvT env) m where ...

Instead, since Monad (EnvT env m) holds if Monad m holds, you should
write

instance Monad m => MonadT (EnvT env) m where ...

The complaint on import is because GHC didn't realise that it
hadn't checked for a decent instance context, and thereby generated
a bogus one in the interface file.


Choice 6b restricts instance decls so that context reduction is
guaranteed to terminate.  As ever, I'm interested in cases where
this prevents useful programs.  This time, there's no problem,
I think

Simon




Re: MPTC

1998-04-24 Thread Simon L Peyton Jones


 There's no description of the multi-parameter type classes extension in the
 3.01 user manual; are the extensions implemented as described in the link
 below?  (I vaguely recollect Simon posting a message on this subject more
 recently, but I couldn't find it.)
 
 http://www.cs.chalmers.se/~rjmh/Haskell/Messages/Display.cgi?id=171

Yes; except that -fallow-overlapping-instances does what it sounds
like. It will permit overlapping instances provided one completely
overlaps the other; e.g. instance C [a] and instance C [Int].

But instance C (a,Int) and instance C (Int,b) are not legal, because then we
would not know which instance to use for C (Int,Int).

Choice 7a
~
The context in a class declaration (which introduces superclasses)
must constrain only type variables.  For example, this is legal:

class (Foo a b, Baz b b) => Flob a b where...
but not
class (Foo [a] b, Baz (a,b) b) => Flob a b where ...

 I tripped across an example from the literature that might motivate relaxing
 this.  It's the monad transformer class from the monadic modular
 interpreters paper:
 
 class (Monad f, Monad (t f)) => MonadT t f where
   lift :: f a -> t f a

Thanks for the reminder about this.  Examples of where Choice 7a is
a really bad thing are still welcome!

Simon




Re: binary search

1998-04-17 Thread Simon L Peyton Jones


 Not to reject assertions (they would be welcome), but I think that you
 need something slightly different in a functional programming language.
 
 Assertions in procedural languages typically define system state before
 and after a particular function gets executed.
 
 State assertions are less appropriate to functional programming languages
 (except in the context of the content of monads but that is a separate
 issue).  As I understand it, Haskell programmers should use the type system
 to prevent functions from getting called w/ operands outside their domain.
 For example, a very careful programmer would specify that division is only
 allowed on a type that has already filtered out 0 
 e.g.  
  newtype NonZero = MakeNonZero Num
  toNonZero :: Num -> NonZero
  toNonZero x | x == 0    = error "Shouldn't be zero"
              | otherwise = MakeNonZero x

Using the type system is often entirely impractical, because
the things you want to check may not be statically checkable, at
least not within the computational language the type system provides.

Inside GHC we make extensive use of assertions.  In the example you give
we might say

divide :: Int -> Int -> Int
divide a b = ASSERT( b /= 0 ) a `div` b

The ASSERT expands to

if not (b /= 0) then error "Assertion failed on line 27 of Foo.hs"
else a `div` b

In Haskell 2 I think assertions will be standard.  (So far we have
used the C preprocessor to add them, but they should jolly well be
in the language.)
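
Until then, a plain-Haskell approximation is easy to write; a minimal sketch
(assertThat is a made-up name, and unlike a CPP macro it cannot simply be
compiled away):

    assertThat :: String -> Bool -> a -> a
    assertThat msg cond x
      | cond      = x
      | otherwise = error ("Assertion failed: " ++ msg)

    divide :: Int -> Int -> Int
    divide a b = assertThat "divisor must be non-zero" (b /= 0) (a `div` b)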

It's *very* important to be able to turn all assertion-checking off, easily.
Then programmers will write expensive assertion checks 
ASSERT( length xs < 100 )
ASSERT( isBalanced tree )
which they would never write if they thought these checks formed
a permanent part of their code.

In our experience assertions are quite a potent debugging tool.
Note that they can check post-conditions as well as pre-conditions:

f x y = ASSERT( pre x y && post x y r ) r
  where
r = ...body of f...

Simon





Re: binary search

1998-04-16 Thread Simon L Peyton Jones


 2. how would I have found/fixed such an error in a more complex function
   w/o assertions and w/o print statements?

Good questions

There was a proposal to put assertions into Std Haskell, which we
have implemented in GHC.  (I'm not sure we've yet put that version out
though.)  So assertions are definitely coming. You are right that they
are important.

Finding infinite loops in functional programs is quite hard.  Colin
Runciman and Jan Sparud have been working on "redex trails".  We
plan to help at least identify where the loop is by putting out
cost-centre-stack information... but that's still in the works.

Simon





Re: Multiple Parameter Class in Hugs -- Please!

1998-04-06 Thread Simon L Peyton Jones


  infixl 7 *$
  infixl 6 +$, -$
  class Ring a where
   (+$), (-$), (*$) :: a -> a -> a
   negateR :: a -> a
   fromIntegerR :: Integer -> a
   zeroR, oneR :: a
 
 It's particularly irritating having to use many of the Num methods and   
 therefore having to give them different names.

Suggestion: use qualified imports:

module Structures where
import Prelude hiding( Num(..) )

  class Ring a where
(+) :: a -> a -> a
...etc...

  instance Ring Integer where
(+) = (Prelude.+)
...etc...

When you import Structures elsewhere you must either import it
qualified, or else hide Num(..) from the Prelude, so that "+"
always means exactly one thing.
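
For example, a client module might look like this; a minimal sketch, assuming
Structures exports the Ring class above (the Client module name is made up):

    module Client where
    import Prelude hiding ( Num(..) )
    import Structures

    double :: Ring a => a -> a
    double x = x + x      -- this (+) is the Ring method, not Prelude.+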

I'm not sure whether this module jiggery-pokery works in Hugs,
but it does in GHC.

Simon





Re: Binary files in Haskell

1998-03-11 Thread Simon L Peyton Jones


 Real world example: development tools process a large geometric data set to
 build a run-time optimized BSP tree with precalculated lighting and
 collision information.  The user application will not modify this data, but
 it will have to load it dynamically without slowing down a 30Hz
 graphics/interaction loop. (Apologies to Quake and Mario 64 :)
 
 Solution in C: load the data as contiguous raw bytes and cast pointers into
 the block to the appropriate structure types.
 
 Solution in Haskell?

Malcolm Wallace and Colin Runciman at York have been working
on this kind of thing.  They had a paper at the Haskell workshop
last year (which is on Malcolm's home page 
http://www.cs.york.ac.uk/~malcolm/).  I believe they are near to
releasing a revised version.  

It only works for nhc at the moment, but if enough people like you yell loudly
enough (construe this positively!) then we'll put it into GHC/Hugs too.
Our priorities are largely driven by what people ask for.

Simon






Re: type tags rejected

1998-03-02 Thread Simon L Peyton Jones


 My problem is about tagging haskell expressions with the types I want
 them to have. What is the compiler error "Can't for-all the type
 variable(s) `wef' in the inferred type `CONS ei f7t34 wef'" about? I
 mean, when I get it, how can I get more information about what went
 wrong? (I am thinking of something like a compiler switch that makes
 ghc produce a file where every expression in the input code is tagged
 with the inferred type.)

Here is a simpler version of your difficulty:

f (x:xs) y = g y
   where
 g :: a -> [a]
 g q = [x,q]

The type signature for g is wrong.  Why?  Because it is short
for
g :: forall a. a -> [a]

[A Haskell type signature *always* has a forall for all the
type variables it mentions.]

But g mentions "x", so it's just not true that g works for all
types a; it only works for those of x's type.  In fact
Haskell as it now stands makes it impossible for you to write
the type signature for g.  That's something that'll be fixed in 
Standard Haskell. The only question is what syntax to use.  Here's
the one I favour

f :: [a] -> a -> [a]
f (x:xs) y = g y
   where
     g :: a -> [a]
     g q = [x,q]

The type signature for "f" brings the type variable "a" into
scope in the body of "f".  So now the type signature for "g" 
doesn't have a forall, as you'd expect.
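
In the meantime, one workaround that is legal today is to drop the local
signature and pin the types together with asTypeOf; a minimal sketch:

    f :: [a] -> a -> [a]
    f (x:xs) y = g y
       where
         g q = [x, q `asTypeOf` x]   -- no signature needed; asTypeOf ties q to x's type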

Simon





Re: Ambiguity

1998-02-25 Thread Simon L Peyton Jones


 Except that this isn't actually allowed because happyReduce_1 appears
 in a restricted binding group, and hence can't have any context in its
 type.  So, at this point, Hugs complains.  GHC, I assume, just assigns
 hugsReduce_1 a monomorphic type, only to find at some later point that
 the list instance of Functor (and only that instance) are required to
 make things type check.
 
 Hugs is perhaps too eager in complaining, but it can't assign anything
 other than a fully polymorphic type to top-level terms, so it doesn't
 really have much choice.  For now, the simplest fix would seem to be to
 use a non-overloaded map function.

... or to give an argument to happyReduce_1, which would
mean it wasn't restricted any more...

I think that's an accurate diagnosis. The interesting questions are:

a) which compiler is implementing the language definition?
b) is the language definition "right"?

A quick look at 4.5.5 of the 1.4 report suggests that GHC
is implementing the language definition and Hugs is not.
The monomorphism restriction simply says you can't generalise
that type variable.  The report gives examples that make
it clear that you are expected to be able to use the monomorphic
definition at any one type in the module.
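
For instance, here is the sort of binding the restriction bites; a minimal
sketch (the names are made up):

    twice = fmap (*2)       -- no arguments, no signature: a restricted binding

    useIt :: [Int] -> [Int]
    useIt = twice           -- this use fixes  twice :: [Int] -> [Int]  for the whole module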

Whether this "monomorphism restriction" is the "right" language spec is hotly
debated. But it seems fine to me.

Simon





Multi-parameter type classes in GHC 3.01

1998-02-25 Thread Simon L Peyton Jones


 PS. Could somebody inform me what is the current status of
 multi-parametric classes?


GHC 3.01 supports multi-parameter type classes
in more or less the form described in the last section of
"Type classes: an exploration of the design space"
(http://www.dcs.gla.ac.uk/~simonpj/multi.ps.gz)


We only mentioned this elliptically in the announcement, but
it should really have been a headline!  (The reason is that
we'd previously released 3.0 to friends, so we'd forgotten
that not everyone knew...)

It now seems very likely that Standard Haskell will support
multi-parameter type classes, and GHC 3.01 is our attempt
to gather feedback about whether we are moving towards
the right set of design choices.

The main limitation at the moment is

no overlapping instance declarations

There are ways to lift this restriction, but it's a conservative
first choice.
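
To give the flavour of what now goes through, here is a minimal sketch (the
Collection class and its instance are invented for the example), compiled
with -fglasgow-exts:

    class Collection c e where
      empty  :: c e
      insert :: e -> c e -> c e
      toList :: c e -> [e]

    instance Collection [] Char where
      empty  = []
      insert = (:)
      toList = id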

Please try it out, and let us know if you trip over
(a) bugs
(b) obscure error messages
(c) design restrictions that cramp your programming style

The URL is

http://www.dcs.gla.ac.uk/fp/software/ghc/

Simon





Want a job?

1998-02-24 Thread Simon L Peyton Jones


I'd be delighted if a programming-language-aware person applied for this
(tenured) post.  Deadline 13 March.

Simon

Lectureship in Computing Science
University of Glasgow

The University invites applications for a permanent lectureship in the
Department of Computing Science.

The Department has demonstrated the highest standards in both teaching and
research.  We are actively developing new courses, at both undergraduate
and postgraduate levels, and have a thriving research community.
Applicants must possess the experience, enthusiasm and ability to enhance a
vigorous research environment, together with the creativity and energy to
contribute to our educational excellence.

Applicants will be considered from any area of computing science, but
preference will be given to those who match our teaching commitments and
who complement or strengthen our research interests.

The appointment will be on the Lecturer A or B scale (£16,045-£27,985
p.a.), at a point determined by age, experience, and ability.

Informal enquiries may be made to the Head of Department, Mr Ray Welland,
e-mail [EMAIL PROTECTED]  Information about the Department and further
information about this post may be found on WWW at
http://www.dcs.gla.ac.uk/announce/.

Further particulars may be obtained from the Personnel Office, University
of Glasgow, Glasgow G12 8QQ, Scotland (tel. 0141 330 6094, fax 0141 330
4921).  Applications (eight copies from UK applicants, one copy from
overseas applicants) must be sent to this address by 13 March 1998.  Each
application must consist of: a curriculum vitae, including a list of
publications and a covering letter explaining why you wish to be considered
for the position, a brief statement on the state of your health and details
of any period of probation served elsewhere and any relevant training.  You
should also submit a personal information form (available from our Web
site) including the names and addresses of three referees.  In all
correspondence please quote reference number 060/98AL.








Ambiguity

1998-02-23 Thread Simon L Peyton Jones


[Comment to Simon Marlow:
Does happyReduce_1 really need *eight* type parameters?
Why not output type signatures?
]

I respectfully suggest that the enclosed is a bug in Hugs.  
GHC gets the following type for happyReduce_1, which I think
is the correct type:

happyReduce_1 _:_ _forall_ [a b c d e f g]
=> PrelBase.Int
-> a
-> b
-> [HappyState a ([HappyAbsSyn (e -> PrelBase.Double) 
   [(d, e -> f)]
   g
  ] -> c
 )
   ]
-> [HappyAbsSyn (e -> PrelBase.Double) [(d, e -> f)] g]
-> c ;;

This is enough to deduce that
the second type parameter for HappyAbsSyn is "list of something"
and hence enough to deduce which instance of Functor is needed.

I'm not surprised that making action_0 self-recursive
makes the program really ambiguous, because we lose the type clue
in happyReduce_2 that the second type parameter is a list.

Simon


=
Forwarded Message

Date:Fri, 13 Feb 98 17:13:41 +0100
From:Jon Mountjoy [EMAIL PROTECTED]
To:  Haskell Bugs [EMAIL PROTECTED]
Subject: Typing (Hugs/GHC)

Hello,

While playing with Happy I managed to generate a Haskell program
which compiles fine under ghc but not under Hugs.  I don't know which
one is the culprit

In Hugs(January 1998), one gets

 ERROR "hugs.hs" (line 32): Unresolved top-level overloading
 *** Binding : happyReduce_1
 *** Outstanding context : Functor b

where line 32 is the one marked -- ##

It compiles in ghc-3.00.  Changing very small things, like the
 line marked <---* to 
   action_0 (6) = happyShift action_0 <---

then makes ghc produce a similar message:

   hugs.hs:37:
   Cannot resolve the ambiguous context (Functor a1Ab)
   `Functor a1Ab' arising from use of `reduction', at hugs.hs:37

I am afraid that I could not make a smaller program give me this
behaviour... perhaps this should be copied to hugs bugs as well?

Since the programs don't use any fancy type classes etc., one would
hope that either both compilers would accept or reject the program.

Jon
--

-- parser produced by Happy Version 1.5


data HappyAbsSyn t1 t2 t3
= HappyTerminal Token
| HappyErrorToken Int
| HappyAbsSyn1 t1
| HappyAbsSyn2 t2
| HappyAbsSyn3 t3

action_0 (6) = happyShift action_3 <---*
action_0 (1) = happyGoto action_1
action_0 (2) = happyGoto action_2
action_0 _ = happyFail

action_1 (7) = happyAccept
action_1 _ = happyFail

action_2 _ = happyReduce_1

action_3 (5) = happyShift action_4
action_3 _ = happyFail

action_4 (4) = happyShift action_6
action_4 (3) = happyGoto action_5
action_4 _ = happyFail

action_5 _ = happyReduce_2

action_6 _ = happyReduce_3

happyReduce_1 = happySpecReduce_1 1 reduction where { -- ##
  reduction
(HappyAbsSyn2  happy_var_1)
 =  HappyAbsSyn1
 (\p -> let q = map (\(x,y) -> (x,y p)) happy_var_1 in  (10.1))
;
  reduction _  = notHappyAtAll }

happyReduce_2 = happySpecReduce_3 2 reduction where {
  reduction
(HappyAbsSyn3  happy_var_3)
_
(HappyTerminal (TokenVar happy_var_1))
 =  HappyAbsSyn2
 ([(happy_var_1,happy_var_3)]);
  reduction _ _ _  = notHappyAtAll }

happyReduce_3 = happySpecReduce_1 3 reduction where {
  reduction
(HappyTerminal (TokenInt happy_var_1))
 =  HappyAbsSyn3
 (\p -> happy_var_1);
  reduction _  = notHappyAtAll }

happyNewToken action sts stk [] =
action 7 7 (error "reading EOF!") (HappyState action) sts stk []

happyNewToken action sts stk (tk:tks) =
let cont i = action i i tk (HappyState action) sts stk tks in
case tk of {
TokenInt happy_dollar_dollar -> cont 4;
TokenEq -> cont 5;
TokenVar happy_dollar_dollar -> cont 6;
}

happyThen = \m k -> k m
happyReturn = \a tks -> a
myparser = happyParse



happyError :: [Token] -> a
happyError _ = error "Parse error\n"

-- Here are our tokens
data Token  = 
  TokenInt Int
| TokenVar String
| TokenEq
deriving Show

main = print (myparser [] [])
-- $Id: HappyTemplate,v 1.8 1997/12/04 15:07:21 simonm Exp $

{-
The stack is in the following order throughout the parse:

i   current token number
j   another copy of this to avoid messing with the stack
tk  current token semantic value
st  current state
sts state stack
stk semantic stack
-}

--

happyParse = happyNewToken action_0 [] []

-- All this HappyState stuff is simply because we can't have recursive
-- 

Re: Binary files in Haskell

1998-02-23 Thread Simon L Peyton Jones


 I would like to use Haskell for several larger scale projects, but I
 can't figure out how to read and write binary data.  It does not appear
 that the language supports binary files.  Am I missing something?

Colin Runciman and his Merrie Men are working on writing
Haskell values into binary files with the additional feature
that the files are compressed.  There was a paper in the '97 Haskell
workshop and I know he's been working on it since; but I don't
think a "product" has emerged for general use.  But with encouragement
perhaps it will!

Simon





Re: Confusing error message

1998-02-17 Thread Simon L Peyton Jones

 
 I encountered a confusing error message, which you can
 reproduce with 
 
   type P a = Maybe a
 
   instance Monad P where
 (>>=)  = error "foo"
 return = error "bar"
 
 I get 
 
   bug.hs:5: `P' should have 1 argument, but has been given 0 .

Would it be better if it said 

Type synonym constructor P should have 1 argument,
but has been given 0

Haskell requires that type synonyms are never partially applied;
that's what's being complained about here.

If you did fully apply it, GHC 3.1 (without -fglasgow-exts) would
then complain about making an instance of a type synonym.
At the moment, though, it trips over the mal-formed type expression first.
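
The usual workaround is to use a newtype rather than a synonym, since a
newtype constructor may appear in an instance head; a minimal sketch in
Haskell 1.4 style (newer GHCs would also want Functor and Applicative
instances):

    newtype P a = P (Maybe a)

    instance Monad P where
      return           = P . Just
      P Nothing  >>= _ = P Nothing
      P (Just x) >>= k = k x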

Does that make sense?  Any suggestions for improving the error
message in a way that would have made sense to you at the time?

Simon





Re: ghc-3.00-linux bug in linking

1998-02-11 Thread Simon L Peyton Jones


Sergey, Marc

I've had a look at your Docon thing.  

It's very impressive.  But I'm amazed it compiles at all!
When I fixed the makefile it showed up *all sorts* of loops.
For example, IParse_ imports DPrelude, and DPrelude imports IParse_

Getting these mutually recursive modules to work reliably involves
breaking the loop with {-# SOURCE #-} imports, and writing an .hi-boot
file by hand to break the loop.

Given this I'm not at all surprised that you get strange behaviour.

The .hi-boot stuff is not well documented. This might be a good opportunity
to document it better -- but meanwhile I suggest you have a look at
your modules to see whether you can reduce the number of loops.  Can you do 
that?

Simon

solander:~/docon  gmake export/DPrelude.o
gmake export/DPrelude.o
gmake: Circular export/DPrelude.hi <- export/DPrelude.o dependency dropped.
gmake: Circular export/SetGroup.o <- export/Common_.hi dependency dropped.
gmake: Circular export/SetGroup.o <- export/Common__.hi dependency dropped.
gmake: Circular export/Semigr_.o <- export/Common_.hi dependency dropped.
gmake: Circular export/Semigr_.o <- export/SetGroup.hi dependency dropped.
gmake: Circular export/Semigr_.o <- export/Common__.hi dependency dropped.
gmake: Circular export/SetGroup.o <- export/Group_.hi dependency dropped.
gmake: Circular export/Common__.o <- export/Group_.hi dependency dropped.
gmake: Circular export/Ring_.o <- export/Common__.hi dependency dropped.
gmake: Circular export/Ring_.o <- export/RingModule.hi dependency dropped.
gmake: Circular export/Module_.o <- export/RingModule.hi dependency dropped.
gmake: Circular export/VecMatr.o <- export/Common_.hi dependency dropped.
gmake: Circular export/VecMatr.o <- export/RingModule.hi dependency dropped.
gmake: Circular export/VecMatr1.o <- export/VecMatr.hi dependency dropped.
gmake: Circular export/Char_.o <- export/Common_.hi dependency dropped.
gmake: Circular export/List_.o <- export/Common_.hi dependency dropped.
gmake: Circular export/Common1_.o <- export/RingModule.hi dependency dropped.
gmake: Circular export/VecMatr1.o <- export/RingModule.hi dependency dropped.
gmake: Circular export/Matr_.o <- export/RingModule.hi dependency dropped.
gmake: Circular export/Matr_.o <- export/VecMatr.hi dependency dropped.
gmake: Circular export/Matr1_.o <- export/RingModule.hi dependency dropped.
gmake: Circular export/Matr1_.o <- export/Matr_.hi dependency dropped.
gmake: Circular export/PP_.o <- export/Common__.hi dependency dropped.
gmake: Circular export/DInteger.o <- export/Common_.hi dependency dropped.
gmake: Circular export/DInteger.o <- export/Common__.hi dependency dropped.
gmake: Circular export/DInteger.o <- export/RingModule.hi dependency dropped.
gmake: Circular export/DInt_.o <- export/DInteger.hi dependency dropped.
gmake: Circular export/DInt_.o <- export/Common_.hi dependency dropped.
gmake: Circular export/DInt_.o <- export/RingModule.hi dependency dropped.
gmake: Circular export/DInt_.o <- export/Common__.hi dependency dropped.
gmake: Circular export/Int_.o <- export/DInteger.hi dependency dropped.
gmake: Circular export/Int_.o <- export/Common__.hi dependency dropped.
gmake: Circular export/Int_.o <- export/Group_.hi dependency dropped.
gmake: Circular export/Int_.o <- export/RingModule.hi dependency dropped.
gmake: Circular export/SR_.o <- export/Common_.hi dependency dropped.
gmake: Circular export/SR_.o <- export/RingModule.hi dependency dropped.
gmake: Circular export/ResEuc_.o <- export/RingModule.hi dependency dropped.
gmake: Circular export/ResEuc_.o <- export/SR_.hi dependency dropped.
gmake: Circular export/ResEuc1_.o <- export/RingModule.hi dependency dropped.
gmake: Circular export/ResEuc1_.o <- export/ResEuc_.hi dependency dropped.
gmake: Circular export/DInt_.o <- export/Common__.hi dependency dropped.
gmake: Circular export/DInteger.o <- export/Common__.hi dependency dropped.
gmake: Circular export/PP_.o <- export/RingModule.hi dependency dropped.
gmake: Circular export/Det_.o <- export/LinAlg.hi dependency dropped.
gmake: Circular export/Todiag_.o <- export/LinAlg.hi dependency dropped.
gmake: Circular export/Solvelin_.o <- export/LinAlg.hi dependency dropped.
solander:~/docon  





Re: GHC 3.00 Error when introducing an ErrorMonad

1998-02-06 Thread Simon L Peyton Jones


 While experimenting with multiple parameter type classes, I
 introduced an ErrorMonad. Compiling this with GHC 3.00, the 
 following happened:
 
 Computation.hs:16: Warning:
 `ErrorMonad' mentioned twice in export list

I'll look into this.

 panic! (the `impossible' happened):
   zonkTcType

Now this *really* shouldn't happen.  It turned out to be a bug
in the type-error recovery code.

Thanks for the fine report.

S




Re: bug report 3.00

1998-02-06 Thread Simon L Peyton Jones


  panic! (the `impossible' happened):
  fun_result_ty: 6 GHC.Int#{-3e-}
   -> GHC.Int#{-3e-}
   -> b_trKC
   -> PolyParse.HappyState{-rq9-} b_trKC c_trKD
   -> [PolyParse.HappyState{-rq9-} b_trKC c_trKD]
   -> c_trKD
  
 
 Thanks for the report, this looks suspiciously similar to the panic
 reported by Sven a couple of days ago (I attach Simon's reply to it.)
 
 Bottom line: try compiling the module with -fno-update-analysis and
 see if that side steps it.

Not so.  It was this problem:

  There are two problems.  One is a long-standing bit of grubbiness
  in the code generator; hence fun_result_ty panic.  I've fixed that
  (still grubbily, I fear).


I didn't produce a patch because I fixed it by modifying a number
of files in a tidy-up effort.   Yell if this is a show-stopper for you
and we'll accelerate a fixed 3.0

S




Re: Another panic

1998-02-06 Thread Simon L Peyton Jones


 The following works fine with GHC 3.00:
 
   class Variable v where
       updVar  :: v a -> (a -> IO (a,b)) -> IO b
 
   applyVar :: Variable v => v a -> (a -> a) -> IO a
   applyVar v f = updVar' v (\x -> let x' = f x in (x',x'))
 
 By changing the definition of applyVar to
 
   applyVar :: Variable v a => v a -> (a -> a) -> IO a
   applyVar v f = updVar' v (\x -> let x' = f x in (x',x'))
 
 where it appears as if the class Variable takes two type parameters, then
 GHC 3.00 panics with:
 
   panic! (the `impossible' happened):
   unifyKinds: length mis-match


Great bug report; easily fixed.  Thanks.  The fix will be in the
next (source) release.

S




Re: MPC Problem --- II

1998-02-06 Thread Simon L Peyton Jones


Nice point!  For the class decl

class (C a, D a) => E a where {...}

we used to generate superclass selectors:

scsel_E_C :: E a -> C a
scsel_E_D :: E a -> D a

But now there can be class decls like yours:

class (C a, C b) => E a b where {...}

and our naming convention bites the dust.  I've fixed this by
numbering them off instead:

C_sc1 :: E a b -> C a
C_sc2 :: E a b -> C b

This involves modifying several files, and then recompiling the whole
prelude, so it's not easy to send you a patch.  Nor can I see an
easy workaround.

Again, we'll release a bug-fixed 3.0 shortly.  Let us know how
urgent it is for you. If you don't yell it'll probably be a week or two.
It could be quicker if you yell.

Thanks for the report.

Simon


 The following class causes the generated C Code to cause an error:
 
   class (OMSObjectC o,OMSObjectC o',OMSLinkC l,OMSLnkAttrC a,OMSAttrValC av) =>
 OMSLnkAttrAppC l o o' a av where
  lAGet  :: IO (a l o o') -> IO av
  lAGet' :: (a l o o')    -> IO av
  lASet  :: IO (a l o o') -> av -> IO ()
  lASet' :: (a l o o')    -> av -> IO ()
 
 The error produced is:

   /tmp/ghc27148.hc:679: redefinition of 
`OMSAttributes_scselZuOMSLnkAttrAppCOMSBaseOMSObjectC_info'
   /tmp/ghc27148.hc:643: `OMSAttributes_scselZuOMSLnkAttrAppCOMSBaseOMSObjectC_info' 
previously defined






Re: MPC Feature/Error ?

1998-02-05 Thread Simon L Peyton Jones


  My first attempt to define a MPC and some instances failed partially.
  I am not sure, if it is a bug or a feature. The class OMSObjAttrAppC
  should restrict the possible combinations of objects and attributes,
  so that attributes can only be read from objects to which they
  belong. Every Object which is an instance of Class OMSIdObjectC has a
  certain Attribute. So every instance of that class should be in the
  combinations that are permitted to use oAGet e.g.
  
  The MPC is:
  
 class (OMSObjectC o,OMSObjAttrC a,OMSAttrValC av) =>
    OMSObjAttrAppC o a av where
   oAGet  :: IO (a o) -> IO av
   oAGet' :: (a o)    -> IO av
   oASet  :: IO (a o) -> av -> IO ()
   oASet' :: (a o)    -> av -> IO ()

Definite bug.  Thanks.

You can fix it thus.  In TcInstDecls, around line 683
change one line (the last line of the following snippet:

-- DERIVING CHECK
-- It is obviously illegal to have an explicit instance
-- for something that we are also planning to `derive'
  | maybeToBool tyconapp_maybe && clas `elem` (tyConDerivings inst_tycon)



Simon




Re: bug report

1998-02-05 Thread Simon L Peyton Jones


Mark says:

  data Blah = Blah
  type Tuple = (Blah,Int)
 
  instance Show Tuple where
showsPrec _ _ _
  = error []
 
 No instance for: `Show Blah'
 arising from use of `PrelBase.$mshowList', at tmp.lhs:8
 
 I know that instances of classes shouldn't be types, but that's
 what was so neat about ghc-2.** : they allowed types here.

3.0 is more consistent here.  Suppose I write

show (Blah,3)

should that show as a Tuple or as a (Blah,Int) pair?
What if it was (show ((Blah,3)::Tuple))?  Etc.  

Essentially, resolving overloading is incoherent if you 
allow overlapping instance decls.  

 So are types not longer allowed in instance declarations?

Yes they're allowed, but it's just as if you'd written the 
expanded type.  Any two instance decls that don't overlap are
allowed.  You can write

instance C (Blah,Int) where ..
instance C (Int,Int)  where ..
instance C (Blah, Bool) where ...

since none of these overlap.  But Show does have an instance for
(a,b) so you are stuck.  By "overlap" I mean that the instance
types can be unified.

Einar says:

 With the good old 2.something compiler, I could overwrite
 the default definition of Show for lists and other type constructors, e.g.:
 
 data Blah = Blah deriving (Read,Show)
 
 instance Show [Blah] where
   showsPrec d [] r =  r
   showsPrec d _ r  = "bla bla ..." ++ r
 
 Duplicate or overlapping instance declarations
   for `Show [Blah]'
   at PrelBase.mc_hi and Blah.hs

Same issue.  Show [a] exists already and overlaps with Show [Blah].

There is a full discussion of the bad consequences of overlapping
instance decls in the multi-parameter type class paper
http://www.dcs.gla.ac.uk/~simonpj/multi.ps.gz

3.0 is a bit more restrictive than 2.xx, but it is Righter I think.
Dissenting opinions welcome.

Simon





Re: MPC dunce question.

1998-02-04 Thread Simon L Peyton Jones

 
 While hacking around with MPCs, trying to define a variant of the
 Collection class, mutated to suit my own fiendish ends, I ran into
 this:
 
 Intervals.hs:345:
 Class type variable `e' does not appear in method signature
 union2 :: s -> s -> s
 
 What's the significance of this restriction?  (I presume this is caused
 by a requirement that all class vars appear in all methods.)  I don't
 recall it being mentioned in the proposal for MPCs, but I may have
 overlooked and/or forgotten it.  I don't suppose it ever really "bites"
 anyone, since one can always shove the appropriate constraint on the
 missing variable into the other methods, that do use it, but I don't
 instantly see why that's The Way to do it.

Not a dunce question.

I think the only reason for the original restriction in Haskell
(with one param) is this:

class C a where
  op :: Int -> Int

Here op :: C a => Int -> Int, and hence *any* use of op is bound
to be ambiguous... which instance of C should we choose?  Hence
the rule. 

In the multi-paramter case things are less clear:

class C a b where
   op :: a -> a

Here, op :: C a b => a -> a.  So suppose we use op on an Int argument.
Then we need to find an instance for (C Int b).  If there is an 
instance

instance C Int b where
   ...

(and of course no overlapping instances) then we are home.
Is that your situation here?  Maybe it is.  But more likely
you have

instance C Int Int where
   ...

and now you're stuck, even if that's the only instance of (C Int t),
for any t.  Why stuck?  Because we ruled out "improvement" which
would instantiate b to Int in this case, on the grounds that 
it's the only solution.

So, which is your situation?

Simon





Re: No field labels?

1998-02-04 Thread Simon L Peyton Jones

 Is there any reason for not allowing:
 
  data Test = Test {}
 
 in Haskell? 

I can't think of one.  Maybe Std Haskell should allow it.
I'll put it on the Std-Haskell board.

Simon





Re: Query on multi-param type classes

1998-01-30 Thread Simon L Peyton Jones


 I decided to try and get my old multi-param. parser to work,
 and got told-off by Haskell's parser:
 
 Please tell me what I am doing wrong.  The following program:
 
  module A where
   
  class (Monad m, Monad (t m)) => AMonadT t m where
lift :: m a -> t m a
 
 Gives me:
 
 (lambda o) ghc -fglasgow-exts A.hs
  A.hs:3:23: parse error on input: "("

I should have said that I've implemented the choices given
on the Standard Haskell web discussion page.  In particular:

===
Choice 7a
~

The context in a class declaration (which introduces superclasses)
must constrain only type variables.  For example, this is legal:

class (Foo a b, Baz b b) => Flob a b where ...
but not
class (Foo [a] b, Baz (a,b) b) => Flob a b where ...

It might be possible to relax this restriction (which is the same
as in current Haskell) without losing decideability, but we're not
sure.  Choice 7a is conservative, and we don't know of any examples
that motivate relaxing the restriction.
===

I'm frankly unsure of the consequences of lifting the 
restriction.  Can you give a compact summary of why you want
to?  Our multi-parameter type-class paper gives none, and if
you've got one I'd like to add it.

In the short term, you're stuck.  Damn!  First customer too!

Simon





Re: Strange module exportation behavior

1998-01-29 Thread Simon L Peyton Jones


Conal: great bug report; thanks.  Meanwhile a workaround is
to use qualified names in the export list for Test2:

module Test2( Test1.foo, module Test2 ) where
  import Test1 hiding (main)
  main = ...

Inconvenient, but it should get you rolling.  

Simon, Sigbjorn: I've fixed this and checked in the changes (in rename/..).
Conal will need a new build in due course.

Simon

 I'm getting strange behavior from both GHC and Hugs w.r.t. module
 exportations.  They disagree with each other somewhat and both seem wrong,
 although I'm not certain I understand the report on this matter.
 
 Here are two test programs.  First Test1.hs:
 
 module Test1 (module Test1) where
 main = putStrLn "Test1's main"
 foo = "Test1 foo"
 
 In Test2.hs, I want to modify and extend Test1, keeping "foo" but
 replacing main.
 
 module Test2 (module Test1, module Test2) where
 import Test1 hiding (main)
 main = putStrLn "Test2's main"
 bar = foo ++ " plus Test2 bar"





Re: Fun with 3.00

1998-01-29 Thread Simon L Peyton Jones

 One can play funny games with GHC-3.00 and the following program
 (a small fragment of a Happy-generated parser):
 
 --
 module Foo ( happyParse ) where
 
 action_0 1 = \j tk _ -> action_1 j j tk (HappyState action_1)
 
 action_1 3 = error "Bar"
 action_1 _ = \i tk st@(HappyState action) sts stk -> action (-1) (-1) tk st sts (Just i : stk)
 
 happyParse = action_0 2 2 '-' (HappyState action_0) [] [] 2
 
 newtype HappyState b c =
    HappyState (Int -> Int -> b -> HappyState b c -> [HappyState b c] -> c)
 --

Great program!  Thanks for isolating it.

Simon: pls add to regression suite

There are two problems.  One is a long-standing bit of grubbiness
in the code generator; hence fun_result_ty panic.  I've fixed that
(still grubbily, I fear).

GHC goes into a loop in the update analyser.  Reason: the 
recursive contravariance of HappyState.  Consider:

action_1 j j tk (HappyState action_1) sts stk
= {unfold action_1}
action_1 (-1) (-1) tk (HappyState action_1) sts (Just j:stk)
= {unfold action_1 again}
... 

Neither action_0 nor action_1 is recursive, but infinite unfolding
can still occur.  This can cause the simplifier to loop, though
on this occasion it doesn't, but only because action_1 is
considered too big to unfold.  But it does make the update analyser
loop, for some obscure reason.  It wouldn't surprise me if the
strictness analyser looped too, but it doesn't.

For some reason there's no flag to switch off the update analyser.
It does very little good anyway, so just switch it off by force
in ghc/driver/ghc.lprl (look for -fupdate-anal).

I've known about the possibility of looping in the simplifier for some time, but
never seen it in a real program.  I have no idea how to spot it in a clean way,
and without disabling lots of useful inlining.  (I prevent looping mainly by
treating letrec carefully.)  Ideas welcome.

Simon





Re: Pattern-matching strings.

1998-01-28 Thread Simon L Peyton Jones

 
 Is pattern-matching short strings (one or two characters) likely to be
 _vastly_ less efficient than matching against a single level of
 constructor?  (Order of magnitude, plus.)  Trying to make sense of some
 profiling numbers, here...

I believe it is.  Currently I think we call the overloaded
list-equality function, passing it the Eq dictionary for Char.
This could be improved in several ways -- specialisation and
(more directly) spotting short strings, dealing with common prefixes,
and calling a specialised eqString fn.
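
In the meantime, one workaround is to match on the characters rather than on
a string literal; a minimal sketch (isArrow is a made-up example):

    isArrow :: String -> Bool
    isArrow ('-':'>':[]) = True      -- one level of constructors at a time
    isArrow _            = False

    -- versus the string-literal pattern, which goes via list equality:
    isArrow' :: String -> Bool
    isArrow' "->" = True
    isArrow' _    = False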

Speak up, people, if you want to push this up the priority queue.

Simon




Re: building ghc on new platform

1998-01-27 Thread Simon L Peyton Jones


Richard

 Is it true that one must have a working version
 of ghc on a new platform in order to port it to that platform
 or is there a "starters-kit" with which one can
 start such a port?

You can port by starting from the ".hc" files; that is, files
that have been compiled to C but not to machine code.
I don't think we've put up the .hc files in the releases for
a while, perhaps because there is a ghc available for most
platforms now.  Are you working on a new one?

Simon




Re: The impossible, again...

1998-01-27 Thread Simon L Peyton Jones


 panic! (the `impossible' happened):
 lookupBindC:no info!
  for: showsPrec_a7AK

Alex tickled a genuine, long-standing bug in the simplifier. Congratulations!
Here's the patch, to simplCore/Simplify.lhs

Simon

==


diff -c /users/fp/simonpj/fptools-multi-param/ghc/compiler/simplCore/Simplify.l
hs.\~5\~ /users/fp/simonpj/fptools-multi-param/ghc/compiler/simplCore/Simplify.
lhs
.cshrc (no TERM)
Non-interactive shell
*** /users/fp/simonpj/fptools-multi-param/ghc/compiler/simplCore/Simplify.lhs.~
5~  Fri Dec 19 22:34:15 1997
--- /users/fp/simonpj/fptools-multi-param/ghc/compiler/simplCore/Simplify.lhs   
Tue Jan 27 14:27:44 1998
***
*** 947,953 
  -- Try let-from-let
  simpl_bind env (Let bind rhs) | let_floating_ok
= tick LetFloatFromLet`thenSmpl_`
!   simplBind env (fix_up_demandedness will_be_demanded bind)
  (\env - simpl_bind env rhs) body_ty
  
  -- Try case-from-let; this deals with a strict let of error too
--- 947,954 
  -- Try let-from-let
  simpl_bind env (Let bind rhs) | let_floating_ok
= tick LetFloatFromLet`thenSmpl_`
!   simplBind env (if will_be_demanded then bind 
!  else un_demandify_bind bind)
  (\env - simpl_bind env rhs) body_ty
  
  -- Try case-from-let; this deals with a strict let of error too
***
*** 1276,1282 
  returnSmpl binds'
  
where
! (binds', _, n_extras) = fltBind bind  
  
  float_lets  = switchIsSet env SimplFloatLetsExposingWHNF
  always_float_let_from_let = switchIsSet env SimplAlwaysFloatLetsFromLets
--- 1277,1284 
  returnSmpl binds'
  
where
! binds'   = fltBind bind
! n_extras = sum (map no_of_binds binds') - no_of_binds bind 
  
  float_lets  = switchIsSet env SimplFloatLetsExposingWHNF
  always_float_let_from_let = switchIsSet env SimplAlwaysFloatLetsFromLets
***
*** 1284,1310 
-- fltBind guarantees not to return leaky floats
-- and all the binders of the floats have had their demand-info zapped
  fltBind (NonRec bndr rhs)
!   = (binds ++ [NonRec (un_demandify bndr) rhs'], 
!leakFree bndr rhs', 
!length binds)
where
  (binds, rhs') = fltRhs rhs
  
  fltBind (Rec pairs)
!   = ([Rec (extras
!  ++
!  binders `zip` rhss')],
!  and (zipWith leakFree binders rhss'),
!length extras
! )
! 
where
! (binders, rhss)  = unzip pairs
! (binds_s, rhss') = mapAndUnzip fltRhs rhss
!   extras   = concat (map get_pairs (concat binds_s))
  
! get_pairs (NonRec bndr rhs) = [(bndr,rhs)]
! get_pairs (Rec pairs)   = pairs
  
-- fltRhs has same invariant as fltBind
  fltRhs rhs
--- 1286,1307 
-- fltBind guarantees not to return leaky floats
-- and all the binders of the floats have had their demand-info zapped
  fltBind (NonRec bndr rhs)
!   = binds ++ [NonRec bndr rhs'] 
where
  (binds, rhs') = fltRhs rhs
  
  fltBind (Rec pairs)
!   = [Rec pairs']
where
! pairs' = concat [ let
!   (binds, rhs') = fltRhs rhs
! in
! foldr get_pairs [(bndr, rhs')] binds
!   | (bndr, rhs) <- pairs
!   ]
  
! get_pairs (NonRec bndr rhs) rest = (bndr,rhs) :  rest
! get_pairs (Rec pairs)   rest = pairs  ++ rest
  
-- fltRhs has same invariant as fltBind
  fltRhs rhs
***
*** 1322,1333 
  -- fltExpr guarantees not to return leaky floats
= (binds' ++ body_binds, body')
where
! (body_binds, body')= fltExpr body
! (binds', binds_wont_leak, _) = fltBind bind
  
  fltExpr expr = ([], expr)
  
  -- Crude but effective
  leakFree (id,_) rhs = case getIdArity id of
ArityAtLeast n | n > 0 -> True
ArityExactly n | n > 0 -> True
--- 1319,1337 
  -- fltExpr guarantees not to return leaky floats
= (binds' ++ body_binds, body')
where
! binds_wont_leak = all leakFreeBind binds'
! (body_binds, body') = fltExpr body
! binds'= fltBind (un_demandify_bind bind)
  
  fltExpr expr = ([], expr)
  
  -- Crude but effective
+ no_of_binds (NonRec _ _) = 1
+ no_of_binds (Rec pairs)  = length pairs
+ 
+ leakFreeBind (NonRec bndr rhs) = leakFree bndr rhs
+ leakFreeBind (Rec pairs)   = and [leakFree bndr rhs | (bndr, rhs) <- pairs]
+ 
  leakFree (id,_) rhs = case getIdArity id of
ArityAtLeast n | n > 0 -> True

Re: The impossible, again...

1998-01-26 Thread Simon L Peyton Jones

 
 Hi everybody-peeps.  I stumbled across the following panic, which occurs
 when I compile the given module with either -O2 or -O.  If it proves
 necessary I'll try to produce a small instance, ship the whole lot off
 to Glasgow, or otherwise poke around in search of illumination.

Try with -dcore-lint which will probably nail the problem
much more precisely.

A small instance would be appreciated!

Simon




Re: Profiling, again.

1998-01-26 Thread Simon L Peyton Jones

 
 Another non-killing, but rather annoying error message: this one is
 provoked by duplicate _scc_ labels.  I don't see the sense in this
 restriction, myself, but at any rate this wouldn't seem to be the
 handiest way of detecting same...
 
 Cheers,
 Alex.


Thanks -- this is another of the "we'll fix in the new profiler" kind.

Simon




Re: Profiling again.

1998-01-26 Thread Simon L Peyton Jones

 
 Doing a time-profiling of my current hackery tells me the worst offender
 in my program is:
 
   Intervals/$d11
 
 This raises and question or two in my mind...
 
 This represents the encoding of a dictionary (or a single method?)
 for some class with an instance declared in the given module, right?

Right.  It *is* curious, though.  You can find out what instance is involved
by looking at Intervals.hi; the instance decls in interface files say
what "$dn" they are attached to.

What's puzzling me is that not much time should accrue to dictionaries.
precisely because they are of no help to programmers.

Unless perhaps its the code *inside* the instance declaration... that
might get the dictionary cost centre... Hmm  Keep us posted.

Simon




Re: Simple usage of GHC

1998-01-12 Thread Simon L Peyton Jones


 year ago I had the pleasure of using a compiler (for BETA I think)
 where the basic usage for a novice was Just What You Wanted:
 
   % compiler Main.source
 
 From there it figured out which other modules it needed, which
 required recompiling, which object files and libraries were needed
 for linking etc.  I believe that Objective Caml also has this,
 although you have to give an option (sigh).
 
 Why can't GHC (and all the others for that matter) be that simple?
 Hugs seems to do the right thing.

Good idea.  I'll put it on our wish-list.  The more people that
say "yes please" the more likely it is to get done!

Simon




Re: Mysterious 2.09 message.

1998-01-09 Thread Simon L Peyton Jones


 Alex Ferguson writes:
   
  Struct.hs:1: Failed to find interface decl for `Maybe'
  
  Compilation had errors
  make: *** [Struct.o] Error 1
  
  
  I'm guessing it has to do with out of date interface files (from ghc-2.07,
  to be exact), but it's not the most helpful way of finding out about this.

Just so you know that your cries are heeded, I've managed to add a much better
error location to messages of this sort.  It'll be in the upcoming 3.0 (which
has multi-parameter type classes among other things.)

simon




Re: Ambiguous Type Error

1998-01-05 Thread Simon L Peyton Jones

 I have enclosed below a test file that causes an error that puzzles
 me.  Both GHC and Hugs kick it out, so at least they agree; however, I
 must admit that I don't understand it.

Yes, it is a bit confusing, and it took me a few minutes
to see what is going on.

Here's your problem:

 data (Physical indep, Physical dep) => BasicSignal indep dep = 
 Pulse {start_time::indep,
pulse_width::indep,
amplitude::dep}

 pulse :: (Physical a, Physical b) => BasicSignal a b
 pulse = Pulse{start_time = toPhysical 0.0}
 
 example2:: BasicSignal Time Voltage
 example2 = pulse {start_time = (Sec 1.0),
   pulse_width = (Sec 3.0),
   amplitude = (V 2.0) }

The RHS of example2 is (by definition in the Report) equivalent to

  example2:: BasicSignal Time Voltage
  example2 = case pulse of
Pulse _ _ _ -> Pulse (Sec 1.0) (Sec 3.0) (V 2.0)

The trouble is that "pulse" is overloaded, and the compiler
has no clue which overloading to use.  The fact that example2 has
a type signature doesn't help it, because none of the components
of "pulse" make it through to the output.  So it's ambiguous.

The solution is to say:

   example2 = (pulse :: BasicSignal Time Voltage) {
start_time = Sec 1.0,
pulse_width = Sec 3.0,
amplitude = V 2.0
}

Now the type signature on "pulse" tells it what overloading to use.

You may think it's silly that "pulse" needs to know what overloading
to use, even though the fields concerned (start_time in this case)
are immediately overwritten... but there's nothing to stop "pulse" 
returning two quite different records based on operations from the
Physical class, so you're asking the type system to do something it
just can't.

Incidentally, the context on your data type declaration adds nothing 
useful.  (We're thinking of nuking them in Std Haskell.)

Simon






Re: panic! (the `impossible' happened) (2.09, patchlevel 0)

1997-12-19 Thread Simon L Peyton Jones

 {- 
Hi,

Compiling the following module results in the following error message
(with GHC 2.09, patchlevel 0, i386-linux
  
   
 --
panic! (the `impossible' happened):
 getWorkerIdAndCons area2{-r3g,x-}{i}


Thanks.  You've tickled a small but dark corner of the compiler.
If you have a source release you can replace the defn of getWorkerIdAndCons
in stranal/WorkWrap.lhs with the defn below.  Otherwise wait for 2.10.

Simon


getWorkerIdAndCons wrap_id wrapper_fn
  = go wrapper_fn
  where
    go (Lam _ body) = go body
    go (Case _ (AlgAlts [(con,_,rhs)] _))
      = let (wrap_id, cons) = go rhs
        in  (wrap_id, cons `addOneToIdSet` con)
    go (Let (NonRec _ (Coerce (CoerceOut con) _ _)) body)
      = let (wrap_id, cons) = go body
        in  (wrap_id, cons `addOneToIdSet` con)
    go other = (get_work_id other, emptyIdSet)

    get_work_id (App fn _)    = get_work_id fn
    get_work_id (Var work_id) = work_id
    get_work_id other         = pprPanic "getWorkerIdAndCons" (ppr wrap_id)





Xmas fun

1997-12-19 Thread Simon L Peyton Jones


Folks,

I thought you might find the following bug I've just
found in GHC entertaining.

In the strictness analyser we need to compare abstract values
so that the fixpoint finder knows when to stop.  In the
middle of this code was the following:

sameVal :: AbsVal - AbsVal - Bool

...equations for sameVal...

sameVal (AbsApproxFun str1 v1) (AbsApproxFun str2 v2) 
= str1 == str2 && sameVal v1 v1

...more equations...

That "sameVal v1 v1" should of course be "sameVal v1 v2".

Because sameVal reports equality more often than it should,
the fixpoint finder stops sooner than it should, so the strictness
analyser finds that things are strict when they aren't.  But
this only occurs in slightly obscure situations... and somehow
this bug has gone un-noticed for I dread to think how long!

It showed up in a nofib test -- sing ho for regression tests.

Happy xmas

Simon






Re: panic! (the `impossible' happened) in ghc-2.09

1997-12-04 Thread Simon L Peyton Jones

 ran into this bug when compiling with the optimization flags:
 -H10M -O -fvia-C -O2-for-C
 
 "
 NOTE: Simplifier still going after 4 iterations; bailing out.
 
 NOTE: Simplifier still going after 4 iterations; bailing out.
 
 panic! (the `impossible' happened):
 getWorkerIdAndCons deQueueWhile{-r85,x-}{i}

Woo!  Indeed weird.  getWorkerIdAndCons is described in its
comments as "a rather crude function" and so it proves.

Since I don't have your sources, could you do a "-ddump-simpl" and
send us the results?

Simon




Re: Again: The impossible happened, this time in 2.08

1997-11-27 Thread Simon L Peyton Jones


Great bug report, thanks.  I've fixed it in the upcoming 2.09

S

 As a by-product of hacking Fudgets to death, a bug in ghc-2.08 showed up.
 Compiling the fragment from Fudgets below (the real names made as much
 sense to me as the ones below :-), ghc fails ungracefully:
 
 --
 module Foo where
 foo = bar
    where bar = \baz -> (baz + boing, truncate baz, truncate boing)
          boing = 0
 --





Re: Dinesh Vadhia: Haskell in the Real World ...

1997-11-21 Thread Simon L Peyton Jones


 Hi!  Before raising my questions please point me in the appropriate direction 
 if these questions have been asked before.  I have also just requested to join 
 the various Haskell mail lists.  My questions are really concerned with the 
 commercial viability of Haskell especially in the face of the Java juggernaut 
 and the development of quality software components.

Dinesh,

We're working hard on integrating GHC and Hugs with Microsoft's COM (Component
Object Model).  Perhaps CORBA too in due course, but COM is simpler.
That gives us a way to talk to Java (and vice versa).

There's a paper you may find interesting "Scripting COM components in Haskell",
at http://www.dcs.gla.ac.uk/~simonpj.

Erik Meijer and Daan Leijen are doing a lot of work on this too:
http://www.cse.ogi.edu/~erik/Personal/Default.html

As part of our planned GHC/Hugs integration we'll be doing a lot
more work on this component based programming stuff.


Do you have a direct interest in this?  An application? 

Simon




Re: `panic' at `Integer i' in ghc-2.08

1997-11-19 Thread Simon L Peyton Jones


This is definitely a bug.  Will be fixed!

Simon

 Compiling f :: Integer i => i 
   f =   0
  
 ghc-2.08  reports:  
 
   panic! (the `impossible' happened):
  tcLookupClass: PrelBase.Integer
 
   Please report it as a compiler bug to ...
  
 
 Probably, it has to say something more appropriate.
 
 
 --
 Sergey Mechveliani
 
 [EMAIL PROTECTED]





Re: Call for parsers

1997-11-13 Thread Simon L Peyton Jones


 So here is my call for contribution:
 
Send an abstract syntax and/or a parser specification!
 
 It doesn't matter if a parser generator is used or recursive descent
 techniques are applied.
 
 If there is enough echo, I'd like to setup a web page for this
 project, containing things to download, documentation, and giving a
 forum for discussions and applications of the parsers. Ideas range
 from simple formatting tools to library browsers ("I know there is a
 function with a similar type signature somewhere...") or computer
 aided source-to-source transformations.


Good for you!  

I believe that Happy is either very nearly capable of supporting
a full Haskell parser, or is so capable (but nobody has built one).
Simon Marlow can doubtless advise.

In any case, a Happy-based Haskell parser would be a wonderful
thing, along with (as you say) an abstract syntax.

One point: in GHC we are very careful to keep the full Haskell
syntax present in the abstract syntax tree.  So, 

x:[]

is represented differently from

[x]

in the algebraic data type representing a Haskell program.
That makes the data type quite a lot bigger, but it does mean
that you can print out exactly the program that was read in,
rather than an equivalent but bastardised version.


Keeping source-location info is also important.  
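
For concreteness, here's a hedged sketch of what that looks like (the
names are made up for illustration; GHC's real HsSyn is much richer):
both the explicit bracket syntax and the (:) constructor get their own
representation, and every node carries its source location.

    data SrcLoc = SrcLoc FilePath Int Int      -- file, line, column

    data HsExpr
      = HsVar  SrcLoc String                   -- a variable, e.g.  x
      | HsCon  SrcLoc String                   -- a constructor, e.g.  (:) or []
      | HsApp  SrcLoc HsExpr HsExpr            -- application: x:[] parses as
                                               --   (:) applied to x and []
      | HsExplicitList SrcLoc [HsExpr]         -- literal brackets, e.g.  [x]
      -- ... and many more constructors in a full abstract syntax

A pretty-printer over such a type can then reproduce exactly what the
programmer wrote.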

You might find GHC's HsSyn data types were a good start.  I don't
propose them as the Right Thing.

(And I definitely don't propose GHC's current parser.  Let's 
do it all in Haskell.)

Simon






Re: evil laziness in iteration

1997-11-05 Thread Simon L Peyton Jones

Sergey

Thanks for your various messages. I've explained your results below. 
You are right to say that
it's hard to be sure what optimisations will happen when; arguably
that's a bad shortcoming of functional programming (especially the lazy sort).
Profiling tools help a bit.
I think, though, that you have found some particularly bad cases.
I'd be very interested in other people's war stories.

It would be great to make a "performance debugging manual" for
GHC, but it's hard to find the time to do so.  Any volunteers?

Simon



You have clearly understood why

foldl1 max [1..n] 

takes lots of both heap and stack... first it builds up a huge
chain of suspended calls to max in the heap; and then when it finds
that it does need to evaluate them it takes a lot of stack to do so.

foldl_strict solves both problems by evaluating the accumulating
parameter as it goes:

foldl_strict k z [] = z
foldl_strict k z (x:xs) = let z1 = k z x
  in seq z1 (foldl_strict k z1 xs)

(I prefer "seq" to "strict", but its a matter of taste. Haskell provides
both.)  The seq forces z1 to be evaluated before the recursive call.

* Why isn't maximum in the Prelude defined using foldl_strict?  No
  reason: it should be.
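
  For concreteness, a minimal sketch of how that could look (maximum' is
  a made-up name; it just reuses the foldl_strict defined above):

    maximum' :: Ord a => [a] -> a
    maximum' []     = error "maximum': empty list"
    maximum' (x:xs) = foldl_strict max x xs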

* Why do sum2 and sum3 behave differently?

sum2 xs = sum' 0 xs
where
   sum' s [] = s
   sum' s (x:xs) = sum' (s+x) xs

sum3 xs = foldl (+) 0 xs

Well, sum3 behaves badly for the same reason as before.  sum2 behaves
well because the compiler can see that sum' is strict in its first
argument; that is, it can see that sum' will eventually evaluate s.
That's what the strictness analyser does.  Having figured out that
the first argument of sum' is sure to be evaluated in the end, 
the compiler arranges to evaluate the first argument of sum' before
the call --- which gives us back something rather like foldl_strict,
and thus good performance.

Why can't the compiler spot that for sum3?  Because in general
foldl does not necessarily evaluate its second argument.

How can we get the good behaviour for sum3?  By defining foldl like
this:

{-# INLINE foldl #-}
foldl k z xs = f z xs 
  where
f z [] = z
f z (x:xs) = f (k z x) xs

Now sum3's right-hand side will turn into sum2's right-hand side
(after inlining foldl) and all will be fine.  (If you use -O.)
The only down side is that some calls to foldl won't benefit
from the inlining and for those cases the code size might go up
a bit.  We should probably make this change to the Prelude.


* Why did you get different behaviour for sum1/sum2/sum3 etc
when you split the program over two modules?  I think it's 
because you didn't put in any type signatures.   When separately
compiled, ghc generated overloaded functions, which weren't
even sure that "+" is strict.  When compiled as one, it could
see that there was only one call to sumx, inline it at the
call site, and thereby discover which "+" you meant.
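
A minimal sketch of the kind of signatures meant here (assuming the sums
are over Int): with monomorphic types the separately-compiled module sees
the strict Int (+) rather than an overloaded one, so the strictness
analyser can do its job.

    sum2 :: [Int] -> Int
    sum2 xs = sum' 0 xs
      where
        sum' :: Int -> [Int] -> Int
        sum' s []     = s
        sum' s (x:xs) = sum' (s+x) xs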

* Why does your call
foldStrict addR (R 0 b) ss
take lots of space?

foldlStrict  _ z [] =  z
foldlStrict  f z (x:xs) =  let  y = f z x  in 
  y `seq` (foldlStrict f y xs)


data  R = R Int Int

addR (R x b) (R y _) =  R (x+y) b 


Reason: R is a lazy constructor.  So the addR calls get done 
strictly (by foldStrict), but the (x+y) expressions don't get
evaluated because R is lazy.  You can fix this (as you did) by
declaring R to be strict, or by putting a seq in addR:

addR (R x b) (R y _) =  let z = x+y in
z `seq` R z b 

That, in fact, is what happens when you put an "!" in R's declaration.
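
For concreteness, a minimal sketch of that declaration (only the
accumulated field strictly needs the bang here; making both fields
strict would also be fine):

    data R = R !Int Int

    addR :: R -> R -> R
    addR (R x b) (R y _) = R (x+y) b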







Re: Building ghc-0.29 for on sunos4 and linux

1997-10-23 Thread Simon L Peyton Jones


What is happening is this.

- there's a SPECIALISE pragma involving data type Reg in FiniteMap
- SpecTyFuns imports FiniteMap
- Haskell 1.2 required closure on import, so Reg therefore had to
  be imported even though it's not used.  Haskell 1.4 doesn't have
  this silly restriction.

Solution: import AsmRegAlloc( Reg ) into SpecTyFuns (and maybe elsewhere)
or remove the SPECIALISE pragmas in FiniteMap (easier; and you won't lose
a lot of performance).

How this ever worked is beyond me!

Simon


 ghc-0.29 -c  -O -hi-diffs -link-chk -cpp -H12m  -fglasgow-exts -DCOMPILING_GHC   
   -fomit-derived-read -I. 
-iutils:basicTypes:uniType:abstractSyn:prelude:envs:rename:typecheck:deSugar:coreSyn:specialise:simplCore:stranal:stgSyn:simplStg:codeGen:nativeGen:absCSyn:main:reader:profiling:deforest:yaccParser
-DUSE_ATTACK_PRAGMAS -fshow-pragma-name-errs -fomit-reexported-instances 
-fshow-import-specs   -DUSE_NEW_READER=1 -DGRAN  -DOMIT_DEFORESTER=1   -o 
specialise/SpecTyFuns.o specialise/SpecTyFuns.lhs
 ld.so: warning: /usr/lib/libc.so.1.8 has older revision than expected 9
  
 "utils/FiniteMap.hi", line 14: undefined type constructor: Reg
  
 "utils/FiniteMap.hi", line 42: undefined type constructor: Reg
  
 "utils/FiniteMap.hi", line 52: undefined type constructor: Reg
 
 Compilation had errors
 make[1]: *** [specialise/SpecTyFuns.o] Error 1
 
 
 On the Sun I've managed to compile all of the runtime system but
 I have not got this far on my linux box. I've looked at the code
 and it looks like utils/FiniteMap.hs imports a file with constructor
 "Reg" defined, however this is never imported into 
 specialise/SpecTyFuns.lhs (and a host of other modules).
 
 Has anyone any idea what I need to do? Incidentally I think that this
 problem also occurs when I try to compile it without gransim enabled.
 
 Thanks,
 
 Nathan.
 





Re: Optimizing Haskell programs is hard

1997-10-17 Thread Simon L Peyton Jones


You are right to be "bugged" (see your last para).  Fortunately,
ghc 2.09 (i.e. our current working copy) gives identical runtimes for
all three. 

(Might be true of 2.08; I haven't tried.)

Simon

 I'm benchmarking MVar's and other shared memory abstractions, e.g. by
 accessing a variable a number of times:
 
 
 main :: IO ()
 main = do {
   v <- newMVar 0;
   access v 1000
 } where access v 0 = print "done"
 -- 1) access v n = do {swapMVar v n; access v (n-1)}
 -- 2) access v n = do {takeMVar v; putMVar v n; access v (n-1)}
 -- 3) access v n = do {setVar v n; access v (n-1)}
 
 I'm accessing the MVar's according to 3 different schemes: one using swapMVar
 (1), the other using takeMVar and putMVar (2) and the third by calling a method of 
class 
 
   class Variable var where
       setVar :: var a -> a -> IO ()
   
   instance Variable MVar where
       setVar mvar val = do {takeMVar mvar; putMVar mvar val}
 
 These are the results that I get when running the programs:
 
   1) 42 secs
   2) 16 secs
   3) 32 secs
 
 It's quite surprising to see how much time swapMVar costs: I actually
 expected it to be faster than a take followed by a put. I have now
 spent some time getting rid of swapMVar wherever performance matters.
 
 One thing bugs me though: what is causing the overhead when I call the
 class operation, and - how can I get around it! This is quite
 important to the efficency of some of my basic abstractions such as
 reentrant monitors and the like.





Re: -O in ghc-2.08-linux-i386

1997-10-17 Thread Simon L Peyton Jones


 Joining these modules into *one*   
 
module Main (main) where ...
 
 and compiling with  -O  gives  6.1 sec.
 This is the *only* situation, I found -O working.
 
 Thus, if we set then  `Main (main,test) where' 
 
 - just for curiosity - we return to 70 sec.


GHC 2.08 still hasn't got the specialiser back in service, and it 
makes a HUGE difference on this program.  When test isn't exported,
it gets inlined and specialised.  Same thing happens if you say INLINE test
and INLINE short, even if test is exported.  

Nice example.  Maybe we should get the specialiser up again!

Simon




Re: -O in ghc-2.08-linux

1997-10-16 Thread Simon L Peyton Jones


 A small program Main.hs compiled with -O runs 10 times faster or 
 slower depending on the export list
module Main (main)
 or module Main (main,test)
 Is this a bug?

I don't know without seeing the program. It's certainly extreme.
Can you send the code?

 Please, tell me, how to make  -O  work in  ghc-2.08-i386-all-linux ?
 I am trying  -O  in the Makefile  for some critical modules of a certain 
 large project, but cannot predict when this proves useful.

I'm not sure what you mean by "how to make -O work".  Just say "-O"
and your program should run faster.  How much faster does depend a lot
on the program.

 In  ghc-0.29  the performance for -O was predictable.

Aha! if you find a program where 0.29 makes a worthwhile improvement and
2.08 does not then please send it to us.  That way we can fix the holes
in 2.08.

 May it help to build ghc from sources (not so simple for 20Mbyte RAM)

I doubt it.

 is it worth setting things like  {-# inline 0 f #-}  or  
   {-# inline 1 f #-}  
 - or is  ghc -c -O   (2.08 or future) clever enough to guess 
 what to inline?

It's worth it if you want to be sure they'll be inlined.
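
For reference, here is the plain, un-numbered form of the pragma, on a
made-up function f:

    {-# INLINE f #-}
    f :: Int -> Int
    f x = x + 1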

Simon




Re: profiling

1997-10-14 Thread Simon L Peyton Jones


Marc

I strongly suspect that the names are simply truncated before they get
into a .hp file.  Doubtless this could be fixed.  

However, we're now embarking on building a new RTS, designed to support
both GHC and Hugs, so I'd rather just make sure that the new system doesn't 
truncate names.  (The new system will take a few months; don't hold your 
breath.)

Is this a show-stopper for you, or can you get by (perhaps by changing your
function names)?

Simon

 When I profile my sources (-auto-all) it happens
 a lot that I see names of the following form
 
  SourceFile:ModuleName/descripto
 
 instead of names of the form
 
  SourceFile:ModuleName/descriptor_prefix_TYPE_SUFFIX
 
 as I would have expected.
 
 Is there a way to get the full function-names out of
 the profiling package? I have looked at the .hp files,
 but the information does not seem to be available at
 that level.
 
 
 Any help would be greatly appreciated.
 
 Regards,
 
 
 Marc van Dongen
 [EMAIL PROTECTED]





Re: Importing Prelude

1997-10-14 Thread Simon L Peyton Jones

 
   The Prelude module is imported automatically into all modules as if
   by the statement `import Prelude', if and only if it is not imported
   with an explicit import declaration. This provision for explicit
   import allows values defined in the Prelude to be hidden from the
   unqualified name space.

I've always taken "if it [the Prelude] is not imported with
an explicit import decl" to mean "if the Prelude is not imported with an
explicit import decl [of any form]".  So 

  import qualified Prelude
  
  main :: IO ()
  main = print Prelude.True

is invalid.  The Prelude is imported with an explicit import decl.
The fact that it's a qualified import decl doesn't matter.  Certainly,
the Report doesn't restrict its words to unqualified import.
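
For concreteness, a minimal sketch of the sort of thing that provision is
for (the module name and the local map are made up for illustration):
hiding a Prelude name so that a local definition can use it unqualified.

    module MyList where

    import Prelude hiding ( map )

    -- our own map; no clash, because the Prelude's map is hidden
    map :: (a -> b) -> [a] -> [b]
    map f = foldr (\x ys -> f x : ys) []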

Simon






Re: Universal quantification

1997-10-13 Thread Simon L Peyton Jones


  ghc -c -fglasgow-exts Test.lhs
  
 Test.lhs:11: Context `{Ord taDr}'
  required by inferred type, but missing on a type signature
  `Ord taDr' arising from use of `q' at Test.lhs:11
 In a polymorphic function argument `q'
 In the first argument of `Empty', namely `q'
 In a pattern binding: `e2 = Empty q'

It's a bug all right.  Curiously, the compiler doesn't handle overloading in 
polymorphic arguments; but neither does it reject constructors with overloaded
argument types. (Code written at different times!)  


I'll fix this.  There's no decent workaround until I do, I'm afraid.

Thanks for a fine report.


Simon




Re: pattern match

1997-10-10 Thread Simon L Peyton Jones

 
 I am using GHC 2.06 for Win32 and why does GHC complain about "possibly 
 incomplete patterns"?

It does so because it uses a brain-dead and incorrect way to detect
incomplete patterns.  (I can say this because I did it myself.) 

As we speak Juan Quintela is doing a Better Job.  It'll appear soon.

Thanks for the report.  I'm keen to know about obscure or (as in this case)
incorrect error messages.

Simon




Re: Compiler bug pops up when compiling ghc-2.07 with 2.02

1997-10-09 Thread Simon L Peyton Jones

 
 I tried, but it seems to complete successfully. Nevertheless, later
 make all fails with the same message.
 Probably the dependencies generated by make boot (or make depend, for
 that matter) _are_ circular (citing from ghc/compiler/.depend):
 
 utils/FastString.o : basicTypes/Unique.hi
 
 basicTypes/Unique.o : utils/Pretty.hi
 
 utils/Pretty.o : utils/FastString.hi

My guess is that you have not got the line

Ghc2_0= YES

in your build.mk file.  GHC 2.x uses a different way of dealing with
mutual recursion (via M.hi-boot files) than GHC 0.29 (which used M.lhi files).

When you make that change to build.mk you'll need to do a make depend again.

Yes, this should be documented.

Remember, don't bother to use 2.06 or earlier for self-booting.  Only
2.07 is up to it.

Simon




Re: 100 MB of heap exhausted when compiling Happy

1997-09-22 Thread Simon L Peyton Jones

 
 When compiling Happy-1.3's Main.lhs with optimization turned on, 100 MB of
 heap space aren't enough for the compiler. It crashes after several hours 
 of compilation. When optimization is turned off, it finishes after some 
 minutes. I would consider this a bug.

I consider it a bug too!  We knew about it, but haven't spent the time
to nail it down yet.  It's in our gunsights though.

Simon





Re: profiling optimised code

1997-09-08 Thread Simon L Peyton Jones


 Main.o(.text+0x9c): undefined reference to `PrelBase_Z36g3J_fast3'

That's odd.  It's usually a sign that PrelBase.hi and PrelBase.o
weren't generated by the same run of GHC.

You could try doing "make clean; make depend; make" in your ghc/lib
directory.

If that doesn't work we'll need to investigate a little more!

 I am trying to determine to what extent ghc approximates full laziness

GHC implements full laziness more or less as described in our SPE paper 
SL Peyton Jones and D Lester, A modular fully-lazy lambda 
lifter in Haskell, Software Practice and Experience 21(5), 
May 1991, pp479-506. 

There's an online version attached to my publications page
(start at http://www.dcs.gla.ac.uk/~simonpj).   

There are measurements of its effectiveness in 

SL Peyton Jones, WD Partain, A Santos, Let-floating: 
moving bindings to give faster programs 
Proc International Conference on Functional Programming, 
Philadelphia (ICFP'96), May 1996

(online version in the same place)

 The program below compiles and links with -O -prof, but seg-faults
 during execution (it is fine with just -prof) :

Try the fix given on 
http://www.dcs.gla.ac.uk/fp/software/ghc/ghc-bugs.html#2.05-mangler


Simon




Re: building with ghc-2.03

1997-09-08 Thread Simon L Peyton Jones

 Hi !
 I am trying to build ghc-2.05 libraries with ghc-2.03 on RS6000 and
 get the following problem:

That's a very curious thing to do! You should only compile
the ghc-2.05 libraries with ghc-2.05.  The format of interface
files has changed often. The complaint you are getting is that
ghc-2.03 doesn't recognise PrelNum.hi as syntactically correct.

You didn't give enough information for me to understand 
the details of what's going wrong, but my advice would be this.
Build the 2.05 compiler with ghc-0.29.  Then build the libraries
with 2.05.  (The latter will happen automatically unless you
subvert the make system.)  2.03 is not a good compiler to use for
booting purposes.

I hope this helps.

Simon




Re: Another question about monads and linearity

1997-09-04 Thread Simon L Peyton Jones


 There are few formal connections between monads and
 single-threaded state... For any state-transformer monad... there
 is a trivial operation... that will instantly destroy any hope for
 single-threadedness: getState s = (s, s)
 
 In day-to-day Haskell 1.3 programming what is the solution to this
 problem? If a program is using a C interface to create a mutable
 state, say an external database, what is the simplest way of
 encapsulating the state so that this "trivial operation" cannot be
 defined except by the original author?

A monad is an abstract data type.  The implementation of an ADT is
hidden, so that the client of the ADT can't invalidate its invariants.

It's the same for state-transformer monads.  The programmer can't define
this trivial operation, because all s/he has is an ADT with an
interface like

newVar :: a -> ST (MutVar a)
readVar :: MutVar a -> ST a
writeVar :: MutVar a -> a -> ST ()
thenST :: ST a -> (a -> ST b) -> ST b
returnST :: a -> ST a

With those operations you simply can't define getState.
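
For concreteness, a minimal sketch of the sort of thing a client *can*
write with just that interface (swapVars is a made-up name; with a Monad
instance built from thenST/returnST you'd use do-notation instead).  The
state itself never appears, so nothing like getState can be defined:

    swapVars :: MutVar a -> MutVar a -> ST ()
    swapVars v1 v2 =
      readVar v1     `thenST` \ x  ->
      readVar v2     `thenST` \ y  ->
      writeVar v1 y  `thenST` \ () ->
      writeVar v2 x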

You might find "State in Haskell" useful (from my Web page).

Haskell 1.3 doesn't define any state-transformer monads, but both GHC
and Hugs do.

Simon






Re: Compiler Crash: ghc-2.05:panic! lookupBindC:no info!

1997-09-03 Thread Simon L Peyton Jones


 The following program (rather condensed as it comes from something
 much larger), crashes the compiler. I am using ghc-2.05 on a Solaris
 box, with one or two patches (including the WwLib one which Simon gave
 me, but this error occurred before applying this patch).

Thanks for this bug report, and for localising it so well.  
It's easily fixed, I'm glad to say.
In
ghc/compiler/simplCore/SimplCore.lhs

replace this equation of tidyCoreExpr.  The lines marked "!!" are the
changed ones.

Simon


 -- Eliminate polymorphic case, for which we can't generate code just yet
tidyCoreExpr (Case scrut (AlgAlts [] (BindDefault deflt_bndr rhs)))
  | not (typeOkForCase (idType deflt_bndr))
  = pprTrace "Warning: discarding polymorphic case:" (ppr PprDebug scrut) $
case scrut of
!!  Var v -> lookupId v `thenTM` \ v' ->
!!   extendEnvTM deflt_bndr v' (tidyCoreExpr rhs)
other -> tidyCoreExpr (Let (NonRec deflt_bndr scrut) rhs)





Re: Standard Haskell

1997-08-25 Thread Simon L Peyton Jones


 In fact, I would like to hear what all the major implementors have as their
 picture of a final version of Haskell.  You've all been pretty quiet.
 I assume you've all already aired your opinions at the workshop, but it would
 be nice to see them here as well.

Reasonable request.
I hope that my contributions to the Std Haskell Web page pretty much
say what I think.  I'm happy to fill in any gaps if someone identifies them.

Simon






Re: what's wrong with instance C a => D a

1997-08-22 Thread Simon L Peyton Jones

 The report says explicitly that instance declarations like
 instance C (a,a) where ...,  or for (Int,a) or for [[a]] are not allowed.

 I would now only like to know why this design decision was made,
 are there any problems with the instance declarations I have in mind?

You might find "Type classes - exploring the design space" useful.

http://www.dcs.gla.ac.uk/~simonpj/multi.ps.gz
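
Briefly, the restriction in question (a rough sketch with made-up classes
C and D): the instance head must be a single type constructor applied to
distinct type variables, so only the first declaration below is legal.

    class C a
    class D a

    instance C a => D [a]      -- legal: [] applied to a type variable
    -- instance D (a, a)       -- rejected: repeated type variable
    -- instance D (Int, a)     -- rejected: non-variable argument
    -- instance D [[a]]        -- rejected: nested type constructor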

Simon





