I don't think this is a bug. I do not expect to be able to unfold a definition
without some syntactic issues. For example,
two = 1+1
four = 2 * two
but unfolding fails (four = 2 * 1 + 1). In general, we expect to have to
parenthesize things when unfolding them.
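Spelled out as a compilable sketch (the definitions are the ones above; the parenthesized variant is the fix):

```haskell
two :: Int
two = 1 + 1

four :: Int
four = 2 * two           -- 4

-- Naive textual unfolding changes the parse:
fourWrong :: Int
fourWrong = 2 * 1 + 1    -- parses as (2 * 1) + 1 = 3, not 4

-- Unfolding must insert parentheses:
fourRight :: Int
fourRight = 2 * (1 + 1)  -- 4
```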
On Feb 13, 2010, at
that `---` and (---) are not dual.
Can you regain duality by *also* providing a definition of `---` and
(---) at the level of the context free grammar?
John Launchbury | galois | (503)626-6616 x104
On Jul 9, 2009, at 4:22 AM, Simon Marlow wrote:
On 08/07/2009 23:06, k
Haskell-prime mailing list
Yes. Let me be clear.
It is not the fact that `seq` operates on functions that breaks foldr/build:
it is the fact that `seq` claims to be parametrically polymorphic when in
fact it is not. The parametricity result is weakened to the point that the
foldr/build proof no longer applies, and a
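A concrete counterexample in this vein (a sketch: build here is the standard definition from the foldr/build paper, and the fusion law is foldr k z (build g) = g k z; the function g0 is my illustration):

```haskell
{-# LANGUAGE RankNTypes #-}
import Control.Exception (SomeException, evaluate, try)

-- The standard definition; fusion claims: foldr k z (build g) = g k z,
-- justified by parametricity of g's type.
build :: (forall b. (a -> b -> b) -> b -> b) -> [a]
build g = g (:) []

-- Only expressible because seq claims to be fully polymorphic:
g0 :: (Int -> b -> b) -> b -> b
g0 c n = c `seq` n

main :: IO ()
main = do
  -- Left side: build g0 forces (:) (which is defined), giving [], so the
  -- foldr never touches its bottom cons and returns 0.
  print (foldr (error "bottom cons") (0 :: Int) (build g0 :: [Int]))
  -- Right side: g0 forces the bottom cons directly and diverges,
  -- so the two sides of the fusion law disagree.
  r <- try (evaluate (g0 (error "bottom cons") (0 :: Int)))
         :: IO (Either SomeException Int)
  putStrLn (either (const "bottom") show r)
```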
I watched with interest the discussion about the ravages of `seq`.
In the old days, we protected uses of `seq` with a type class, to keep it at
bay. There was no function instance (so no problem with the state monads, or
lifting of functions in general), and the type class prevented interference
My questions are: Were the designers of the implicit
parameters paper aware of this problem when they wrote the
paper? If so, they probably did not think this was a big
problem. Do people in general think this is a problem?
We certainly were aware. It is a problem, and a big one. The
I strongly disapprove of n+k patterns from a whole-language taste
perspective, so I am most unkeen to broaden their scope. Because they are
such a language kludge already it simply doesn't make sense to try to reason
rationally about what the best answer for them is. It's like putting
As I recall, this was something we addressed in the original paper,
indicating the extreme danger of the monomorphism restriction in the
context of implicit parameters. They simply don't mix.
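The interaction can be seen in a few lines (this mirrors the well-known example in the GHC documentation; the behavior described is GHC's):

```haskell
{-# LANGUAGE ImplicitParams #-}

-- Under the monomorphism restriction, y is monomorphic: its ?x constraint
-- is resolved at the binding site. Without the restriction, y keeps the
-- (?x :: Int) constraint and re-reads ?x at each use site.
example :: Int
example =
  let ?x = (1 :: Int) in
  let y = ?x + 1 in        -- MR: y :: Int, with ?x already fixed to 1
  let ?x = (2 :: Int) in
  y                        -- MR: 2; NoMonomorphismRestriction: 3
```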
Down with the MR. Let's settle on a viable alternative as soon as possible.
On the more general
I agree with Koen: \\ is list subtraction and we're all used to subtraction being left
Jon Fairbairn wrote:
On Wed, 17 Jan 2001, Koen Claessen wrote:
I propose that it gets the following fixity:
infixl 5 \\
Unless it's common usage outside of Haskell, I
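For comparison, the two possible readings, parenthesized explicitly (using today's Data.List.(\\)); under infixl 5 the unparenthesized a \\ b \\ c would mean the first:

```haskell
import Data.List ((\\))

main :: IO ()
main = do
  print (([1,2,3,4] \\ [1]) \\ [2])  -- left-nested: [3,4]
  print ([1,2,3,4] \\ ([1] \\ [2]))  -- right-nested: [2,3,4]
```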
A discussion thread earlier this year asked about the existence (or otherwise) of jobs
involving programming in Haskell and other FP languages. Well, they really do exist,
and we have some openings. If you're interested you can find out more at
If a language has the property that in one place, one can
use a "let" block to define polymorphic bindings, and in
another place one can only use it for monomorphic bindings,
then I think that is bad language design.
I don't think that's a very accurate description. The "let" in "do"
Isn't that a bit of a dodgy argument? I don't know of any papers on `in'
polymorphism, but many about `let' polymorphism. If I see `let', I expect
polymorphism, and I'm not going to go searching for an `in'.
Not true (or if true, misguided). Consider the type rule for "let". The
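For what it's worth, a let inside do is still generalized when no class constraints are involved; a small check (example mine):

```haskell
main :: IO ()
main = do
  let f = id        -- generalized: f :: forall a. a -> a, even under the MR
  print (f True)
  print (f 'a')
```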
Note that these have omitted the behavior of the function on bottom, so even on
finite domains, the description is not complete.
[EMAIL PROTECTED] wrote:
There is no problem with Showing functions with finite domains.
For example, try:
module ShowFun where
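The module presumably continued along these lines; a reconstruction (this instance and its tabulation are my sketch, not the original text):

```haskell
{-# LANGUAGE FlexibleInstances #-}
module ShowFun where

-- Show a function on a finite domain by tabulating it. As noted above,
-- this says nothing about the function's behavior on bottom.
instance Show (Bool -> Bool) where
  show f = show [ (x, f x) | x <- [False, True] ]
```

For example, show not yields "[(False,True),(True,False)]".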
I don't know any way to make unsafePerformIO type-safe without imposing
some drastic or complicated restriction. Something in the back of
my mind tells me that John Launchbury has another example of
type unsafety induced by unsafePerformIO but I can't remember what;
so I'm cc'ing him
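One well-known construction (folklore; whether it is the example meant here I can't say) uses unsafePerformIO to build a "polymorphic" reference, from which an unrestricted cast follows:

```haskell
import Data.IORef
import System.IO.Unsafe (unsafePerformIO)

-- The type claims this reference stores [a] for every a.
ref :: IORef [a]
ref = unsafePerformIO (newIORef [])
{-# NOINLINE ref #-}

-- Write at one type, read at another: an unsound general cast.
cast :: a -> b
cast x = unsafePerformIO $ do
  writeIORef ref [x]
  xs <- readIORef ref
  return (head xs)
-- (cast True :: Int) typechecks -- and reinterprets the heap object.
```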
Jan Brosius wrote:
I have read the online postscript paper " Lazy Functional State Threads".
I think the typing (page 4)
f :: MutVar s a -> MutVar s a
is wrong, since the definition of f is:
f v = runST (newVar v `thenST` \w ->
I think you mean
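In today's vocabulary (ST/STRef rather than runST/MutVar/newVar), the point is that runST's rank-2 type prevents the state variable from escaping:

```haskell
import Control.Monad.ST
import Data.STRef

-- Fine: the reference stays inside its state thread.
ok :: Int -> Int
ok v = runST $ do
  w <- newSTRef v
  readSTRef w

-- Rejected by the type checker: the result type STRef s Int mentions s,
-- which runST :: (forall s. ST s a) -> a does not allow to escape.
-- bad v = runST (newSTRef v)
```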
My student Levent Erkok and I have been playing about with the idea of mutually
recursive bindings in the do notation. Mostly it's clear how they should behave, but
we struggle with the following example. I would love to hear some (considered)
opinions about this.
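For reference, GHC later adopted one answer via RecursiveDo and mfix; a sketch of the behavior in that setting (not necessarily the semantics under discussion here):

```haskell
{-# LANGUAGE RecursiveDo #-}

-- Mutually recursive do-bindings, desugared through mfix for IO:
-- xs refers to ys, which is bound later in the block.
main :: IO ()
main = mdo
  xs <- return (1 : ys)
  ys <- return (2 : xs)
  print (take 5 (xs :: [Int]))  -- the knot ties to 1:2:1:2:...
```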
I think I favor "20th century Haskell" myself :-)
On 9/8/98 5:10 PM, Andrew Rock wrote:
If Standard Haskell is meant to be a stable target for texts and the like,
why not Haskell-Ed (for Education), perhaps with a version indication like
In the Libraries: state of play document, the sixth point is
1. There are other library functions in a similar vein to
take and drop, such as splitAt, !!, elemIndex, findIndex;
I take it that these are also covered.
I hope so
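The functions in question, for concreteness (examples mine):

```haskell
import Data.List (elemIndex, findIndex)

main :: IO ()
main = do
  print (splitAt 2 "abcde")        -- ("ab","cde"): take and drop at once
  print ("abcde" !! 1)             -- 'b': positional indexing
  print (elemIndex 3 [1,2,3])      -- Just 2
  print (findIndex even [1,2,3])   -- Just 1
```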
When we discussed this before I appealed for
I love Simon's suggestion. It gives me all the right vibes. And <- seems to
me to be the right connective to use.
At the risk of beating my hobby horse, let's not think of <- solely in
terms of monads. It is certainly appropriate there, but it is also
appropriate for lists when thought of purely
Soon after offering my home page as a source of the paper Phil
mentioned the other day, we had a disk go bad which led to
Your client does not have permission to get URL /~jl from this server.
The problem has now been fixed, so you should be able to get through.
I would like to respond to John's note. My response is largely positive,
though I disagree with a couple of points.
However, it is an independent question whether or not strictness annotations
should be applicable to function types. And this is where I disagree with
the committee. To quote
John says we can't go from a function to its concrete representation,
i.e. E -> [E] -- OK. He hints that implementations are from concrete
representations to real functions, i.e. [E] -> E. I disagree profoundly.
I'm not surprised you disagree. I hinted no such thing. Implementations
Greg's argument about function denotations is valid up to a point. However,
when optimising functional programs we rely on *denotational* equivalence
to justify substitution, not just beta equivalence. So for example, we
can prove that
foldr (:) [] = id :: [a] -> [a]
in the sense that given any
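Checked on sample inputs (the finite cases; the denotational claim also covers partial and infinite lists, which is the point of the argument above):

```haskell
main :: IO ()
main = do
  print (foldr (:) [] [1,2,3 :: Int])            -- behaves as id
  print (take 3 (foldr (:) [] [1 ..]) :: [Int])  -- and lazily, so it is id
                                                 -- on infinite lists too
```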
I have a followup question to Lennart's motivated by the same
The Haskell report states that in an abbreviated module, the
header is assumed to be
module Main where
which makes all the identifiers exportable (and so the monomorphism
restriction could bite the unwary). I tend to
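A small instance of that bite (the binding f is my example; the defaulting behavior is standard Haskell):

```haskell
module Main where

-- Exported and unannotated: f has a Num constraint, so under the
-- monomorphism restriction it is not generalized, and defaulting fixes
-- it at Integer -> Integer -> Integer.
f = (+)

main :: IO ()
main = print (f 1 2)   -- fine at Integer
-- print (f 1.5 2.5)   -- would be rejected: f is already monomorphic
```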
Some people have referred to semantic issues and Abramsky and Ong's work
when contributing to the lifted/unlifted debate. I think it would be fair
to summarise them as follows.
The simplest possible lambda calculus has lifting in its canonical model as
More on liftings:
In our FPCA 91 paper, Simon and I came to the conclusion that the "proper"
way to give semantics to data declarations was, as follows. If
data T = A U V | B W
then the model for T (written here T*) is
T* = ((U* x V*) + W*)_\bot
where the x is pure domain product, the +
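The practical face of that outer lifting (the _\bot) in Haskell, with concrete stand-ins for U, V, W (example mine):

```haskell
data T = A Int Bool | B Char

tag :: T -> String
tag (A _ _) = "A"
tag (B _)   = "B"

-- tag (A undefined undefined) = "A": the outer constructor is defined even
-- when both fields are bottom. tag undefined, by contrast, is bottom --
-- that distinction is exactly the extra point added by the outer lifting.
```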
I feel the need to be inflammatory:
I believe n+k should go.
There are lots of good reasons why they should go, of course. The question
is: are there any good reasons why they should stay? My understanding is
that the only reason they are advocated is that they make teaching
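For readers who haven't met them, an n+k definition and its plain replacement (the fact example is mine):

```haskell
-- n+k style (Haskell 98; the pattern matches only when the argument >= 1):
--   fact 0     = 1
--   fact (n+1) = (n+1) * fact n

-- The same function without n+k patterns:
fact :: Integer -> Integer
fact 0 = 1
fact n | n > 0 = n * fact (n - 1)
fact _ = error "fact: negative argument"
```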
It sounds to me as if the problem is with negative numbers.
So, one more time ... What about the *natural* numbers?
Doesn't anyone else program with these? (Maybe just occasionally? :-)
The problem is only partly to do with naturals. Having these would
certainly improve matters but I suspect
But just because they call it `lazy' doesn't mean that it really is
the essence of laziness. I prefer to use the more neutral name `lifted
lambda calculus' for their calculus.
I disagree. In the simplest case (just lambdas, variables and applications,
i.e. no explicit constructors), it is
I think there is another problem with having strict constructors. It messes
up parametricity even more than fix does. There are two reasons why this
would be a shame:
* Parametricity is cool. It lets you prove lots of interesting theorems
for very little work. These theorems help with program
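The connection to seq, concretely: a strict field folds a seq into construction, so a constructor can observe definedness just as seq can (example mine):

```haskell
import Control.Exception (SomeException, evaluate, try)

data Lazy a   = Lazy a
data Strict a = Strict !a   -- Strict x behaves like x `seq` Lazy x

main :: IO ()
main = do
  -- The lazy constructor never forces its field:
  putStrLn (case Lazy (undefined :: Int) of Lazy _ -> "ok")
  -- The strict one does, so merely building the value diverges:
  r <- try (evaluate (case Strict (undefined :: Int) of
                        Strict _ -> "ok")) :: IO (Either SomeException String)
  putStrLn (either (const "bottom") id r)
```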
Here are three comments directed particularly at Haskell 1.3 people, but
obviously open to general feedback.
1. We should get rid of Assoc.
When explaining my programs to other people I find this is a point of
confusion. Imagine explaining array construction, "When I define an array,
But I think we can have the cake and eat it too, if we get rid of the
restriction (which I never liked) that operators beginning with : must be a
constructor: just define
a := b = (a,b)
Unfortunately that won't work if := had been used in patterns. I think
backward compatibility is an issue.
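For the record, what is expressible without lifting the restriction: make (:=) an honest infix constructor (the Assoc name here is assumed), which then works in patterns as well:

```haskell
-- (:=) as a real constructor: usable in expressions and in patterns.
data Assoc a b = a := b
  deriving Show

ex :: Assoc Int String
ex = 1 := "one"

key :: Assoc a b -> a
key (k := _) = k
```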