Folks,
Soon after offering my home page as a source of the paper Phil
mentioned the other day, we had a disk go bad which led to
403 Forbidden
Your client does not have permission to get URL /~jl from this server.
The problem has now been fixed, so you should be able to get through.
I would like to respond to John's note. My response is largely positive,
though I disagree with a couple of points.
However, it is an independent question whether or not strictness annotations
should be applicable to function types. And this is where I disagree with
the committee. To quote
John says we can't go from a function to its concrete representation,
i.e. E -> [E]. OK. He hints that implementations are from concrete
representations to real functions, i.e. [E] -> E. I disagree profoundly.
I'm not surprised you disagree. I hinted no such thing. Implementations
manipulate
Greg's argument about function denotations is valid up to a point. However,
when optimising functional programs we rely on *denotational* equivalence
to justify substitution, not just beta equivalence. So for example, we
can prove that
foldr (:) [] = id :: [a] -> [a]
in the sense that given any
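The finite-list direction of that equation is easy to check by running it; the full claim is denotational, covering partial and infinite lists as well. A minimal sketch (the name `rebuild` is mine, not from the thread):

```haskell
-- rebuild replaces every (:) with (:) and [] with [], i.e. it copies
-- the list spine; denotationally it equals id at list types.
rebuild :: [a] -> [a]
rebuild = foldr (:) []
```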
I think there is another problem with having strict constructors. It messes
up parametricity even more than fix does. There are two reasons why this
would be a shame:
* Parametricity is cool. It lets you prove lots of interesting theorems
for very little work. These theorems help with program
But I think we can have the cake and eat it too, if we get rid of the
restriction (which I never liked) that operators beginning with : must be a
constructor: just define
a := b = (a,b)
Unfortunately that won't work if := had been used in patterns. I think
backward compatibility is an issue.
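For concreteness, here is the shape of the proposal. Since the restriction still stands in Haskell as it is, := cannot actually be defined as a function today, so this sketch uses a hypothetical non-colon operator (.=) to show the intended behaviour:

```haskell
-- Hypothetical stand-in for the proposed (:=): an ordinary operator
-- that just pairs its arguments, usable but not matchable in patterns
-- as a constructor would be.
infixr 0 .=
(.=) :: a -> b -> (a, b)
a .= b = (a, b)
```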
I feel the need to be inflammatory:
I believe n+k should go.
There are lots of good reasons why they should go, of course. The question
is: are there any good reasons why they should stay? My understanding is
that the only reason they are advocated is that they make teaching
induction easier.
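For reference, the teaching argument looks like this. The n+k clause appears only in a comment, since it needs the contested feature; the guard form below is the usual translation:

```haskell
-- With n+k patterns the inductive step reads off the page:
--   fact 0     = 1
--   fact (n+1) = (n+1) * fact n
-- The equivalent without them. Note that negative arguments simply
-- fall through, which is exactly where the trouble with negative
-- numbers shows up.
fact :: Integer -> Integer
fact 0 = 1
fact m | m > 0 = m * fact (m - 1)
```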
It sounds to me as if the problem is with negative numbers.
So, one more time ... What about the *natural* numbers?
Doesn't anyone else program with these? (Maybe just occasionally? :-)
The problem is only partly to do with naturals. Having these would
certainly improve matters but I suspect
Some people have referred to semantic issues and Abramsky and Ong's work
when contributing to the lifted/unlifted debate. I think it would be fair
to summarise them as follows.
Pro lifting
===
The simplest possible lambda calculus has lifting in its canonical model as
soon as
More on liftings:
In our FPCA 91 paper, Simon and I came to the conclusion that the "proper"
way to give semantics to data declarations was as follows. If
data T = A U V | B W
then the model for T (written here T*) is
T* = ((U* x V*) + W*)_\bot
where the x is pure domain product, the +
I have a followup question to Lennart's, motivated by the same
IOHCC entry.
The Haskell report states that in an abbreviated module, the
header is assumed to be
module Main where
which makes all the identifiers exportable (and so the monomorphism
restriction could bite the unwary). I tend to
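A small example of the bite, assuming current defaulting behaviour: with the implied Main header everything is exported, and a signature-less top-level binding has its Num constraint defaulted rather than generalised:

```haskell
module Main where          -- the header an abbreviated module is given

-- No signature: under the monomorphism restriction the Num constraint
-- on n is not generalised, and module-level defaulting fixes it
-- at Integer.
n = 2 ^ 10

main :: IO ()
main = print n
```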
I love Simon's suggestion. It gives me all the right vibes. And <- seems to
me to be the right connective to use.
At the risk of beating my hobby horse, let's not think of <- solely in
terms of monads. It is certainly appropriate there, but it is also
appropriate for lists when thought of purely
I think I favor "20th century Haskell" myself :-)
Hassett wrote:
On 9/8/98 5:10 PM, Andrew Rock wrote
If Standard Haskell is meant to be a stable target for texts and the like,
why not Haskell-Ed (for Education), perhaps with a version indication like
Haskell-Ed-98.
Unfortunately,
Folks,
My student Levent Erkok and I have been playing about with the idea of mutually
recursive bindings in the do notation. Mostly it's clear how they should behave, but
we struggle with the following example. I would love to hear some (considered)
opinions about this.
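One case that does seem clear, sketched here with the mdo syntax as it later appeared in GHC (so the keyword and extension are assumptions relative to this message): a binding may mention itself, and for IO the recursion goes through fixIO:

```haskell
{-# LANGUAGE RecursiveDo #-}
module Main where

-- xs is in scope on its own right-hand side; return never forces xs,
-- so the knot ties successfully and yields a cyclic list of ones.
ones :: IO [Int]
ones = mdo xs <- return (1 : xs)
           return xs

main :: IO ()
main = ones >>= print . take 5
```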
John.
Jan Brosius wrote:
I have read the online postscript paper "Lazy Functional State Threads".
I think the typing (page 4)
f :: MutVar s a -> MutVar s a
is wrong, since the definition of f is:
f v = runST ( newVar v `thenST` \w ->
readVar w)
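In today's libraries the paper's MutVar/newVar/readVar/thenST survive as STRef/newSTRef/readSTRef and monadic bind, which makes the typing point easy to check: runST closes off the state thread, so f gets the plain type a -> a rather than a MutVar type:

```haskell
import Control.Monad.ST (runST)
import Data.STRef (newSTRef, readSTRef)

-- Allocate a reference holding v inside a local state thread,
-- then read it straight back out.
f :: a -> a
f v = runST (newSTRef v >>= readSTRef)
```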
I think you mean
Note that these have omitted the behavior of the function on bottom, so even on
finite domains the description is not complete.
john.
[EMAIL PROTECTED] wrote:
George writes:
There is no problem with Showing functions with finite domains.
For example, try:
module ShowFun where
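The fragment above is cut off, but the idea presumably runs along these lines (my reconstruction, not the original module): enumerate a Bounded, Enum domain and print the function's graph, with the caveat already noted that behaviour on bottom is invisible:

```haskell
{-# LANGUAGE FlexibleInstances #-}
module ShowFun where

import Data.List (intercalate)

-- Show a function by tabulating it over its finite domain.
instance (Bounded a, Enum a, Show a, Show b) => Show (a -> b) where
  show f = "{" ++ intercalate ", "
                    [ show x ++ " -> " ++ show (f x)
                    | x <- [minBound .. maxBound] ]
               ++ "}"
```

For example, show not yields {False -> True, True -> False}.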
Koen,
If a language has the property that in one place, one can
use a "let" block to define polymorphic bindings, and in
another place one can only use it for monomorphic bindings,
then I think that is bad language design.
I don't think that's a very accurate description. The "let" in "do"
Jeff,
Isn't that a bit of a dodgy argument? I don't know of any papers on `in'
polymorphism, but many about `let' polymorphism. If I see `let', I expect
polymorphism, and I'm not going to go searching for an `in'.
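To pin down what is at stake: under let-polymorphism the bound variable is generalised, so one binding serves several types in the body. A minimal example:

```haskell
-- f is generalised to forall a. a -> a at the let,
-- so it can be applied at Bool and at Char in the same body.
pair :: (Bool, Char)
pair = let f = \x -> x
       in (f True, f 'a')
```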
Not true (or if true, misguided). Consider the type rule for "let". The
Hi,
A discussion thread earlier this year asked about the existence (or otherwise) of jobs
involving programming in Haskell and other FP languages. Well, they really do exist,
and we have some openings. If you're interested you can find out more at
http://www.galconn.com.
John.
I agree with Koen: \\ is list subtraction and we're all used to subtraction being left
associative.
John.
Jon Fairbairn wrote:
On Wed, 17 Jan 2001, Koen Claessen wrote:
I propose that it gets the following fixity:
infixl 5 \\
Unless it's common usage outside of Haskell, I
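A concrete reading of the proposal; the parentheses below show the grouping that infixl 5 would make the default (as standardised, Data.List declares \\ non-associative, so they remain mandatory):

```haskell
import Data.List ((\\))

-- Left-associated list subtraction: remove 1 first, then remove 2.
example :: [Int]
example = ([1,2,3,4] \\ [1]) \\ [2]
```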
I strongly disapprove of n+k patterns from a whole-language taste
perspective, so I am most unkeen to broaden their scope. Because they are
such a language kludge already it simply doesn't make sense to try to reason
rationally about what the best answer for them is. It's like putting
lipstick on
My questions are: Were the designers of the implicit
parameters paper aware of this problem when they wrote the
paper? If so, they probably did not think this was a big
problem. Do people in general think this is a problem?
We certainly were aware. It is a problem, and a big one. The
I watched with interest the discussion about the ravages of `seq`.
In the old days, we protected uses of `seq` with a type class, to keep it at
bay. There was no function instance (so no problem with the state monads, or
lifting of functions in general), and the type class prevented interference
Yes. Let me be clear.
It is not the fact that `seq` operates on functions that breaks foldr/build:
it is the fact that `seq` claims to be parametrically polymorphic when in
fact it is not. The parametricity result is weakened to the point that the
foldr/build proof no longer applies, and a
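A one-line demonstration of the non-parametric power being described: seq can observe whether a value of function type is bottom or a lambda, a distinction no parametrically polymorphic function could make:

```haskell
-- The lambda is already in weak head normal form, so seq succeeds
-- without ever applying it; replacing the lambda by undefined
-- (i.e. undefined `seq` 1) would diverge instead.
probe :: Int
probe = (\_ -> (undefined :: Int)) `seq` 1
```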
I don't know any way to make unsafePerformIO type-safe without imposing
some drastic or complicated restriction. Something in the back of
my mind tells me that John Launchbury has another example of
type unsafety induced by unsafePerformIO but I can't remember what;
so I'm cc'ing him
--
Skype: robert.will
___
Haskell-prime mailing list
Haskell-prime@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-prime
John Launchbury | galois | (503)626-6616 x104
that `---` and (---) are not dual.
Can you regain duality by *also* providing a definition of `---` and
(---) at the level of the context free grammar?
John
On Jul 9, 2009, at 4:22 AM, Simon Marlow wrote:
On 08/07/2009 23:06, k
I don't think this is a bug. I do not expect to be able to unfold a definition
without some syntactic issues. For example,
two = 1+1
four = 2 * two
but unfolding fails (four = 2 * 1 + 1). In general, we expect to have to
parenthesize things when unfolding them.
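Spelled out, since the arithmetic is the whole point:

```haskell
two, four, naive, ok :: Int
two   = 1 + 1
four  = 2 * two          -- 4
naive = 2 * 1 + 1        -- naive textual unfolding: (2 * 1) + 1 = 3
ok    = 2 * (1 + 1)      -- parenthesised unfolding recovers 4
```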
John
On Feb 13, 2010, at