On 6 Jan 2008, at 5:32 AM, Yitzchak Gale wrote:

(sorry, I hit the send button)

What is the lifted version you are referring to?

Take the ordinary disjoint union, and then add a new _|_ element,
distinct from both existing copies of _|_ (which are still distinct
from each other).
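
(To make the three bottoms concrete --- a small sketch, assuming GHC's
usual semantics; the names are only for illustration:)

    -- In the lifted sum Either a b, undefined, Left undefined and
    -- Right undefined are three different values: pattern matching
    -- tells them apart.
    whichSide :: Either a b -> String
    whichSide (Left  _) = "left"
    whichSide (Right _) = "right"
    -- whichSide (Left undefined)  = "left"
    -- whichSide (Right undefined) = "right"
    -- whichSide undefined         = undefined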

Now why is that not the category-theoretic coproduct?
h . Left = f and h . Right = g both for _|_ and for finite
elements of the types. And it looks universal to me.

Not quite. The only requirement on h _|_ is that, by monotonicity, it be <= f x for all x and <= g y for all y. If f x = 1 = g y for all x, y, then both h _|_ = _|_ and h _|_ = 1 give arrows of the category, so the mediating morphism is not unique and the universal property still fails.
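
Here are the two candidates side by side (a sketch, assuming GHC's
semantics; the constant 1 stands in for any defined result):

    f :: Int -> Int
    f _ = 1

    g :: Bool -> Int
    g _ = 1

    h1, h2 :: Either Int Bool -> Int
    h1 e = case e of            -- strict: h1 undefined = undefined
             Left  x -> f x
             Right y -> g y
    h2 _ = 1                    -- lazy:   h2 undefined = 1
    -- Both satisfy h . Left = f and h . Right = g, so the mediating
    -- morphism is not unique.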

 Of course, Haskell makes things even worse by lifting the
product and exponential objects,

OK, what goes wrong there, and what is the lifting?

Again, in Haskell, (_|_, _|_) /= _|_. The difference is witnessed by the function
f (x, y) = 1, which gives f undefined = undefined but
f (undefined, undefined) = 1.

I don't get that one. Given any f and g, we define h x = (f x, g x).
Why do we not have fst . h = f and snd . h = g, both in Hask
and StrictHask? fst and snd are strict.

Again, if f and g are both strict, we have a choice for h _|_ --- either h _|_ = _|_ or h _|_ = (_|_, _|_) will work (fst . h = f and snd . h = g), but again these are different morphisms.
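
A sketch of the two candidates, with the strictness of f and g forced
by seq (the particular definitions are only illustrative):

    f, g :: Int -> Int
    f x = x `seq` 0
    g x = x `seq` 1

    h1, h2 :: Int -> (Int, Int)
    h1 x = (f x, g x)           -- h1 undefined = (undefined, undefined)
    h2 x = x `seq` (f x, g x)   -- h2 undefined = undefined
    -- fst . h1 = fst . h2 = f and snd . h1 = snd . h2 = g, yet
    -- h1 /= h2: seq (h1 undefined) () = () while
    --           seq (h2 undefined) () = undefined.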


 Unfortunately, this means that (alpha, beta) has an extra _|_
element < (_|_, _|_), so it's not the categorical product (which
would lack such an element for the same reason as above).

The reason you can't adjoin an extra element to (A,B) in, say,
Set, is that you would have nowhere to map it under fst
and snd. But here that is not a problem, _|_ goes to _|_
under both.

This is partly an implementation issue --- compiling pattern matching without introducing such a lifting requires a parallel implementation

That's interesting. So given a future platform where parallelizing
is much cheaper than it is today, we could conceivably have
a totally lazy version of Haskell. I wonder what it would be like
to program in that environment, what new problems would arise
and what new techniques would solve them. Sounds like a nice
research topic. Who is working on it?

--- and partly a semantic issue --- data introduces a single level of
lifting, every time it is used, and every constructor is completely
lazy.
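
A sketch of what "a single level of lifting, every time" means in
practice, with newtype as the standard contrast (illustrative names
only):

    data    Wrap  a = Wrap  a   -- lifted:   Wrap undefined /= undefined
    newtype Wrap' a = Wrap' a   -- unlifted: Wrap' undefined = undefined

    isThere :: Wrap a -> Bool
    isThere (Wrap _) = True
    -- isThere (Wrap undefined) = True
    -- isThere undefined        = undefined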

Unless you use bangs. So both options are available, and that
essentially is what defines Haskell as being a non-strict language.

(!alpha, !beta) isn't a categorical product, either. snd (undefined, 1) = undefined with this type.
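
Sketched with a strict-field data type standing in for (!alpha, !beta),
since Haskell has no strict tuple syntax as such:

    data SP a b = SP !a !b      -- hypothetical strict pair

    fstSP :: SP a b -> a
    fstSP (SP x _) = x

    sndSP :: SP a b -> b
    sndSP (SP _ y) = y
    -- sndSP (SP undefined 1) = undefined, because SP forces both fields,
    -- whereas a true product needs its projection to give 1 here.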


Functions have the same issue --- in the presence of seq, undefined /= const undefined.
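
For instance (a sketch):

    observe :: (a -> b) -> Bool
    observe f = f `seq` True
    -- observe (const undefined) = True
    -- observe undefined         = undefined
    -- so the eta law (\ x -> f x) = f fails: \ x -> undefined x is
    -- distinguishable from undefined by seq.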

Right. I am becoming increasingly convinced that the seq issue
is a red herring.

Care to give an explanation?


Extensionality is a key part of the definition of all of these
constructions.  The categorical rules are designed to require, in
concrete categories, that the ranges of the two injections into a
coproduct form a partition of the coproduct, that the surjective pairing
law (fst x, snd x) = x holds, and that the eta reduction law (\ x -> f x)
= f holds. Haskell flouts all three; while some categories have few enough morphisms to get away with this (at least sometimes), Hask is
not one of them.
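
The partition failure is just the extra _|_ in the lifted sum above, and
the eta failure is the seq example; surjective pairing fails like this
(a sketch):

    repack :: (a, b) -> (a, b)
    repack p = (fst p, snd p)
    -- repack undefined = (undefined, undefined) /= undefined:
    -- seq (repack undefined) () = (), seq undefined () = undefined,
    -- so (fst x, snd x) = x fails at x = undefined.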

That interpretation is not something that is essential in the notion
of category, only in certain specific examples of categories
that you know.

I understand category theory. I also know that the definitions used are chosen to get Set `right' --- which means extensionality in that case --- and are then extended uniformly across all categories. This has the effect of requiring, in other concrete categories, constructions similar to those in Set.

Product and coproduct in any given category -
whether they exist, what they are like if they exist, and what alternative constructions exist if they do not - reflect the nature of the structure
that the category is modeling.

I understand that.  I'm not sure you do.


I am interested in understanding the category Hask that represents
what Haskell really is like as a programming language.

Good luck.

Not
under some semantic mapping that includes non-computable
functions, and that forgets some of the interesting structure
that comes from laziness (though that is undoubtedly also
very interesting).

Bear in mind in your quest that at the end of it you'll most likely conclude, like everyone else, that good old equational reasoning is sound for the programs you actually write at least 90% of the time (with appropriate induction principles), and complete for at least 90% of what you want to write, and you'll go back to using it exclusively for real programming.

jcc


_______________________________________________
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe
