Re: [Haskell-cafe] Basic question concerning the category Hask (was: concerning data constructors)

2008-01-06 Thread Yitzchak Gale
I wrote:
 What goes wrong with finite coproducts? The obvious thing to
 do would be to take the disjoint union of the sets representing the
 types, identifying the copies of _|_.

Jonathan Cast wrote:
 This isn't a coproduct.  If we have f x = 1 and g y = 2, then there
 should exist a function h such that h . Left = f and h . Right = g...
 But by your rule, Left undefined = Right undefined...
 Which is a contradiction...

OK, thanks.

 Unless, of course,
 you require your functions to be strict --- then both f and g above
 become illegal, and repairing them removes the problem.

You don't have to make them illegal - just not part
of your notion of coproduct. That is an entirely
category-theoretic concept, since Empty is bi-universal,
and a morphism is strict iff the diagram

        f
    A -----> B
     \      ^
      v    /
     Empty

commutes.

However, the coproduct you get is the one I suggested,
namely Either !A !B, not the one we usually use.
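A sketch of that strict coproduct in code (the StrictEither name and the
strict constructor fields are just an illustration, not library code):

    -- Strict fields force their arguments, so
    -- SLeft undefined = SRight undefined = undefined,
    -- and the troublesome distinction between Left _|_ and Right _|_ is gone.
    data StrictEither a b = SLeft !a | SRight !b

    strictEither :: (a -> c) -> (b -> c) -> StrictEither a b -> c
    strictEither f _ (SLeft x)  = f x
    strictEither _ g (SRight y) = g y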

  What is the lifted version you are referring to?

 Take the ordinary disjoint union, and then add a new _|_ element,
 distinct from both existing copies of _|_ (which are still distinct
 from each other).

 
   Of course, Haskell makes things even worse by lifting the
  product and exponential objects,
 
  OK, what goes wrong there, and what is the lifting?

 Again, in Haskell, (_|_, _|_) /= _|_.  The difference is in the function

 f (x, y) = 1

 which gives f undefined = undefined but f (undefined, undefined) =
 1.  Unfortunately, this means that (alpha, beta) has an extra _|_
 element < (_|_, _|_), so it's not the categorical product (which
 would lack such an element for the same reason as above).
 This is partly an implementation issue --- compiling pattern matching
 without introducing such a lifting requires a parallel implementation
 --- and partly a semantic issue --- data introduces a single level of
 lifting, every time it is used, and every constructor is completely
 lazy.

 Functions have the same issue --- in the presence of seq, undefined /= const undefined.

 Extensionality is a key part of the definition of all of these
 constructions.  The categorical rules are designed to require, in
 concrete categories, that the range of the two injections into a
 coproduct form a partition of the coproduct, the surjective pairing
 law (fst x, snd x) = x holds, and the eta reduction law (\ x -> f x)
 = f holds.  Haskell flouts all three; while some categories have few
 enough morphisms to get away with this (at least some times), Hask is
 not one of them.

 jcc




Re: [Haskell-cafe] Basic question concerning the category Hask (was: concerning data constructors)

2008-01-06 Thread Derek Elkins
On Wed, 2008-01-02 at 15:49 +0200, Yitzchak Gale wrote:

[...]

 Some people are worried that this version of Hask is missing
 certain nice properties that one would like to have. For
 example, it was recently claimed on this list that tuples
 are not products in that category. (Or some such. I would be
 interested to see a demonstration of that.)

Jonathan has given such a demonstration (and it has been demonstrated
many times on this list since its creation; it's well-known).

 I am not impressed by those complaints. As usual in category
 theory, you define corresponding notions in Hask, and prove
 that they are preserved under the appropriate functors.
 That should always be easy. And if ever it is not, then you
 have discovered an interesting non-trivial consequence of
 laziness that deserves study.

You are right not to be impressed by such complaints, but you
misrepresent people's views on this by saying that they worry about
such problems.

As you say (people say), these properties [that Hask is cartesian closed
to start] would be nice to have and are very convenient to assume which
is often safe enough.  Certainly computer scientists of a categorical
bent have developed (weaker) notions to use; namely, monoidal,
pre-monoidal, Freyd, and/or kappa categories and no doubt others.  Using
these, however, removes some of the allure of using a categorical
approach.  Also, there is a Haskell-specific problem at the very get-go.
The most obvious choice for the categorical composition operator,
given the obvious choice of arrows and objects, does not work: it
does not satisfy the laws of a category if the = used in them is
observational equality.  Namely, it is not the case that
id . f = f = f . id for all functions f; in particular, both laws
fail when f = undefined.  This can easily be fixed by making the
categorical (.) strict in both arguments, and there is no formal
problem with it being different from Haskell's (.), but it certainly
is not intuitively appealing.
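A minimal sketch of that strict composition (the (.!) name is only
suggestive; the definition is an illustration, not a standard operator):

    infixr 9 .!
    (.!) :: (b -> c) -> (a -> b) -> a -> c
    f .! g = f `seq` g `seq` (f . g)

    -- id .! undefined = undefined = undefined .! id, and for defined f
    -- the composite id .! f is observationally equal to f, so the
    -- identity laws hold up to observational equality.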



Re: [Haskell-cafe] Basic question concerning the category Hask (was: concerning data constructors)

2008-01-06 Thread Yitzchak Gale
(sorry, I hit the send button)

 What is the lifted version you are referring to?

 Take the ordinary disjoint union, and then add a new _|_ element,
 distinct from both existing copies of _|_ (which are still distinct
 from each other).

Now why is that not the category-theoretic coproduct?
h . Left = f and h . Right = g both for _|_ and for finite
elements of the types. And it looks universal to me.

  Of course, Haskell makes things even worse by lifting the
 product and exponential objects,

 OK, what goes wrong there, and what is the lifting?

 Again, in Haskell, (_|_, _|_) /= _|_.  The difference is in the function
 f (x, y) = 1 which gives f undefined = undefined but
 f (undefined, undefined) = 1.

I don't get that one. Given any f and g, we define h x = (f x, g x).
Why do we not have fst . h = f and snd . h = g, both in Hask
and StrictHask? fst and snd are strict.

  Unfortunately, this means that (alpha, beta) has an extra _|_
 element < (_|_, _|_), so it's not the categorical product (which
 would lack such an element for the same reason as above).

The reason you can't adjoin an extra element to (A,B) in, say,
Set, is that you would have nowhere to map it under fst
and snd. But here that is not a problem, _|_ goes to _|_
under both.

 This is partly an implementation issue --- compiling pattern matching
 without introducing such a lifting requires a parallel implementation

That's interesting. So given a future platform where parallelizing
is much cheaper than it is today, we could conceivably have
a totally lazy version of Haskell. I wonder what it would be like
to program in that environment, what new problems would arise
and what new techniques would solve them. Sounds like a nice
research topic. Who is working on it?

 --- and partly a semantic issue --- data introduces a single level of
 lifting, every time it is used, and every constructor is completely
 lazy.

Unless you use bangs. So both options are available, and that
essentially is what defines Haskell as being a non-strict language.

 Functions have the same issue --- in the presence of seq, undefined /
 = const undefined.

Right. I am becoming increasingly convinced that the seq issue
is a red herring.

 Extensionality is a key part of the definition of all of these
 constructions.  The categorical rules are designed to require, in
 concrete categories, that the range of the two injections into a
 coproduct form a partition of the coproduct, the surjective pairing
 law (fst x, snd x) = x holds, and the eta reduction law (\ x -> f x)
 = f holds.  Haskell flouts all three; while some categories have few
 enough morphisms to get away with this (at least some times), Hask is
 not one of them.

That interpretation is not something that is essential in the notion
of category, only in certain specific examples of categories
that you know. Product and coproduct in any given category -
whether they exist, what they are like if they exist, and what alternative
constructions exist if they do not - reflect the nature of the structure
that the category is modeling.

I am interested in understanding the category Hask that represents
what Haskell really is like as a programming language. Not
under some semantic mapping that includes non-computable
functions, and that forgets some of the interesting structure
that comes from laziness (though that is undoubtedly also
very interesting).

-Yitz


Re: [Haskell-cafe] Basic question concerning the category Hask (was: concerning data constructors)

2008-01-06 Thread Yitzchak Gale
I wrote:
 ...it was recently claimed on this list that tuples
 are not products in that category.

Derek Elkins wrote:
 Jonathan has given such a demonstration (and it has been demonstrated
 many times on this list since its creation; it's well-known).

We're still working on it. I've not been convinced yet.
Sorry about my thickness.

Perhaps this should be on a nice wiki page somewhere.
I tried to convince David House to put it in the Wikibook
chapter, but he is right, that needs to be simpler. There is
some discussion on the talk page for that
chapter, but no one spells out the details there, either.

 You are right not to be impressed by such complaints, but you
 misrepresent people's views on this by saying that they worry about
 such problems.

Sorry, I hope I am not misrepresenting anyone. I just
notice that people make assumptions about a so-called
category Hask, derive various conclusions, then mention
that they are not really true. Perhaps it is only me who is
worried.

 As you say (people say), these properties [that Hask is cartesian closed
 to start] would be nice to have and are very convenient to assume which
 is often safe enough.

I'd like to understand better what is true, so that I can understand
what is safe.

  Certainly computer scientists of a categorical
 bent have developed (weaker) notions to use; namely, monoidal,
 pre-monoidal, Freyd, and/or kappa categories and no doubt others.  Using
 these, however, removes some of the allure of using a categorical
 approach.

It would be nice to distill out of that the basics that are needed
to get the properties that we need for day-to-day work in
Haskell.

  Also, there is a Haskell-specific problem at the very get-go.
 The most obvious choice for the categorical composition operator
 assuming the obvious choice for the arrows and objects does not work...
 ...This can
 easily be fixed by making the categorical (.) strict in both arguments
 and there is no formal problem with it being different from Haskell's
 (.), but it certainly is not intuitively appealing.

I'm not sure it's so bad. First of all, not only is it not a formal
problem, it's also not really a practical problem - there will rarely
if ever be any difference between the two when applied in real
programs.

My opinion is that there would not be any problem with using
that version as (.) in the Prelude. Even if we never do, at least
we should then use (.!) when we state the so-called Monad laws.
It bothers me that Haskell's so-called monads really aren't.
That is bound to cause problems.

And it would be so easy to fix in most cases - just require monad
bind to be strict in its second argument.
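Here is a small sketch of the kind of failure at stake, using the reader
monad (the example and the lhs/rhs names are illustrative, not from the
thread):

    -- Left identity says  return a >>= f  should equal  f a.  In the
    -- reader monad ((->) r), the left side is always a defined closure,
    -- even when f is undefined, so seq can tell the two apart.
    f :: Int -> Int -> Int     -- an arrow a -> m b, with m = (->) Int
    f = undefined

    lhs, rhs :: Int -> Int
    lhs = return 0 >>= f       -- \r -> f 0 r, a defined closure
    rhs = f 0                  -- undefined

    main :: IO ()
    main = do
      print (seq lhs ())       -- prints ()
      print (seq rhs ())       -- throws Prelude.undefined

A bind that is strict in its second argument makes lhs bottom as well,
restoring the law.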

Thanks,
Yitz


Re: [Haskell-cafe] Basic question concerning the category Hask (was: concerning data constructors)

2008-01-06 Thread Miguel Mitrofanov

Take the ordinary disjoint union, and then add a new _|_ element,
distinct from both existing copies of _|_ (which are still distinct
from each other).


Now why is that not the category-theoretic coproduct?
h . Left = f and h . Right = g both for _|_ and for finite
elements of the types. And it looks universal to me.


Yeah, but there could be more functions from Either X Y to Z than  
pairs of functions from X to Z and from Y to Z.


For example, if z :: Z, then you have two functions h1 and h2 such  
that h1 . Left = const z and h1 . Right = const z and the same holds  
for h2. Namely,


h1 = const z

h2 = either (const z) (const z)

These functions are different: h1 (_|_) = z, while h2 (_|_) = (_|_).
And if Either X Y were a category-theoretic coproduct, then the
function from Either X Y to Z would be UNIQUELY determined by its
restrictions to X and Y.
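As runnable code (the Int instantiation is just for concreteness):

    h1, h2 :: Either x y -> Int
    h1 = const 0
    h2 = either (const 0) (const 0)

    -- Both satisfy  h . Left = const 0  and  h . Right = const 0, but:
    --   h1 undefined = 0           -- const never inspects its argument
    --   h2 undefined = undefined   -- either pattern-matches on it first
    -- so the arrow out of Either x y is not uniquely determined.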



Re: [Haskell-cafe] Basic question concerning the category Hask (was: concerning data constructors)

2008-01-06 Thread Jonathan Cast

On 6 Jan 2008, at 3:55 AM, Yitzchak Gale wrote:


I wrote:

What goes wrong with finite coproducts? The obvious thing to
do would be to take the disjoint union of the sets representing the
types, identifying the copies of _|_.


Jonathan Cast wrote:

This isn't a coproduct.  If we have f x = 1 and g y = 2, then there
should exist a function h such that h . Left = f and h . Right = g...
But by your rule, Left undefined = Right undefined...
Which is a contradiction...


OK, thanks.


Unless, of course,
you require your functions to be strict --- then both f and g above
become illegal, and repairing them removes the problem.


You don't have to make them illegal - just not part
of your notion of coproduct.


We're talking past each other --- what is the distinction you are  
making?


jcc



Re: [Haskell-cafe] Basic question concerning the category Hask (was: concerning data constructors)

2008-01-06 Thread Jonathan Cast

On 6 Jan 2008, at 5:32 AM, Yitzchak Gale wrote:


(sorry, I hit the send button)


What is the lifted version you are referring to?



Take the ordinary disjoint union, and then add a new _|_ element,
distinct from both existing copies of _|_ (which are still distinct
from each other).


Now why is that not the category-theoretic coproduct?
h . Left = f and h . Right = g both for _|_ and for finite
elements of the types. And it looks universal to me.


Not quite.  The only requirement for h _|_ is that it be <= f x for
all x and <= g y for all y.  If f x = 1 = g y for all x, y, then both
h _|_ = _|_ and h _|_ = 1 are arrows of the category.  So the  
universal property still fails.



 Of course, Haskell makes things even worse by lifting the
product and exponential objects,



OK, what goes wrong there, and what is the lifting?


Again, in Haskell, (_|_, _|_) /= _|_.  The difference is in the  
function

f (x, y) = 1 which gives f undefined = undefined but
f (undefined, undefined) = 1.


I don't get that one. Given any f and g, we define h x = (f x, g x).
Why do we not have fst . h = f and snd . h = g, both in Hask
and StrictHask? fst and snd are strict.


Again, if f and g are both strict, we have a choice for h _|_ ---  
either h _|_ = _|_ or h _|_ = (_|_, _|_) will work (fst . h = f and  
snd . h = g), but again these are different morphisms.
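Spelled out in Haskell (the concrete f and g are only illustrative):

    f, g :: Int -> Int
    f x = x `seq` 0             -- strict
    g x = x `seq` 1             -- strict

    h1, h2 :: Int -> (Int, Int)
    h1 x = (f x, g x)           -- h1 undefined = (undefined, undefined)
    h2 x = x `seq` (f x, g x)   -- h2 undefined = undefined

    -- fst . h1 = f = fst . h2  and  snd . h1 = g = snd . h2
    -- (fst and snd are strict), yet h1 and h2 differ at undefined,
    -- so the mediating morphism is not unique.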





 Unfortunately, this means that (alpha, beta) has an extra _|_
element < (_|_, _|_), so it's not the categorical product (which
would lack such an element for the same reason as above).


The reason you can't adjoin an extra element to (A,B) in, say,
Set, is that you would have nowhere to map it under fst
and snd. But here that is not a problem, _|_ goes to _|_
under both.

This is partly an implementation issue --- compiling pattern matching
without introducing such a lifting requires a parallel implementation


That's interesting. So given a future platform where parallelizing
is much cheaper than it is today, we could conceivably have
a totally lazy version of Haskell. I wonder what it would be like
to program in that environment, what new problems would arise
and what new techniques would solve them. Sounds like a nice
research topic. Who is working on it?

--- and partly a semantic issue --- data introduces a single level of
lifting, every time it is used, and every constructor is completely
lazy.


Unless you use bangs. So both options are available, and that
essentially is what defines Haskell as being a non-strict language.


(!alpha, !beta) isn't a categorical product, either.  snd (undefined,  
1) = undefined with this type.
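A sketch with an explicitly strict pair type (the names are only
illustrative):

    data SP a b = SP !a !b      -- both components strict

    ssnd :: SP a b -> b
    ssnd (SP _ y) = y

    -- Now SP undefined undefined = undefined, as a product would need,
    -- but ssnd (SP undefined 1) = undefined even though the second
    -- component is 1, so the projection law breaks instead.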




Functions have the same issue --- in the presence of seq,
undefined /= const undefined.


Right. I am becoming increasingly convinced that the seq issue
is a red herring.


Care to give an explanation?




Extensionality is a key part of the definition of all of these
constructions.  The categorical rules are designed to require, in
concrete categories, that the range of the two injections into a
coproduct form a partition of the coproduct, the surjective pairing
law (fst x, snd x) = x holds, and the eta reduction law (\ x -> f x)
= f holds.  Haskell flouts all three; while some categories have few
enough morphisms to get away with this (at least some times), Hask is
not one of them.


That interpretation is not something that is essential in the notion
of category, only in certain specific examples of categories
that you know.


I understand category theory.  I also know that the definitions used  
are chosen to get Set `right', which means extensionality in that  
case, and are then extended uniformly across all categories.  This  
has the effect of requiring similar constructions to those in Set in  
other concrete categories.



Product and coproduct in any given category -
whether they exist, what they are like if they exist, and what
alternative constructions exist if they do not - reflect the nature
of the structure that the category is modeling.


I understand that.  I'm not sure you do.



I am interested in understanding the category Hask that represents
what Haskell really is like as a programming language.


Good luck.


Not
under some semantic mapping that includes non-computable
functions, and that forgets some of the interesting structure
that comes from laziness (though that is undoubtedly also
very interesting).


Bear in mind in your quest, that at the end of it you'll most likely
conclude, like everyone else, that good old equational reasoning is
sound for the programs you actually write at least 90% of the time
(with appropriate induction principles), and complete for at least
90% of what you want to write, and go back to using it exclusively
for real programming.


jcc




Re: [Haskell-cafe] Basic question concerning the category Hask (was: concerning data constructors)

2008-01-05 Thread Yitzchak Gale
Jonathan Cast wrote:
 The normal view taken by Haskellers is that the denotations of
 Haskell types are CPPOs.
  So:
 (1) Must be monotone
 (2) Must be continuous
 (Needn't be strict, even though that messes up the resulting
 category substantially).

I wrote:
 I'm not convinced that the category is all that messed up.

 Well, no coproducts (Haskell uses a lifted version of the coproduct
 from CPO).

What goes wrong with finite coproducts? The obvious thing to
do would be to take the disjoint union of the sets representing the
types, identifying the copies of _|_.

What is the lifted version you are referring to?

  Of course, Haskell makes things even worse by lifting the
 product and exponential objects,

OK, what goes wrong there, and what is the lifting?

Thanks,
Yitz


Re: [Haskell-cafe] Basic question concerning the category Hask (was: concerning data constructors)

2008-01-05 Thread Jonathan Cast

On 5 Jan 2008, at 6:03 PM, Yitzchak Gale wrote:


Jonathan Cast wrote:

The normal view taken by Haskellers is that the denotations of
Haskell types are CPPOs.
 So:
(1) Must be monotone
(2) Must be continuous
(Needn't be strict, even though that messes up the resulting
category substantially).


I wrote:

I'm not convinced that the category is all that messed up.



Well, no coproducts (Haskell uses a lifted version of the coproduct
from CPO).


What goes wrong with finite coproducts? The obvious thing to
do would be to take the disjoint union of the sets representing the
types, identifying the copies of _|_.


This isn't a coproduct.  If we have f x = 1 and g y = 2, then there  
should exist a function h such that h . Left = f and h . Right = g,  
i.e.,


  h (Left x)
= f x
= 1

and
  h (Right y)
= g y
= 2

But by your rule, Left undefined = Right undefined, so
  1
= h (Left undefined)
= h (Right undefined)
= 2

Which is a contradiction.  Identifying Left _|_ and Right _|_  
produces a pointed CPO, but it's not a coproduct.  Unless, of course,  
you require your functions to be strict --- then both f and g above
become illegal, and repairing them removes the problem.
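For instance, the repaired, strict versions might read (illustrative):

    f :: x -> Int
    f x = x `seq` 1

    g :: y -> Int
    g y = y `seq` 2

    -- Now f undefined = undefined = g undefined, so identifying
    -- Left undefined with Right undefined no longer forces 1 = 2,
    -- and h = either f g is a consistent mediating arrow.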



What is the lifted version you are referring to?


Take the ordinary disjoint union, and then add a new _|_ element,  
distinct from both existing copies of _|_ (which are still distinct  
from each other).





 Of course, Haskell makes things even worse by lifting the
product and exponential objects,


OK, what goes wrong there, and what is the lifting?


Again, in Haskell, (_|_, _|_) /= _|_.  The difference is in the function

f (x, y) = 1

which gives f undefined = undefined but f (undefined, undefined) =  
1.  Unfortunately, this means that (alpha, beta) has an extra _|_  
element < (_|_, _|_), so it's not the categorical product (which
would lack such an element for the same reason as above).
This is partly an implementation issue --- compiling pattern matching  
without introducing such a lifting requires a parallel implementation  
--- and partly a semantic issue --- data introduces a single level of  
lifting, every time it is used, and every constructor is completely  
lazy.
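The same example as a small runnable program (the main wrapper is added
for illustration):

    f :: (a, b) -> Int
    f (x, y) = 1

    main :: IO ()
    main = do
      print (f (undefined, undefined))  -- prints 1: components never demanded
      print (f undefined)               -- throws: the match forces the pair itself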


Functions have the same issue --- in the presence of seq, undefined /= const undefined.
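Concretely (types added for illustration):

    distinguish :: (Int -> Int) -> ()
    distinguish h = h `seq` ()

    -- distinguish (const undefined) = ()          -- a closure is already in WHNF
    -- distinguish undefined         = undefined   -- seq forces the bottom
    -- yet const undefined and undefined agree on every argument.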


Extensionality is a key part of the definition of all of these  
constructions.  The categorical rules are designed to require, in  
concrete categories, that the range of the two injections into a  
coproduct form a partition of the coproduct, the surjective pairing  
law (fst x, snd x) = x holds, and the eta reduction law (\ x -> f x)
= f holds.  Haskell flouts all three; while some categories have few
enough morphisms to get away with this (at least some times), Hask is  
not one of them.


jcc



Re: [Haskell-cafe] Basic question concerning the category Hask (was: concerning data constructors)

2008-01-03 Thread Yitzchak Gale
Hi Jonathan,

I wrote:
 So in what way are Set morphisms restricted from being
 Hask morphisms?

Jonathan Cast wrote:
 The normal view taken by Haskellers is that the denotations of
 Haskell types are CPPOs.

CPPO?

  So:

 (1) Must be monotone
 (2) Must be continuous

Could you please define what you mean by those terms
in this context?

 (Needn't be strict, even though that messes up the resulting category
 substantially).

I'm not convinced that the category is all that messed up.

Thanks,
Yitz


Re: [Haskell-cafe] Basic question concerning the category Hask (was: concerning data constructors)

2008-01-03 Thread Yitzchak Gale
Jonathan Cast wrote:
 The normal view taken by Haskellers is that the denotations of
 Haskell types are CPPOs.

I wrote:
 CPPO?

 (1) Must be monotone
 (2) Must be continuous

 Could you please define what you mean by those terms
 in this context?

Jens Blanck wrote:
 The extra P would stand for pointed (has a least element, bottom); this is
 common in some communities. To me though, a cpo (complete partial order) is
 closed under directed suprema, and the empty set is directed, so bottom is
 already required. The category of cpos is not cartesian closed. For
 denotational semantics I believe the subcategory of Scott domains is what
 is usually considered.

 Continuous functions on cpos are by definition monotone and they respect
 directed suprema.

Thanks!
Yitz


Re: [Haskell-cafe] Basic question concerning the category Hask (was: concerning data constructors)

2008-01-03 Thread Jonathan Cast

On 3 Jan 2008, at 3:40 AM, Jens Blanck wrote:


 The normal view taken by Haskellers is that the denotations of
 Haskell types are CPPOs.

CPPO?

  So:

 (1) Must be monotone
 (2) Must be continuous

Could you please define what you mean by those terms
in this context?

 (Needn't be strict, even though that messes up the resulting category
 substantially).

I'm not convinced that the category is all that messed up.


Well, no coproducts (Haskell uses a lifted version of the coproduct  
from CPO).  Of course, Haskell makes things even worse by lifting the  
product and exponential objects, as well, which come to think of it  
is unnecessary even in the category of CPPOs and not necessarily  
strict continuous functions.


The extra P would stand for pointed (has a least element, bottom),  
this is common in some communities. To me though, a cpo (complete  
partial order) is closed under directed suprema and the empty set is  
directed so bottom is already required.


Not so.  A set is directed iff every finite subset has an upper bound  
in the set; {} is finite, so it must have an upper bound in the set.   
So directed sets must be non-empty.  (So CPOs needn't be pointed).
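In symbols (standard definition, just spelling out the argument above):

    S \text{ directed} \iff \forall F \subseteq_{\mathrm{fin}} S,\ \exists u \in S,\ \forall x \in F,\ x \sqsubseteq u

Taking F to be the empty set leaves the condition \exists u \in S, so a
directed set must be non-empty.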


jcc



Re: [Haskell-cafe] Basic question concerning the category Hask (was: concerning data constructors)

2008-01-02 Thread Jonathan Cast

On 2 Jan 2008, at 5:49 AM, Yitzchak Gale wrote:


Hi Andrew,

Andrew Bromage wrote:
I still say it isn't a set in the same way that a group isn't a  
set.

Haskell data types have structure that is respected by Haskell
homomorphisms.  Sets don't.


Ah, that's certainly true. But what is that additional structure?

In categories that have a forgetful functor to Set, the interesting
part of their structure comes from the fact that their
morphisms are only a proper subset of the morphisms
in Set.

So in what way are Set morphisms restricted from being
Hask morphisms?


The normal view taken by Haskellers is that the denotations of  
Haskell types are CPPOs.  So:


(1) Must be monotone
(2) Must be continuous

(Needn't be strict, even though that messes up the resulting category  
substantially).


jcc
