On Mon 27 Sep, Kevin Hammond wrote:
> At 1:08 pm +0100 28/9/99, Adrian Hey wrote:
> >What worries me here is that there might be
> >some unjustifiable assumptions about the nature of the machine which is
> >performing those operations. In particular by talking about things
> >like 'reduction order' and 'order of pattern matching' aren't we
> >assuming a sequential machine.
> 
> Absolutely NOT (at least for pattern-matching, and note that Haskell is
> non-strict rather than normal-order!).

I think I'm beginning to understand the difference between non-strict
and lazy (normal order) semantics. But unless I've completely lost the
plot regarding Haskell semantics, I would say they still contain implicit
assumptions about the order of evaluation (unless current Haskell
implementations don't obey Haskell semantics).
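
A small example of the distinction as I see it: non-strictness is a
property of the meaning of a program, whereas laziness (normal order
reduction with sharing) is one particular strategy that delivers that
meaning..

  -- Under Haskell's non-strict semantics this must be 1, whatever
  -- reduction strategy the implementation actually uses.
  example :: Int
  example = fst (1, undefined)

A strict language would be entitled to evaluate the second component of
the pair first and fall over.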

> This  is the semantic distinction between sequentiality and serialism [*] --
> serialism simply defines a final ordering of operations, whereas
> sequentiality implies
> a total ordering.   It's entirely possible to have a parallel
> implementation of a language
> defining serial pattern matching [**], but in which the actual execution is
> parallel.

Yes, I think I understand. You mean the implementation is parallel for
efficiency reasons but it still behaves as if it were serial.

> Of course, depending on your perspective, a fully parallel semantics for
> e.g. pattern matching may be better,
> but it can make programs less intuitive or less compact.  For example, given:
> 
> f 0 n = 1
> f n 0 = 2
> f m n = 3
> 
> What is f 0 0?  

f 0 0 = 1

> What if I change the order of the rules?  

You get a different function.
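For example, with the first two equations swapped (call the result f'
just to keep it distinct)..

  f' n 0 = 2
  f' 0 n = 1
  f' m n = 3

we'd have f' 0 0 = 2 instead of 1.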

> Do you really want to have to write:
> 
> f m n | m == 0 && n /= 0 = 1
>       | m /= 0 && n == 0 = 2
>       | m /= 0 && n /= 0 = 3

No, I like the current rules for dealing with overlapping patterns just fine.
But there's a difference between the rules used to disambiguate overlapping
patterns in the 'syntactic sugar' (which are conceptually sequential,
serial or whatever) and the semantics of the resulting case expressions
(which really are serial).

By this I mean you could keep the current rules for disambiguation, but
do the resulting unambiguous pattern matching with something other than
case expressions.
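
For what it's worth, the f above desugars into (more or less) something
like this, where the case alternatives really are tried serially, top
to bottom..

  f = \x y -> case (x, y) of
                (0, n) -> 1
                (n, 0) -> 2
                (m, n) -> 3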

Current Haskell semantics specify that when reducing an expression of the form..
    case e of ..
e gets reduced, at least far enough to match the patterns. (The
definition of seq relies on this, if I remember correctly.)
So, for example, the transformation..
  (case e of
     True  -> e'
     False -> e')  =  e'
is not allowed by Haskell semantics.
So if e = _|_, then case e of .. = _|_.
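
For instance, these two definitions differ only by that transformation,
yet they denote different functions..

  g, h :: Bool -> Int
  g b = case b of
          True  -> 0
          False -> 0
  h b = 0

g is strict in its argument (g undefined = _|_), h isn't (h undefined = 0).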

So I don't understand how you can say 'Absolutely NOT'.
Is the behaviour of && unspecified in Haskell as it stands?

I think it's..
  True  && True  = True
  True  && False = False
  False && True  = False
  False && False = False
  _|_   && True  = _|_
  _|_   && False = _|_
  True  && _|_   = _|_
  False && _|_   = False
  _|_   && _|_   = _|_
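
That table is just what the ordinary serial Prelude definition gives,
since (&&) pattern matches on its left argument first..

  (&&) :: Bool -> Bool -> Bool
  True  && x = x
  False && _ = False

so the left argument is always forced, but the right one only sometimes.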

So (a && b) = (b && a) is invalid

whereas with what I called 'concurrent non-strict semantics' it should be..
  True  && True  = True
  True  && False = False
  False && True  = False
  False && False = False
  _|_   && True  = _|_
  _|_   && False = False  -- This is different
  True  && _|_   = _|_
  False && _|_   = False
  _|_   && _|_   = _|_

So (a && b) = (b && a) is valid
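
You can't write a symmetric && as an ordinary pure Haskell function, but
just to make the idea concrete, here's a rough sketch in IO using
Concurrent Haskell (pAnd is just a name I've made up; this is an
illustration of the semantics above, not a proposal)..

  import Control.Concurrent (forkIO)
  import Control.Concurrent.MVar (newEmptyMVar, takeMVar, tryPutMVar)
  import Control.Exception (evaluate)
  import Control.Monad (void)

  -- Evaluate both arguments in parallel and answer as soon as the
  -- result is determined, so that _|_ `pAnd` False = False as well as
  -- False `pAnd` _|_ = False.
  pAnd :: Bool -> Bool -> IO Bool
  pAnd x y = do
      result <- newEmptyMVar
      _ <- forkIO (evaluate x >>= answer result y)
      _ <- forkIO (evaluate y >>= answer result x)
      takeMVar result
    where
      -- A False answer is final on its own; a True answer defers to the
      -- other argument (which may, of course, never terminate).
      answer result _     False = void (tryPutMVar result False)
      answer result other True  = evaluate other >>= void . tryPutMVar result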

Regards
-- 
Adrian Hey



