Re: [Haskell-cafe] Re: DDC compiler and effects; better than Haskell?

2009-08-16 Thread Artem V. Andreev
"John A. De Goes"  writes:

> I forgot about links. In that case, consider:  getUniqueFilesInDirRecursive.
>
> Attacking irrelevant details in an argument is often called a  "strawman 
> attack". Such attacks are
> pointless because they do not  address the real substance of the issue. My 
> example is easily
> modified  to avoid the issues you raise.
>
> Consider the fact that many file-based operations _can and are  parallelized 
> manually by
> developers_. The challenge for next  generation language and effect system 
> designers is to figure
> out _how_  such operations can be automatically parallelized, given 
> sufficient  constraints,
> high-level constructs, and a powerful effect system.
>
> Saying, "I don't know exactly how it will look," is quite a bit  different 
> from saying "It can't
> be done." I claim the former.
>
> Regards,
I am sorry, but it's not about details; it's about the essence. My
point was that there are a lot of subtle issues when we're dealing
with (e.g.) a file system in a real-world manner.
I have no doubt that it is possible to develop a sound logical system
that would cover them and then encode it as part of the type system of
a language. That would probably lead to compile-time detection of a
wider class of errors. But the problem is that one (IMHO) cannot deal
with these subtleties in a generic manner. That is, there are things
that may be done in parallel, and there are things that may not -- and
it depends on the actual task you want to perform. So basically,
instead of manually parallelizing something, you'll (IMHO) end up
writing complex type annotations so that a compiler could parallelize
it on its own behalf.
As somebody who has a certain experience with formal methods, I doubt
that the latter will ever be simpler than the former.


> John A. De Goes
> N-Brain, Inc.
> The Evolution of Collaboration
>
> http://www.n-brain.net | 877-376-2724 x 101
>
> On Aug 16, 2009, at 12:38 AM, Artem V. Andreev wrote:
>
>> "John A. De Goes"  writes:
>>
>>> On Aug 15, 2009, at 6:36 AM, Jason Dusek wrote:
>>>> 2009/08/14 John A. De Goes :
>>>>> Hmmm, my point (perhaps I wasn't clear), is that different
>>>>> effects have different commutability properties. In the case
>>>>> of a file system, you can commute two sequential reads from
>>>>> two different files.
>>>>
>>>> I think this is a bad example -- it's not something that's
>>>> safe in general and discredits your idea. How would the
>>>> compiler even know that two files are not actually the same
>>>> file?
>>>
>>> I don't think the file system is the best example. However, I do
>>> think it's a reasonable one.
>>>
>>> Let's say the type of the function getFilesInDir is annotated in
>>> such a way as to tell the effect system that every file in the
>>> returned array is unique. Further, let's say the type of the
>>> function makeNewTempFile is annotated in such a way as to tell the
>>> effect system that the function will succeed in creating a new temp
>>> file with a name unique from any other existing file.
>> Sorry, but this example is ridiculous. While file *names* in this
>> case might be reasonably assumed to be unique, the *files* themselves
>> may not. Any modern filesystem does support file aliasing, and
>> usually several forms thereof. And what does the makeNewTempFile
>> function do? Does it create a new file like POSIX mktemp() and return
>> its name, or does it rather behave as POSIX mkstemp()?
>> The first case is a well known security hole, and the second case
>> does not, as it seems to me, fit well into the rest of your
>> reasoning.
>>
>> However, let's consider further file system tree traversal. In some
>> cases you might not care whether some of the directories you descend
>> into are actually the same directory, so your proposed optimization
>> would be `safe'. However, in other cases sequential traversal would
>> work, while a parallelized version would not, unless special
>> additional measures are taken. E.g. consider the case of a build
>> system. It traverses a source tree, finds source files and, if the
>> corresponding object files are non-existent or outdated, does
>> something to regenerate them. Now if you have a directory that's
>> actually a link to another directory, and you do sequential
>> traversal, everything is fine: you descend into the directory the
>> first time, build everything there, and when you descend into it the
>> second time, there's just nothing to do. If you do parallel
>> traversal, you may well end up in the situation where two threads
>> check simultaneously for an object file, discover it's outdated, and
>> run two build processes simultaneously, with the most likely effect
>> of a corrupted object file.
>>
>>
>>> Then if you write a recursive function that loops through all files
>>> in a directory, and for each file, it parses an

Re: [Haskell-cafe] Re: DDC compiler and effects; better than Haskell?

2009-08-16 Thread John A. De Goes


I chose this example specifically because parsing/compiling is not
IO-bound. Many build systems today achieve multi-core scaling by
parallelizing all the phases: parsing, semantic analysis, and
compilation.
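
For the purely functional phases this is already easy to express; here
is a minimal sketch using Control.Parallel.Strategies (the Ast and
parse stand-ins are hypothetical, not from any real compiler):

    import Control.Parallel.Strategies (parMap, rseq)

    -- hypothetical stand-ins for a compiler front end
    newtype Ast = Ast String

    parse :: String -> Ast   -- a pure parse phase
    parse = Ast

    -- parse all modules in parallel, safe precisely because parse is pure
    parseAll :: [String] -> [Ast]
    parseAll = parMap rseq parse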


Your question is a good one, and one we face already in
auto-parallelization of purely functional code: how do you know when
the cost of doing something in another thread is overwhelmed by the
benefit? I think JIT compilation provides the ultimate answer to these
types of questions, because you can make guesses, and if you get them
wrong, simply try again.


Regards,

John A. De Goes
N-Brain, Inc.
The Evolution of Collaboration

http://www.n-brain.net | 877-376-2724 x 101

On Aug 16, 2009, at 2:46 AM, Marcin Kosiba wrote:

> Hi,
> 	IMHO, provided with a flexible effect system, the decision on how
> to do read/write operations on files is a matter of libraries.
> 	But I'd like to ask another question: is the effect system you're
> discussing now really capable of providing significant performance
> improvements in case of file I/O? Even if we assume some consistency
> model and transform one correct program to another one -- how do you
> estimate efficiency without knowledge of physical media
> characteristics? I kinda see how this could be used to optimize
> different kinds of media access (reordering socket/file operations or
> even running some of those in parallel), but I don't see how we can
> benefit from reordering writes to the same media.
> 	Another thing is that not every kind of r/w operation requires the
> same consistency model -- like when I'm writing a backup for later
> use I only care about my writes being in the same order. I imagine
> that such an effect system could help write software for CC-NUMA
> architectures or shared-memory distributed systems.
> --
> Thanks!
> Marcin Kosiba


Re: [Haskell-cafe] Re: DDC compiler and effects; better than Haskell?

2009-08-16 Thread John A. De Goes


I forgot about links. In that case, consider:
getUniqueFilesInDirRecursive.

Attacking irrelevant details in an argument is often called a
"strawman attack". Such attacks are pointless because they do not
address the real substance of the issue. My example is easily
modified to avoid the issues you raise.

Consider the fact that many file-based operations _can and are
parallelized manually by developers_. The challenge for next
generation language and effect system designers is to figure out
_how_ such operations can be automatically parallelized, given
sufficient constraints, high-level constructs, and a powerful effect
system.

Saying, "I don't know exactly how it will look," is quite a bit
different from saying "It can't be done." I claim the former.


Regards,

John A. De Goes
N-Brain, Inc.
The Evolution of Collaboration

http://www.n-brain.net | 877-376-2724 x 101

On Aug 16, 2009, at 12:38 AM, Artem V. Andreev wrote:

> "John A. De Goes"  writes:
>
>> On Aug 15, 2009, at 6:36 AM, Jason Dusek wrote:
>>
>>> 2009/08/14 John A. De Goes :
>>>> Hmmm, my point (perhaps I wasn't clear), is that different
>>>> effects have different commutability properties. In the case
>>>> of a file system, you can commute two sequential reads from
>>>> two different files.
>>>
>>> I think this is a bad example -- it's not something that's
>>> safe in general and discredits your idea. How would the
>>> compiler even know that two files are not actually the same
>>> file?
>>
>> I don't think the file system is the best example. However, I do
>> think it's a reasonable one.
>>
>> Let's say the type of the function getFilesInDir is annotated in
>> such a way as to tell the effect system that every file in the
>> returned array is unique. Further, let's say the type of the
>> function makeNewTempFile is annotated in such a way as to tell the
>> effect system that the function will succeed in creating a new temp
>> file with a name unique from any other existing file.
>
> Sorry, but this example is ridiculous. While file *names* in this
> case might be reasonably assumed to be unique, the *files* themselves
> may not. Any modern filesystem does support file aliasing, and
> usually several forms thereof. And what does the makeNewTempFile
> function do? Does it create a new file like POSIX mktemp() and return
> its name, or does it rather behave as POSIX mkstemp()?
> The first case is a well known security hole, and the second case
> does not, as it seems to me, fit well into the rest of your
> reasoning.
>
> However, let's consider further file system tree traversal. In some
> cases you might not care whether some of the directories you descend
> into are actually the same directory, so your proposed optimization
> would be `safe'. However, in other cases sequential traversal would
> work, while a parallelized version would not, unless special
> additional measures are taken. E.g. consider the case of a build
> system. It traverses a source tree, finds source files and, if the
> corresponding object files are non-existent or outdated, does
> something to regenerate them. Now if you have a directory that's
> actually a link to another directory, and you do sequential
> traversal, everything is fine: you descend into the directory the
> first time, build everything there, and when you descend into it the
> second time, there's just nothing to do. If you do parallel
> traversal, you may well end up in the situation where two threads
> check simultaneously for an object file, discover it's outdated, and
> run two build processes simultaneously, with the most likely effect
> of a corrupted object file.
>
>> Then if you write a recursive function that loops through all files
>> in a directory, and for each file, it parses and compiles the file
>> into a new temp file, then a sufficiently sophisticated compiler
>> should be able to safely transform the recursion into parallel
>> parsing and compilation -- in a way that's provably correct,
>> assuming the original program was correct.
>>
>> The promise of a language with a purely functional part and a
>> powerful effect system for everything else is very great. And very
>> important in the massively concurrent world we are entering.
>
>>> Well, yes -- which sounds like, there are no guarantees
>>> in general. Something that works half the time leaves you with
>>> two responsibilities -- the old responsibility of the work you
>>> did when you didn't have it and the new responsibility of
>>> knowing when it applies and when it doesn't.
>>
>> In the other thread, I brought up the example of buffering reads.
>> Library authors make the decision to buffer for one reason: because
>> if some other program is messing with the data, you're screwed no
>> matter what.
>>
>> And yeah, "they might be screwing with the data in just the way you
>> need it to be screwed with," (Sebastian), in which case my advice is
>> use C and hope for the best. :-)
>>
>> Regards,
>>
>> John A. De Goes
>> N-Brain, Inc.
>> The Evolution of Collaboration
>>
>> http://www.n-brain.net | 877-376-2724 x 101



Re: [Haskell-cafe] Re: DDC compiler and effects; better than Haskell?

2009-08-16 Thread Marcin Kosiba
On Sunday 16 August 2009, Artem V. Andreev wrote:
> "John A. De Goes"  writes:
> > On Aug 15, 2009, at 6:36 AM, Jason Dusek wrote:
> >> 2009/08/14 John A. De Goes :
> >>> Hmmm, my point (perhaps I wasn't clear), is that different
> >>> effects have different commutability properties. In the case
> >>> of a file system, you can commute two sequential reads from
> >>> two different files.
> >>
> >>  I think this is a bad example -- it's not something that's
> >>  safe in general and discredits your idea. How would the
> >>  compiler even know that two files are not actually the same
> >>  file?
> >
> > I don't think the file system is the best example. However, I do think 
> > it's a reasonable one.
> >
> > Let's say the type of the function getFilesInDir is annotated in such  a
> > way as to tell the effect system that every file in the returned  array
> > is unique. Further, let's say the type of the function  makeNewTempFile
> > is annotated in such a way as to tell the effect  system that the
> > function will succeed in creating a new temp file with  a name unique
> > from any other existing file.
>
> Sorry, but this example is ridiculous. While file *names* in this case
> might be reasonably assumed to be unique, the *files* themselves may not.
> Any modern filesystem does support file aliasing, and usually several forms
> thereof. And what does makeNewTempFile function do? Does it create a new
> file like POSIX mktemp() and return its name, or does it rather behave as
> POSIX mkstemp()? The first case is a well known security hole, and the
> second case does not, as it seems to me, fit well into the rest of your
> reasoning.

Hi,
IMHO, provided with a flexible effect system, the decision on how to do 
read/write operations on files is a matter of libraries.
But I'd like to ask another question: is the effect system you're 
discussing now really capable of providing significant performance 
improvements in case of file I/O? Even if we assume some consistency 
model and transform one correct program to another one -- how do you 
estimate efficiency without knowledge of physical media 
characteristics? I kinda see how this could be used to optimize 
different kinds of media access (reordering socket/file operations or 
even running some of those in parallel), but I don't see how we can 
benefit from reordering writes to the same media.
Another thing is that not every kind of r/w operation requires the same 
consistency model -- like when I'm writing a backup for later use I only care 
about my writes being in the same order. I imagine that such an effect system 
could help write software for CC-NUMA architectures or shared-memory 
distributed systems.
-- 
Thanks!
Marcin Kosiba




[Haskell-cafe] Re: DDC compiler and effects; better than Haskell?

2009-08-15 Thread Artem V. Andreev
"John A. De Goes"  writes:

> On Aug 15, 2009, at 6:36 AM, Jason Dusek wrote:
>> 2009/08/14 John A. De Goes :
>>> Hmmm, my point (perhaps I wasn't clear), is that different
>>> effects have different commutability properties. In the case
>>> of a file system, you can commute two sequential reads from
>>> two different files.
>>
>>  I think this is a bad example -- it's not something that's
>>  safe in general and discredits your idea. How would the
>>  compiler even know that two files are not actually the same
>>  file?
>
> I don't think the file system is the best example. However, I do
> think it's a reasonable one.
>
> Let's say the type of the function getFilesInDir is annotated in such
> a way as to tell the effect system that every file in the returned
> array is unique. Further, let's say the type of the function
> makeNewTempFile is annotated in such a way as to tell the effect
> system that the function will succeed in creating a new temp file
> with a name unique from any other existing file.
Sorry, but this example is ridiculous. While file *names* in this case
might be reasonably assumed to be unique, the *files* themselves may
not. Any modern filesystem does support file aliasing, and usually
several forms thereof. And what does the makeNewTempFile function do?
Does it create a new file like POSIX mktemp() and return its name, or
does it rather behave as POSIX mkstemp()?
The first case is a well known security hole, and the second case does
not, as it seems to me, fit well into the rest of your reasoning.

However, let's consider further file system tree traversal. In some
cases you might not care whether some of the directories you descend
into are actually the same directory, so your proposed optimization
would be `safe'. However, in other cases sequential traversal would
work, while a parallelized version would not, unless special
additional measures are taken. E.g. consider the case of a build
system. It traverses a source tree, finds source files and, if the
corresponding object files are non-existent or outdated, does
something to regenerate them. Now if you have a directory that's
actually a link to another directory, and you do sequential traversal,
everything is fine: you descend into the directory the first time,
build everything there, and when you descend into it the second time,
there's just nothing to do. If you do parallel traversal, you may well
end up in the situation where two threads check simultaneously for an
object file, discover it's outdated, and run two build processes
simultaneously, with the most likely effect of a corrupted object
file.
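
To make the race concrete, here is a minimal sketch (the
rebuildIfOutdated helper and the paths are hypothetical) of the
unsynchronized check-then-build window that parallel traversal opens:

    import Control.Concurrent (forkIO)
    import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)
    import Control.Monad (unless)
    import System.Directory (doesFileExist)

    -- hypothetical check-then-build step of a build system
    rebuildIfOutdated :: FilePath -> FilePath -> IO ()
    rebuildIfOutdated src obj = do
        exists <- doesFileExist obj
        unless exists $
            -- window: a second thread can pass the check above
            -- before this write completes, so both threads build
            writeFile obj ("built from " ++ src)

    main :: IO ()
    main = do
        d1 <- newEmptyMVar
        d2 <- newEmptyMVar
        -- parallel traversal reaches the same object file twice,
        -- once via the directory and once via a link to it
        _ <- forkIO (rebuildIfOutdated "dir/a.c"  "dir/a.o" >> putMVar d1 ())
        _ <- forkIO (rebuildIfOutdated "link/a.c" "dir/a.o" >> putMVar d2 ())
        takeMVar d1
        takeMVar d2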


> Then if you write a recursive function that loops through all files
> in a directory, and for each file, it parses and compiles the file
> into a new temp file, then a sufficiently sophisticated compiler
> should be able to safely transform the recursion into parallel
> parsing and compilation -- in a way that's provably correct, assuming
> the original program was correct.
>
> The promise of a language with a purely functional part and a
> powerful effect system for everything else is very great. And very
> important in the massively concurrent world we are entering.
>
>>  Well, yes -- which sounds like, there are no guarantees
>>  in general. Something that works half the time leaves you with
>>  two responsibilities -- the old responsibility of the work you
>>  did when you didn't have it and the new responsibility of
>>  knowing when it applies and when it doesn't.
>
> In the other thread, I brought up the example of buffering reads.
> Library authors make the decision to buffer for one reason: because
> if some other program is messing with the data, you're screwed no
> matter what.
>
> And yeah, "they might be screwing with the data in just the way you
> need it to be screwed with," (Sebastian), in which case my advice is
> use C and hope for the best. :-)
>
> Regards,
>
> John A. De Goes
> N-Brain, Inc.
> The Evolution of Collaboration
>
> http://www.n-brain.net | 877-376-2724 x 101
>
>

-- 

S. Y. A(R). A.


Re: [Haskell-cafe] Re: DDC compiler and effects; better than Haskell?

2009-08-14 Thread Henning Thielemann


On Thu, 13 Aug 2009, Heinrich Apfelmus wrote:


> Russell O'Connor wrote:
>> Peter Verswyvelen wrote:
>>
>>> I kind of agree with the DDC authors here; in Haskell as soon as a
>>> function has a side effect, and you want to pass that function to a
>>> pure higher order function, you're stuck, you need to pick the monadic
>>> version of the higher order function, if it exists. So Haskell doesn't
>>> really solve the modularity problem, you need two versions of each
>>> higher order function really,
>>
>> Actually you need five versions: The pure version, the pre-order
>> traversal, the post-order traversal, the in-order traversal, and the
>> reverse in-order traversal.  And that is just looking at syntax.  If you
>> care about your semantics you could potentially have more (or less).
>
> Exactly! There is no unique choice for the order of effects when lifting
> a pure function to an effectful one.


Which proves the proposition (courtesy of Johannes Waldmann), again:
 If a problem is more difficult to solve in Haskell than in another 
language, this is not Haskell's fault, but it is because the tackled 
problem is difficult and the other language only hides the problem. :-)



Re: [Haskell-cafe] Re: DDC compiler and effects; better than Haskell?

2009-08-13 Thread roconnor

On Thu, 13 Aug 2009, rocon...@theorem.ca wrote:

> Actually you need five versions: The pure version, the pre-order
> traversal, the post-order traversal, the in-order traversal, and the
> reverse in-order traversal.  And that is just looking at syntax.  If
> you care about your semantics you could potentially have more (or
> less).


Minor technical correction.  The four syntactic traversals are:
pre-order, post-order, reverse pre-order, and reverse post-order.  The
in-order and reverse in-order are examples of other semantic
traversals specific to binary-tree-like structures.
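
Spelled out for a binary tree, the four syntactic orders look like
this (a sketch; the names are mine):

    data Tree a = Leaf | Node (Tree a) a (Tree a)

    -- pre-order: node, then left subtree, then right subtree
    mapPre :: Monad m => (a -> m b) -> Tree a -> m (Tree b)
    mapPre _ Leaf = return Leaf
    mapPre f (Node l x r) = do
        y  <- f x
        l' <- mapPre f l
        r' <- mapPre f r
        return (Node l' y r')

    -- post-order: left, right, then node
    mapPost :: Monad m => (a -> m b) -> Tree a -> m (Tree b)
    mapPost _ Leaf = return Leaf
    mapPost f (Node l x r) = do
        l' <- mapPost f l
        r' <- mapPost f r
        y  <- f x
        return (Node l' y r')

    -- reverse pre-order: node, then right, then left
    mapRevPre :: Monad m => (a -> m b) -> Tree a -> m (Tree b)
    mapRevPre _ Leaf = return Leaf
    mapRevPre f (Node l x r) = do
        y  <- f x
        r' <- mapRevPre f r
        l' <- mapRevPre f l
        return (Node l' y r')

    -- reverse post-order: right, left, then node
    mapRevPost :: Monad m => (a -> m b) -> Tree a -> m (Tree b)
    mapRevPost _ Leaf = return Leaf
    mapRevPost f (Node l x r) = do
        r' <- mapRevPost f r
        l' <- mapRevPost f l
        y  <- f x
        return (Node l' y r')

All four build the same result tree; only the order in which the
effects of f run differs.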


--
Russell O'Connor  
``All talk about `theft,''' the general counsel of the American Graphophone
Company wrote, ``is the merest claptrap, for there exists no property in
ideas musical, literary or artistic, except as defined by statute.''


RE: [Haskell-cafe] Re: DDC compiler and effects; better than Haskell?

2009-08-13 Thread Sittampalam, Ganesh
What would preOrder foldr/foldl mean? What about preOrder (reverse . map)
and preOrder (map . reverse)?

Another option would be for map to take a "strategy" as a parameter, sort of 
like Control.Parallel.Strategies.
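
A rough sketch of the idea (mapMWith and Order are hypothetical names,
not an existing API):

    data Order = LeftToRight | RightToLeft

    -- a monadic map that takes the effect order as a parameter
    mapMWith :: Monad m => Order -> (a -> m b) -> [a] -> m [b]
    mapMWith _ _ [] = return []
    mapMWith LeftToRight f (x:xs) = do
        y  <- f x
        ys <- mapMWith LeftToRight f xs
        return (y:ys)
    mapMWith RightToLeft f (x:xs) = do
        ys <- mapMWith RightToLeft f xs
        y  <- f x
        return (y:ys)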

Peter Verswyvelen wrote:
> Well, in DDC I believe the order is left to right.
> 
> But you guys are right, many orders exist.
> 
> On the other hand, a language might offer primitives to convert
> pure functions to effectful ones, in which you indicate the order
> you want.
> 
> e.g. preOrder map
> 
> No?
> 
> (anyway Oleg's reply seems to give a definite answer to this thread
> no? :-) 
> 
> On Thu, Aug 13, 2009 at 11:06 AM, Heinrich Apfelmus wrote:
>> Russell O'Connor wrote:
>>> Peter Verswyvelen wrote:
>>> 
 I kind of agree with the DDC authors here; in Haskell as soon as a
 function has a side effect, and you want to pass that function to a
 pure higher order function, you're stuck, you need to pick the
 monadic version of the higher order function, if it exists. So
 Haskell doesn't really solve the modularity problem, you need two
 versions of each higher order function really,
>>> 
>>> Actually you need five versions: The pure version, the pre-order
>>> traversal, the post-order traversal, the in-order traversal, and the
>>> reverse in-order traversal.  And that is just looking at syntax.  If
>>> you care about your semantics you could potentially have more (or
>>> less). 
>> 
>> Exactly! There is no unique choice for the order of effects when
>> lifting a pure function to an effectful one.
>> 
>> For instance, here two different versions of an effectful  map :
>> 
>>   mapM f []     = return []
>>   mapM f (x:xs) = do
>>       y  <- f x
>>       ys <- mapM f xs
>>       return (y:ys)
>> 
>>   mapM2 f []     = return []
>>   mapM2 f (x:xs) = do
>>       ys <- mapM2 f xs
>>       y  <- f x
>>       return (y:ys)
>> 
>> Which one will the DDC compiler choose, given
>> 
>>   map f []     = []
>>   map f (x:xs) = f x : map f xs
>> 
>> ? Whenever I write a pure higher order function, I'd also have to
>> document the order of effects.
>> 
>> 
>> Regards,
>> apfelmus
>> 
>> --
>> http://apfelmus.nfshost.com
>> 




Re: [Haskell-cafe] Re: DDC compiler and effects; better than Haskell?

2009-08-13 Thread Ben Lippmeier

Heinrich Apfelmus wrote:

>> Actually you need five versions: The pure version, the pre-order
>> traversal, the post-order traversal, the in-order traversal, and the
>> reverse in-order traversal.  And that is just looking at syntax.  If you
>> care about your semantics you could potentially have more (or less).
>
> Exactly! There is no unique choice for the order of effects when lifting
> a pure function to an effectful one.
>
> For instance, here are two different versions of an effectful  map :
>
>    mapM f [] = return []
>    mapM f (x:xs) = do
>        y  <- f x
>        ys <- mapM f xs
>        return (y:ys)
>
>    mapM2 f [] = return []
>    mapM2 f (x:xs) = do
>        ys <- mapM2 f xs
>        y  <- f x
>        return (y:ys)
>
> Which one will the DDC compiler choose, given
>
>    map f [] = []
>    map f (x:xs) = f x : map f xs

Disciple uses strict, left-to-right evaluation order by default. For
the above map function, if "f" has any effects they will be executed
in the same order as the list elements.


> ? Whenever I write a pure higher order function, I'd also have to
> document the order of effects.


If you write a straight-up higher-order function like map above,
then it's neither pure nor impure. Rather, it's polymorphic in the
effect of its argument function. When effect information is
added to the type of map it becomes:


map :: forall a b %r1 %r2 !e1 !e2
   .  (a -(!e1)> b) -> List %r1 a -(!e2)> List %r2 b
   :- !e2 = !{ !Read %r1; !e1 }


Which says the effect of evaluating map is to read the list and
do whatever the argument function does. If the argument function
is pure, and the input list is constant, then the application
of map is pure, otherwise not.

If you want to define an "always-pure" version of map, which
only accepts pure argument functions then you can give it the
signature:

pureMap :: (a -(!e1)> b) -> List %r1 a -> List %r2 b
   :- Pure !e1, Const %r1

.. and use the same definition as before.

Note that you don't have to specify the complete type in the
source language, only the bits you care about - the rest is
inferred.

Now if you try to pass pureMap an impure function, you get
an effect typing error.

Adding purity constraints allows you to write H.O functions
without committing to an evaluation order, so you can change
it later if desired.


Ben.
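
For comparison, the nearest thing in today's Haskell is a map that is
polymorphic in its Applicative, recovering the pure case with
Identity -- a rough analogue only, not a claim about Disciple's
semantics:

    import Control.Applicative ((<$>), (<*>))
    import Data.Functor.Identity (Identity(..))

    -- polymorphic in the "effect" f of its argument function
    mapA :: Applicative f => (a -> f b) -> [a] -> f [b]
    mapA _ []     = pure []
    mapA f (x:xs) = (:) <$> f x <*> mapA f xs

    -- pure use: instantiate f to Identity
    pureMap :: (a -> b) -> [a] -> [b]
    pureMap f = runIdentity . mapA (Identity . f)

    -- effectful use: instantiate f to IO
    printAll :: Show a => [a] -> IO [()]
    printAll = mapA print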




Re: [Haskell-cafe] Re: DDC compiler and effects; better than Haskell?

2009-08-13 Thread Peter Verswyvelen
Well, in DDC I believe the order is left to right.

But you guys are right, many orders exist.

On the other hand, a language might offer primitives to convert
pure functions to effectful ones, in which you indicate the order
you want.

e.g. preOrder map

No?

(anyway Oleg's reply seems to give a definite answer to this thread no? :-)

On Thu, Aug 13, 2009 at 11:06 AM, Heinrich Apfelmus wrote:
> Russell O'Connor wrote:
>> Peter Verswyvelen wrote:
>>
>>> I kind of agree with the DDC authors here; in Haskell as soon as a
>>> function has a side effect, and you want to pass that function to a
>>> pure higher order function, you're stuck, you need to pick the monadic
>>> version of the higher order function, if it exists. So Haskell doesn't
>>> really solve the modularity problem, you need two versions of each
>>> higher order function really,
>>
>> Actually you need five versions: The pure version, the pre-order
>> traversal, the post-order traversal, the in-order traversal, and the
>> reverse in-order traversal.  And that is just looking at syntax.  If you
>> care about your semantics you could potentially have more (or less).
>
> Exactly! There is no unique choice for the order of effects when lifting
> a pure function to an effectful one.
>
> For instance, here are two different versions of an effectful  map :
>
>   mapM f []     = return []
>   mapM f (x:xs) = do
>       y  <- f x
>       ys <- mapM f xs
>       return (y:ys)
>
>   mapM2 f []     = return []
>   mapM2 f (x:xs) = do
>       ys <- mapM2 f xs
>       y  <- f x
>       return (y:ys)
>
> Which one will the DDC compiler choose, given
>
>   map f []     = []
>   map f (x:xs) = f x : map f xs
>
> ? Whenever I write a pure higher order function, I'd also have to
> document the order of effects.
>
>
> Regards,
> apfelmus
>
> --
> http://apfelmus.nfshost.com
>


[Haskell-cafe] Re: DDC compiler and effects; better than Haskell?

2009-08-13 Thread Heinrich Apfelmus
Russell O'Connor wrote:
> Peter Verswyvelen wrote:
> 
>> I kind of agree with the DDC authors here; in Haskell as soon as a
>> function has a side effect, and you want to pass that function to a
>> pure higher order function, you're stuck, you need to pick the monadic
>> version of the higher order function, if it exists. So Haskell doesn't
>> really solve the modularity problem, you need two versions of each
>> higher order function really,
> 
> Actually you need five versions: The pure version, the pre-order
> traversal, the post-order traversal, the in-order traversal, and the
> reverse in-order traversal.  And that is just looking at syntax.  If you
> care about your semantics you could potentially have more (or less).

Exactly! There is no unique choice for the order of effects when lifting
a pure function to an effectful one.

For instance, here are two different versions of an effectful  map :

   mapM f [] = return []
   mapM f (x:xs) = do
   y  <- f x
   ys <- mapM f xs
   return (y:ys)

   mapM2 f [] = return []
   mapM2 f (x:xs) = do
   ys <- mapM2 f xs
   y  <- f x
   return (y:ys)

Which one will the DDC compiler choose, given

   map f [] = []
   map f (x:xs) = f x : map f xs

? Whenever I write a pure higher order function, I'd also have to
document the order of effects.


Regards,
apfelmus

--
http://apfelmus.nfshost.com



[Haskell-cafe] Re: DDC compiler and effects; better than Haskell? (was Re: unsafeDestructiveAssign?)

2009-08-13 Thread Heinrich Apfelmus
John A. De Goes wrote:
> So what, because effect systems might not eliminate *all* boilerplate,
> you'd rather use boilerplate 100% of the time? :-)

The thing is that you still need  mapM  and friends for all those
effects (like non-determinism) that are not baked into the language.


Regards,
apfelmus

--
http://apfelmus.nfshost.com



[Haskell-cafe] Re: DDC compiler and effects; better than Haskell?

2009-08-12 Thread roconnor

On Wed, 12 Aug 2009, Peter Verswyvelen wrote:


> I kind of agree with the DDC authors here; in Haskell as soon as a
> function has a side effect, and you want to pass that function to a
> pure higher order function, you're stuck, you need to pick the monadic
> version of the higher order function, if it exists. So Haskell doesn't
> really solve the modularity problem, you need two versions of each
> higher order function really,


Actually you need five versions: The pure version, the pre-order 
traversal, the post-order traversal, the in-order traversal, and the 
reverse in-order traversal.  And that is just looking at syntax.  If you 
care about your semantics you could potentially have more (or less).


--
Russell O'Connor  
``All talk about `theft,''' the general counsel of the American Graphophone
Company wrote, ``is the merest claptrap, for there exists no property in
ideas musical, literary or artistic, except as defined by statute.''


Re: [Haskell-cafe] RE: DDC compiler and effects;better than Haskell?

2009-08-12 Thread Peter Verswyvelen
Yes, but HaRe *is* extremely clever :-)

I just wish I could use it, but since it doesn't support many GHC
extensions, I haven't.

On Wed, Aug 12, 2009 at 11:26 AM, Bayley, Alistair wrote:
>> From: haskell-cafe-boun...@haskell.org
>> [mailto:haskell-cafe-boun...@haskell.org] On Behalf Of Pavel Perikov
>>
>> > unless you have some insanely
>> > clever refactoring tool ready that can convert pure into monadic
>> > functions and vice versa.
>>
>> Not trying to attack the idea, just some thoughts:
>>
>> I don't see much problem converting a pure function into an
>> "effectful" form :) Having pure function
>> myPureFunction::a1->a2->a3->res
>> 
>> myPureFunction a1 a2 a3 -- pure call
>> myPureFunction <$> a1 <*> a2 <*> a3 -- call using Applicative
>>
>> Probably, introducing some conventions and using some language
>> extensions you can write a function taking pure function and
>> converting it into monadic version (a-la SPJ' work on polyvariadic
>> composition etc [1])
>>
>> Have monadic function but needs to call it from pure code? use
>> Control.Monad.Identity.
>>
>> Not a big deal to me but maybe I'm missing the point
>
>
> In the HaRe catalogue (
> http://www.cs.kent.ac.uk/projects/refactor-fp/catalogue/ ) there exists
> a "Monadification" refactoring:
>
> http://www.cs.kent.ac.uk/projects/refactor-fp/catalogue/Monadification1.html
>
> Alistair


[Haskell-cafe] RE: DDC compiler and effects;better than Haskell?

2009-08-12 Thread Bayley, Alistair
> From: haskell-cafe-boun...@haskell.org 
> [mailto:haskell-cafe-boun...@haskell.org] On Behalf Of Pavel Perikov
> 
> > unless you have some insanely
> > clever refactoring tool ready that can convert pure into monadic
> > functions and vice versa.
> 
> Not trying to attack the idea, just some thoughts:
> 
> I don't see much problem converting a pure function into an
> "effectful"
> form :) Having pure function
> myPureFunction::a1->a2->a3->res
> 
> myPureFunction a1 a2 a3 -- pure call
> myPureFunction <$> a1 <*> a2 <*> a3 -- call using Applicative
> 
> Probably, introducing some conventions and using some language  
> extensions you can write a function taking pure function and  
> converting it into monadic version (a-la SPJ' work on polyvariadic  
> composition etc [1])
> 
> Have monadic function but needs to call it from pure code? use  
> Control.Monad.Identity.
> 
> Not a big deal to me but maybe I'm missing the point
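
Pavel's sketch, made concrete (myPureFunction's body and the readLn
arguments are invented for illustration):

    import Control.Applicative ((<$>), (<*>))
    import Control.Monad.Identity (runIdentity)

    myPureFunction :: Int -> Int -> Int -> Int
    myPureFunction a1 a2 a3 = a1 + a2 * a3

    -- pure call
    r1 :: Int
    r1 = myPureFunction 1 2 3

    -- the same function lifted over IO actions via Applicative
    r2 :: IO Int
    r2 = myPureFunction <$> readLn <*> readLn <*> readLn

    -- monadic code run from a pure context via Identity
    r3 :: Int
    r3 = runIdentity (myPureFunction <$> return 1 <*> return 2 <*> return 3)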


In the HaRe catalogue (
http://www.cs.kent.ac.uk/projects/refactor-fp/catalogue/ ) there exists
a "Monadification" refactoring:
 
http://www.cs.kent.ac.uk/projects/refactor-fp/catalogue/Monadification1.html

Alistair