Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-16 Thread John A. De Goes

On Aug 15, 2009, at 5:32 PM, Sebastian Sylvan wrote:
On Sun, Aug 16, 2009 at 12:18 AM, John A. De Goes j...@n-brain.net  
wrote:
You must think I'm arguing for some kind of low-level analog of C,  
augmented with an effect system. I'm not. You can't do that.


No, I don't. I think you're arguing for making access to mutable  
state commutative. Are you not?


There are many cases when mutation to state _is_ commutative. I can't  
argue that certain operations are _always_ commutative without talking  
about the language.


Pretend I'm arguing for a mostly functional language and effect system  
that maximize the opportunities for parallelizing code.


 I'm not saying you shouldn't parallelise them in very specific  
circumstances *where it's safe*, I'm just saying that you shouldn't  
assume that it's safe unless you know it is. If you want to do a  
transformation that's unsafe in general, but safe in a specific  
circumstance, then of course, go ahead!
To my reading it seems like you're arguing that memory/file access  
should *always* be considered commutative though, which is what I'm  
objecting to.


In the right language, many types of memory (and possibly file)  
operations _always_ commute. In the wrong language, they _sometimes_  
commute or _never_ provably commute. I'm not arguing for the  
assumption in any language where it is false.
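
A trivial sketch of such a commuting pair, in Haskell using only the
base library (the refs and values are illustrative): writes to two
distinct IORefs give the same final state in either order, and it is
exactly the distinctness that an effect system would have to prove.

```haskell
import Data.IORef

-- Two writes to *distinct* references: either order yields (1, 2).
-- If a and b aliased the same cell, the two orders would disagree --
-- which is why the proof of distinctness carries all the weight.
runOrderA :: IO (Int, Int)
runOrderA = do
  a <- newIORef 0
  b <- newIORef 0
  writeIORef a 1          -- a first, then b
  writeIORef b 2
  (,) <$> readIORef a <*> readIORef b

runOrderB :: IO (Int, Int)
runOrderB = do
  a <- newIORef 0
  b <- newIORef 0
  writeIORef b 2          -- commuted: b first, then a
  writeIORef a 1
  (,) <$> readIORef a <*> readIORef b
```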


Regards,

John A. De Goes
N-Brain, Inc.
The Evolution of Collaboration

http://www.n-brain.net | 877-376-2724 x 101


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-16 Thread Sebastian Sylvan
On Sun, Aug 16, 2009 at 2:50 PM, John A. De Goes j...@n-brain.net wrote:

 On Aug 15, 2009, at 5:32 PM, Sebastian Sylvan wrote:

 On Sun, Aug 16, 2009 at 12:18 AM, John A. De Goes j...@n-brain.net
  wrote:

 You must think I'm arguing for some kind of low-level analog of C,
 augmented with an effect system. I'm not. You can't do that.


 No, I don't. I think you're arguing for making access to mutable state
 commutative. Are you not?


 There are many cases when mutation to state _is_ commutative. I can't argue
 that certain operations are _always_ commutative without talking about the
 language.

 Pretend I'm arguing for a mostly functional language and effect system that
 maximize the opportunities for parallelizing code.

  I'm not saying you shouldn't parallelise them in very specific
 circumstances *where it's safe*, I'm just saying that you shouldn't assume
 that it's safe unless you know it is. If you want to do a transformation
 that's unsafe in general, but safe in a specific circumstance, then of
 course, go ahead!
 To my reading it seems like you're arguing that memory/file access should
 *always* be considered commutative though, which is what I'm objecting to.


 In the right language, many types of memory (and possibly file) operations
 _always_ commute. In the wrong language, they _sometimes_ commute or _never_
 provably commute. I'm not arguing for the assumption in any language where
 it is false.


Well now I'm confused. Earlier you said:
In the case of a file system, you can commute two sequential reads from two
different files. This has no effect on the result of the computation,
assuming no interference from other programs -- and if there _is_
interference from other programs, then guarantees go out the window, _with
or without_ commuting.

It's the "assuming no interference from other programs" that bugs me,
because when you're reading outside data you kind of have to assume that
there *is* outside interference because you can't really protect yourself
against it. Furthermore, that interference is often more like
cooperation/communication so you're actually *counting* on outside
interference.
E.g. consider durable storage that has to survive random reboots etc., I'm
sure there are quite a few very carefully considered sequential steps that
need to happen in just the right order to get a consistent view of the data
when you read it back in.

That earlier quote seems to imply that you're arguing for just treating all
file reads as commutative and just ignoring that this is an unsafe
assumption. If you no longer think this then I guess we're in agreement.
My point is that *if* there is any chance for outside access to anything
you're reading, then ordering *does* matter. This is the case for file
reads, and may be the case for memory reads. For the latter the compiler
could *potentially* figure out when memory can't be touched by other threads
and make them commute, but I still think the semantics for mutable code
should be sequential (since it unifies the two scenarios), and then the
compiler might make them commutative in scenarios where it's guaranteed to
be safe.

For file reads, I don't think there's a way of knowing that two file reads
are independent, especially since this dependency might live *outside* the
program (e.g. the dependency might only exist for the human reading the
output of the program). So really, if there's any chance something else
might touch your data, the only reasonably safe way to deal with it is to
enforce sequentiality.

-- 
Sebastian Sylvan


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-16 Thread Richard O'Keefe

On Aug 15, 2009, at 2:55 PM, John A. De Goes wrote:
If you don't like the file system, consider mutable memory. An  
effect system will tell me I can safely update two pieces of  
non-overlapping, contiguous memory concurrently, even in different  
threads if the complexity so justifies it. The IO monad is a poor  
man's solution to the problem of effects.


In the presence of out-of-order memory writes, write buffers,
caches, &c., I'm not going to contradict you, but it certainly
isn't as _obvious_ as it used to be.

I've come across an experimental microprocessor where I _think_
it isn't safe if you have

   +--------+--------+
   | data 1 | data 2 |
   +--------+--------+
   <--- cache line --->

whether that's so on any other processor is beyond my expertise.



Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-16 Thread John A. De Goes


In the presence of _uncontrolled concurrency_, you are correct, but  
uncontrolled concurrency is a failed paradigm littered with defective  
software.


Regards,

John A. De Goes
N-Brain, Inc.
The Evolution of Collaboration

http://www.n-brain.net | 877-376-2724 x 101

On Aug 16, 2009, at 5:32 PM, Richard O'Keefe wrote:


On Aug 15, 2009, at 2:55 PM, John A. De Goes wrote:
If you don't like the file system, consider mutable memory. An  
effect system will tell me I can safely update two pieces of non- 
overlapping, contiguous memory concurrently, even in different  
threads if the complexity so justifies it. The IO monad is a poor  
man's solution to the problem of effects.


In the presence of out-of-order memory writes, write buffers,
caches, &c., I'm not going to contradict you, but it certainly
isn't as _obvious_ as it used to be.

I've come across an experimental microprocessor where I _think_
it isn't safe if you have

   +--------+--------+
   | data 1 | data 2 |
   +--------+--------+
   <--- cache line --->

whether that's so on any other processor is beyond my expertise.





Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-15 Thread Jason Dusek
2009/08/14 John A. De Goes j...@n-brain.net:
 Hmmm, my point (perhaps I wasn't clear), is that different
 effects have different commutability properties. In the case
 of a file system, you can commute two sequential reads from
 two different files.

  I think this is a bad example -- it's not something that's
  safe in general and discredits your idea. How would the
  compiler even know that two files are not actually the same
  file?

  However, the idea that a programmer can specify safely
  commuting effects is worthwhile. One could operate in a
  "different files are different" IO monad where the compiler
  assumes that reads on files with different names are
  commutable.
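
A toy rendering of that idea in Haskell (every name here is
hypothetical, and the wrapper records the contract rather than
enforcing it): an IO wrapper whose only effectful primitive is a read
keyed by file name, so a compiler adopting the assumption could
reorder reads on distinct names.

```haskell
-- Hypothetical sketch: a "different files are different" IO monad.
-- Nothing is proven here; the newtype merely marks code that has
-- opted into the reads-on-distinct-names-commute assumption.
newtype DistinctFilesIO a = DistinctFilesIO { runDistinctFilesIO :: IO a }

instance Functor DistinctFilesIO where
  fmap f (DistinctFilesIO m) = DistinctFilesIO (fmap f m)

instance Applicative DistinctFilesIO where
  pure = DistinctFilesIO . pure
  DistinctFilesIO f <*> DistinctFilesIO x = DistinctFilesIO (f <*> x)

instance Monad DistinctFilesIO where
  DistinctFilesIO m >>= k = DistinctFilesIO (m >>= runDistinctFilesIO . k)

-- The one effectful primitive: a read keyed by file name.
readDistinct :: FilePath -> DistinctFilesIO String
readDistinct = DistinctFilesIO . readFile
```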

 This has no effect on the result of the computation, assuming
 no interference from other programs -- and if there _is_
 interference from other programs, then guarantees go out the
 window, _with or without_ commuting.

  Well, yes -- which sounds like "there are no guarantees
  in general". Something that works half the time leaves you with
  two responsibilities -- the old responsibility of the work you
  did when you didn't have it and the new responsibility of
  knowing when it applies and when it doesn't.

--
Jason Dusek


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-15 Thread John A. De Goes

On Aug 14, 2009, at 9:07 PM, Sebastian Sylvan wrote:

That's a separate issue. The problem is that if you *do* depend on  
outside interference, then the sequence of operations matters.


Your example is highly contrived. Were I designing an effect system,  
I would not design for programs that require outside interference  
through a medium as uncontrolled as the file system, because (1) if  
there are applications requiring such measures, they are few and far  
between, and (2) you cannot make any guarantees about the correctness  
of programs depending on interference through an uncontrolled medium.


Effect system optimizations are about taking programs that are  
correct, and transforming them to faster but equivalent programs that  
are still correct.


That said, your reasoning precludes the use of file read buffering,  
and other similar operations that are routinely done. It's only an  
illusion that such programs are safe, with or without transformation  
of sequential read operations.


Regards,

John A. De Goes
N-Brain, Inc.
The Evolution of Collaboration

http://www.n-brain.net | 877-376-2724 x 101




Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-15 Thread John A. De Goes

On Aug 14, 2009, at 9:34 PM, Sebastian Sylvan wrote:

On Sat, Aug 15, 2009 at 3:55 AM, John A. De Goes j...@n-brain.net  
wrote:
If you don't like the file system, consider mutable memory. An  
effect system will tell me I can safely update two pieces of  
non-overlapping, contiguous memory concurrently, even in different  
threads if the complexity so justifies it.


I'd like to point out that this relaxation of sequencing for memory  
operations is already in effect in C on many CPUs. Even though you  
write things sequentially, it doesn't actually happen sequentially  
unless you explicitly say so with memory barriers. This causes  
massive head-aches and horrible bugs that are almost impossible to  
track down whenever you actually do depend on the order (usually in  
multi-threading scenarios, e.g. lockless data structures).


That's because C has no effect system and is too low-level for an  
effect system. That's no argument against one in a high-level language  
similar in syntax to Haskell.


The point is, the safer option is to enforce a sequential model  
(like Haskell does), since that way you can always rely on ordering  
even if you don't even realise you need to, and there's plenty of  
practical experience indicating that the other option (explicit  
barriers to indicate when something isn't commutative) is sheer  
madness.


Your point about safety in C has no relation to safety in a functional  
language with a sophisticated effect system. Haskell enforces a  
sequential model not because it's safer (it's NOT), but because it's  
simpler and because it's the best Haskell monads can do.


Unfortunately, it doesn't fly in a world with dozens of cores.

Regards,

John A. De Goes
N-Brain, Inc.
The Evolution of Collaboration

http://www.n-brain.net | 877-376-2724 x 101


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-15 Thread Sebastian Sylvan
On Sat, Aug 15, 2009 at 11:54 PM, John A. De Goes j...@n-brain.net wrote:

 On Aug 14, 2009, at 9:34 PM, Sebastian Sylvan wrote:

 On Sat, Aug 15, 2009 at 3:55 AM, John A. De Goes j...@n-brain.net wrote:

 If you don't like the file system, consider mutable memory. An effect
 system will tell me I can safely update two pieces of non-overlapping,
 contiguous memory concurrently, even in different threads if the complexity
 so justifies it.


 I'd like to point out that this relaxation of sequencing for memory
 operations is already in effect in C on many CPUs. Even though you write
 things sequentially, it doesn't actually happen sequentially unless you
 explicitly say so with memory barriers. This causes massive head-aches and
 horrible bugs that are almost impossible to track down whenever you actually
 do depend on the order (usually in multi-threading scenarios, e.g. lockless
 data structures).


 That's because C has no effect system and is too low-level for an effect
 system. That's no argument against one in a high-level language similar in
 syntax to Haskell.

...


 Your point about safety in C has no relation to safety in a functional
 language with a sophisticated effect system.


I'm sorry, but I think it does. You're advocating that modifications to
mutable state shouldn't have sequential semantics, I'm pointing out that
this is the case today in C on many CPUs and it's a royal pain to work with
in practice (causing many almost-impossible-to-debug crashes). I would not
want functional languages to adopt something that's proven to be
insanity-inducingly difficult to use.


-- 
Sebastian Sylvan


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-15 Thread John A. De Goes

On Aug 15, 2009, at 6:36 AM, Jason Dusek wrote:

2009/08/14 John A. De Goes j...@n-brain.net:

Hmmm, my point (perhaps I wasn't clear), is that different
effects have different commutability properties. In the case
of a file system, you can commute two sequential reads from
two different files.


 I think this is a bad example -- it's not something that's
 safe in general and discredits your idea. How would the
 compiler even know that two files are not actually the same
 file?


I don't think the file system is the best example. However, I do think  
it's a reasonable one.


Let's say the type of the function getFilesInDir is annotated in such  
a way as to tell the effect system that every file in the returned  
array is unique. Further, let's say the type of the function  
makeNewTempFile is annotated in such a way as to tell the effect  
system that the function will succeed in creating a new temp file with  
a name unique from any other existing file.


Then if you write a recursive function that loops through all files in  
a directory, and for each file, it parses and compiles the file into a  
new temp file, then a sufficiently sophisticated compiler should be  
able to safely transform the recursion into parallel parsing and  
compilation -- in a way that's provably correct, assuming the original  
program was correct.
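
That transformation can be written by hand today; the sketch below
(base plus the directory and filepath packages that ship with GHC)
does it with explicit forks. `compileOne` stands in for the
hypothetical parse-and-compile step, and the independence of the jobs
is a programmer's promise here -- precisely the fact John wants the
effect system to establish so the compiler can do the forking
automatically.

```haskell
import Control.Concurrent (forkIO)
import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)
import System.Directory (listDirectory)
import System.FilePath ((</>))

-- Run independent IO jobs concurrently and collect their results.
-- Error handling is elided: if a job throws, its MVar is never filled.
mapConcurrently' :: (a -> IO b) -> [a] -> IO [b]
mapConcurrently' f xs = do
  vars <- mapM (\x -> do v <- newEmptyMVar
                         _ <- forkIO (f x >>= putMVar v)
                         pure v)
               xs
  mapM takeMVar vars

-- Hypothetical per-file job: "compile" src into a fresh output file.
compileOne :: FilePath -> IO FilePath
compileOne src = do
  let out = src ++ ".out"          -- stands in for makeNewTempFile
  readFile src >>= writeFile out   -- real parsing/compilation elided
  pure out

-- Each job touches only its own input and a fresh output, so the
-- jobs' effects are disjoint and the loop may run in parallel.
compileDir :: FilePath -> IO [FilePath]
compileDir dir = do
  files <- map (dir </>) <$> listDirectory dir
  mapConcurrently' compileOne files
```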


The promise of a language with a purely functional part and a powerful  
effect system for everything else is very great. And very important in  
the massively concurrent world we are entering.



 Well, yes -- which sounds like "there are no guarantees
 in general". Something that works half the time leaves you with
 two responsibilities -- the old responsibility of the work you
 did when you didn't have it and the new responsibility of
 knowing when it applies and when it doesn't.


In the other thread, I brought up the example of buffering reads.  
Library authors make the decision to buffer for one reason: because if  
some other program is messing with the data, you're screwed no matter  
what.


And yeah, they might be screwing with the data in just the way you  
need it to be screwed with, (Sebastian), in which case my advice is  
use C and hope for the best. :-)


Regards,

John A. De Goes
N-Brain, Inc.
The Evolution of Collaboration

http://www.n-brain.net | 877-376-2724 x 101




Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-15 Thread Sebastian Sylvan
On Sat, Aug 15, 2009 at 11:45 PM, John A. De Goes j...@n-brain.net wrote:

 Effect system optimizations are about taking programs that are correct, and
 transforming them to faster but equivalent programs that are still correct.


And since reordering access to externally modifiable data (external includes
memory if it's visible to other threads) is *not* safe, that shouldn't be
done. You're arguing for transformations that are unsafe, i.e. that can turn
a functioning program into a non-functioning one!

That said, your reasoning precludes the use of file read buffering, and
 other similar operations that are routinely done. It's only an illusion that
 such programs are safe, with or without transformation of sequential read
 operations.


Yes, you do have to be very careful about abstractions like that, but the
fact that we have some of that now, which can cause very hard-to-catch bugs
when you rely on ordering, is no good argument that we should add even more
of it!


-- 
Sebastian Sylvan


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-15 Thread John A. De Goes

On Aug 15, 2009, at 4:59 PM, Sebastian Sylvan wrote:



Your point about safety in C has no relation to safety in a  
functional language with a sophisticated effect system.


I'm sorry, but I think it does. You're advocating that modifications  
to mutable state shouldn't have sequential semantics,


You must think I'm arguing for some kind of low-level analog of C,  
augmented with an effect system. I'm not. You can't do that.


To maximize the potential for optimization, you need a high-level  
language, mostly functional, with a very sophisticated and carefully  
controlled effects system for expressing imperative actions.


If you have threads and shared mutable state, then you might require  
that any access to shared state be done through an atomic block (STM).  
The type system would encode if mutable variables can be shared  
between threads, and if so, the compiler would mandate they be  
accessed from inside an atomic block.


In such conditions, multiple sequential writes can be safely  
parallelized, in addition to a host of other optimizations.
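
GHC's existing stm package (bundled with GHC) already gives the shape
of this discipline; a minimal sketch, with the account names invented
for illustration:

```haskell
import Control.Concurrent.STM

-- Shared state lives in TVars and is only touched inside `atomically`.
-- Two transfers over disjoint accounts commute, so the runtime may
-- execute them from different threads without changing the outcome.
transfer :: TVar Int -> TVar Int -> Int -> STM ()
transfer from to n = do
  modifyTVar' from (subtract n)
  modifyTVar' to   (+ n)
```

Transfers that share an account conflict and are serialized by the
transaction machinery; fully disjoint transfers are free to commute.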


I'm pointing out that this is the case today in C on many CPUs and  
it's a royal pain to work with in practice (causing many almost- 
impossible-to-debug crashes). I would not want functional languages  
to adopt something that's proven to be insanity-inducingly difficult  
to use.


Please don't ever bring up C again. You can't do anything interesting  
in C.


Regards,

John A. De Goes
N-Brain, Inc.
The Evolution of Collaboration

http://www.n-brain.net | 877-376-2724 x 101




Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-15 Thread Sebastian Sylvan
On Sun, Aug 16, 2009 at 12:18 AM, John A. De Goes j...@n-brain.net wrote:

 On Aug 15, 2009, at 4:59 PM, Sebastian Sylvan wrote:


 Your point about safety in C has no relation to safety in a functional
 language with a sophisticated effect system.

 I'm sorry, but I think it does. You're advocating that modifications to
 mutable state shouldn't have sequential semantics,


 You must think I'm arguing for some kind of low-level analog of C,
 augmented with an effect system. I'm not. You can't do that.


No, I don't. I think you're arguing for making access to mutable state
commutative. Are you not?



 In such conditions, multiple sequential writes can be safely parallelized,
 in addition to a host of other optimizations.


I'm not saying you shouldn't parallelise them in very specific circumstances
*where it's safe*, I'm just saying that you shouldn't assume that it's safe
unless you know it is. If you want to do a transformation that's unsafe in
general, but safe in a specific circumstance, then of course, go ahead!
To my reading it seems like you're arguing that memory/file access should
*always* be considered commutative though, which is what I'm objecting to.


  I'm pointing out that this is the case today in C on many CPUs and it's a
 royal pain to work with in practice (causing many almost-impossible-to-debug
 crashes). I would not want functional languages to adopt something that's
 proven to be insanity-inducingly difficult to use.


 Please don't ever bring up C again. You can't do anything interesting in C.


I'll bring up C until you can explain how what you're suggesting is any
different from the current state in C w.r.t. the ordering of memory access.
From what you've said so far I can't see how it is, and it would be
instructive to look at the problems with the approach you're advocating
since we're dealing with its pitfalls *today* in the real world.


-- 
Sebastian Sylvan


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-14 Thread John A. De Goes


Hmmm, my point (perhaps I wasn't clear), is that different effects  
have different commutability properties. In the case of a file system,  
you can commute two sequential reads from two different files. This  
has no effect on the result of the computation, assuming no  
interference from other programs -- and if there _is_ interference  
from other programs, then guarantees go out the window, _with or  
without_ commuting.


Monads are an insufficient structure to capture the fine semantics of  
effects. Something more powerful is needed. And in general, the way  
effects combine and commute is quite complicated and needs to be baked  
into the effect system, rather than being the responsibility of a  
lowly developer.


Regards,

John A. De Goes
N-Brain, Inc.
The Evolution of Collaboration

http://www.n-brain.net | 877-376-2724 x 101

On Aug 13, 2009, at 12:24 PM, Sebastian Sylvan wrote:




On Thu, Aug 13, 2009 at 2:19 PM, John A. De Goes j...@n-brain.net  
wrote:
What if you have another program, written in C or something, that  
monitors a file for changes, and if so changes the contents of  
another file? Surely to catch that you must mark *all* file system  
access as interefering? Even worse, another program could monitor  
the state of a file and conditionally disable the network driver,  
now file access interferes with network access.



A compiler or runtime system can't know about these kinds of things  
-- unless perhaps you push the effect system into the operating  
system (interesting idea). The best you can do is ensure the program  
itself is correct in the absence of interference from other programs


I think the best you can do is make sure any code which is  
vulnerable to such interference won't be subject to unsafe  
transformations (like changing the order of evaluation).
So I do think pretty much anything that relies on the outside world  
needs to go into one big effects category so the compiler/runtime  
will stay out and let the programmer explicitly define the  
ordering of those operations, precisely because the compiler has no  
way of knowing anything about what kind of assumptions are in effect.



--
Sebastian Sylvan




Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-14 Thread Jason Dagit
On Fri, Aug 14, 2009 at 1:41 PM, John A. De Goes j...@n-brain.net wrote:


 Hmmm, my point (perhaps I wasn't clear), is that different effects have
 different commutability properties. In the case of a file system, you can
 commute two sequential reads from two different files. This has no effect on
 the result of the computation, assuming no interference from other programs
 -- and if there _is_ interference from other programs, then guarantees go
 out the window, _with or without_ commuting.

 Monads are an insufficient structure to capture the fine semantics of
 effects. Something more powerful is needed. And in general, the way effects
 combine and commute is quite complicated and needs to be baked into the
 effect system, rather than being the responsibility of a lowly developer.


It's really interesting.  This is related to the reasoning darcs does with
patches (aka patch theory).  Patches tend to have effects on your
repository.  Sometimes those effects can be reordered without changing the
final state of the repository.  Other times, it is not possible to reorder
them without either having a non-sensible final state or different final
states.  I've never thought about reading research about effect systems for
the sake of version control.  I'll have to look into this.

Jason


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-14 Thread Peter Verswyvelen
I think version control is really just a subset of a larger effect
theory. E.g. I've been experimenting with a parallel undo/redo system
in C#, where some actions can commute and be undone separately, and
for detecting this, the actions need to explicitly expose what they
will change; so this also seems in the same line of research (and has
been reported earlier in the thread "darcs as undo/redo system").

And if I recall correctly, imperative compilers can reorder and
parallelize instructions based on what they read/write; again this
feels the same.
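
The dependence test those compilers use is small enough to state
directly; a sketch (the types here are invented for illustration):
two steps commute when neither writes anything the other reads or
writes, i.e. there is no read-after-write, write-after-read, or
write-after-write hazard between them.

```haskell
import Data.List (intersect)

-- A step exposes what it reads and writes, as Peter suggests actions
-- should. Locations are plain strings here for illustration.
data Step = Step { readSet :: [String], writeSet :: [String] }

-- Steps commute iff no RAW, WAR, or WAW hazard links them.
commutes :: Step -> Step -> Bool
commutes s t =
  null (writeSet s `intersect` readSet  t) &&
  null (writeSet t `intersect` readSet  s) &&
  null (writeSet s `intersect` writeSet t)
```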

Like John said, it will be interesting when the operating system
itself exposes all dependencies (what it reads/writes), so that if
e.g. content of file A is used to generate content of file B, that
this is not some spooky action at a distance. Now the OS is treated
like a big black IO box (realworld in - realworld out), just because
we're still stuck with dumb hierarchical file systems and other opaque
IO.

Another example might be FRP / Yampa, and the recent work of Hai
(Paul) Liu, Paul Hudak and co, where causal commutative arrows are
invented. AFRP computations really commute, while standard arrows are
just a generalization of monads, so not really suitable for capturing
the parallel nature of AFRP. The way I discovered this myself, is that
when you have e.g. a large tree of user interface widgets, represented
by a big arrow circuit, and the user edits just the one widget in one
branch (which happens when e.g. the mouse is captured), then with the
current arrows system all branches will be visited depth first. But of
course only the path towards the widget that will change needs to be
visited, all the other remain constant.

Since I don't have any academic backgrounds - only intuition - I'm not
sure if these topics are related, but they sure feel like it :-)

On Fri, Aug 14, 2009 at 11:38 PM, Jason Dagit da...@codersbase.com wrote:


 On Fri, Aug 14, 2009 at 1:41 PM, John A. De Goes j...@n-brain.net wrote:

 Hmmm, my point (perhaps I wasn't clear), is that different effects have
 different commutability properties. In the case of a file system, you can
 commute two sequential reads from two different files. This has no effect on
 the result of the computation, assuming no interference from other programs
 -- and if there _is_ interference from other programs, then guarantees go
 out the window, _with or without_ commuting.
 Monads are an insufficient structure to capture the fine semantics of
 effects. Something more powerful is needed. And in general, the way effects
 combine and commute is quite complicated and needs to be baked into the
 effect system, rather than being the responsibility of a lowly developer.

 It's really interesting.  This is related to the reasoning darcs does with
 patches (aka patch theory).  Patches tend to have effects on your
 repository.  Sometimes those effects can be reordered without changing the
 final state of the repository.  Other times, it is not possible to reorder
 them without either having a non-sensible final state or different final
 states.  I've never thought about reading research about effect systems for
 the sake of version control.  I'll have to look into this.

 Jason





Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-14 Thread Dan Weston

My intuition says the proper formalism is that undo is left adjoint to redo.

They together form a monad in the category of redoable actions. return 
lifts doable actions to undoable ones by attaching an empty undo stack. 
join lowers (reflects) a first-class undoable action out of the undo 
stack and makes it doable.


The reason that the adjunction view is more fundamental here is that the 
monad is in some sense the equivalence class of all observationally 
equivalent undo/redo pairs. That is, undo need not actually restore the 
previous state: it is sufficient to restore any action that, when 
redone, gives the same state as the one before undo.
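
Dan's observational-equivalence condition can be stated in a few lines
of Haskell (all names here are illustrative, not from any library): an
undo is acceptable whenever redoing after it reproduces the state
reached by the original redo, even if the undo is not a literal
snapshot restore.

```haskell
-- An action paired with an "undo"; validity is the contract below,
-- not literal state restoration.
data Undoable s = Undoable
  { redo :: s -> s
  , undo :: s -> s   -- contract: redo . undo . redo == redo (observationally)
  }

-- Here `undo` computes an inverse rather than restoring a snapshot,
-- yet the pair is valid because the contract holds.
increment :: Undoable Int
increment = Undoable { redo = (+ 1), undo = subtract 1 }

-- Check the contract at a sample state.
contractHoldsAt :: Eq s => Undoable s -> s -> Bool
contractHoldsAt a s = (redo a . undo a . redo a) s == redo a s
```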


There may be justification for this idea in Rossiter et al, Process as 
a World Transaction 
(http://computing.unn.ac.uk/staff/CGNR1/anpa064.pdf), though I haven't 
had time to read it yet.


Dan

Peter Verswyvelen wrote:

I think version control is really just a subset of a larger effect
theory. E.g. I've been experimenting with a parallel undo/redo system
in C#, where some actions can commute and be undone separately, and
for detecting this, the actions need to explicitly expose what they
will change; so this also seems in the same line of research (and has
been reported earlier in the thread "darcs as undo/redo system").

And if I recall correctly, imperative compilers can reorder and
parallelize instructions based on what they read/write; again this
feels the same.

Like John said, it will be interesting when the operating system
itself exposes all dependencies (what it reads/writes), so that if
e.g. content of file A is used to generate content of file B, that
this is not some spooky action at a distance. Now the OS is treated
like a big black IO box (realworld in - realworld out), just because
we're still stuck with dumb hierarchical file systems and other opaque
IO.

Another example might be FRP / Yampa, and the recent work of Hai
(Paul) Liu, Paul Hudak and co, where causal commutative arrows are
invented. AFRP computations really commute, while standard arrows are
just a generalization of monads, so not really suitable for capturing
the parallel nature of AFRP. The way I discovered this myself, is that
when you have e.g. a large tree of user interface widgets, represented
by a big arrow circuit, and the user edits just the one widget in one
branch (which happens when e.g. the mouse is captured), then with the
current arrows system all branches will be visited depth first. But of
course only the path towards the widget that will change needs to be
visited; all the others remain constant.

Since I don't have any academic backgrounds - only intuition - I'm not
sure if these topics are related, but they sure feel like it :-)

On Fri, Aug 14, 2009 at 11:38 PM, Jason Dagit da...@codersbase.com wrote:


On Fri, Aug 14, 2009 at 1:41 PM, John A. De Goes j...@n-brain.net wrote:

Hmmm, my point (perhaps I wasn't clear), is that different effects have
different commutability properties. In the case of a file system, you can
commute two sequential reads from two different files. This has no effect on
the result of the computation, assuming no interference from other programs
-- and if there _is_ interference from other programs, then guarantees go
out the window, _with or without_ commuting.
Monads are an insufficient structure to capture the fine semantics of
effects. Something more powerful is needed. And in general, the way effects
combine and commute is quite complicated and needs to be baked into the
effect system, rather than being the responsibility of a lowly developer.

It's really interesting.  This is related to the reasoning darcs does with
patches (aka patch theory).  Patches tend to have effects on your
repository.  Sometimes those effects can be reordered without changing the
final state of the repository.  Other times, it is not possible to reorder
them without either having a non-sensible final state or different final
states.  I've never thought about reading research about effect systems for
the sake of version control.  I'll have to look into this.
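A toy model of the patch commutation Jason mentions can be sketched in Haskell. This is a deliberately simplified stand-in, not darcs' actual patch theory: a patch sets one key in a repository map, and two patches commute exactly when they touch different keys.

```haskell
import qualified Data.Map as M

-- A patch that sets one key of the repository to a value.
data Patch = Set String Int deriving (Eq, Show)

-- Apply a patch to the repository state.
apply :: Patch -> M.Map String Int -> M.Map String Int
apply (Set k v) = M.insert k v

-- Two patches commute when they touch different keys: applying them in
-- either order yields the same repository.
commutes :: Patch -> Patch -> Bool
commutes (Set k1 _) (Set k2 _) = k1 /= k2
```

With `p = Set "a" 1` and `q = Set "b" 2`, both application orders agree; with `r = Set "a" 3`, the orders disagree and `commutes p r` is False.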

Jason

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe





Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-14 Thread Sebastian Sylvan
On Fri, Aug 14, 2009 at 9:41 PM, John A. De Goes j...@n-brain.net wrote:


 Hmmm, my point (perhaps I wasn't clear), is that different effects have
 different commutability properties. In the case of a file system, you can
 commute two sequential reads from two different files.


But you can't! I can easily envisage a scenario where there's a link between
two pieces of data in two different files, where it's okay if the data in
file A is newer (in a versioning sense, not a timestamp sense) than the
corresponding data in file B, but the opposite doesn't hold. So if you have
another program writing that data it will write first to A, and then to B.
The program reading this *must* then read the files in the correct order (B
then A, to ensure the data from A is always newer or at the same version
as the data in B).
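Sebastian's scenario can be made concrete with a small pure model (hypothetical names, and the writer's interleaving is simulated rather than concurrent): the writer bumps A's version and then B's, so a reader that samples B first and A later can never observe B newer than A, while the opposite order can.

```haskell
-- States the writer passes through, in order: it bumps A, then B.
writerStates :: [(Int, Int)]            -- (versionA, versionB)
writerStates = [(0, 0), (1, 0), (1, 1), (2, 1), (2, 2)]

-- Sample B at step i, then A at a later step j >= i: the safe order.
readBThenA :: Int -> Int -> (Int, Int)  -- observed (versionA, versionB)
readBThenA i j = (fst (writerStates !! j), snd (writerStates !! i))

-- Sample A first, then B at a later step: the unsafe order.
readAThenB :: Int -> Int -> (Int, Int)
readAThenB i j = (fst (writerStates !! i), snd (writerStates !! j))

-- The invariant the application relies on: A is at least as new as B.
okay :: (Int, Int) -> Bool
okay (a, b) = a >= b
```

Every (i, j) sampling with `readBThenA` satisfies the invariant; `readAThenB` can violate it (e.g. A at step 0, B at step 2 observes (0, 1)). That is exactly why the two reads do not commute.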

Anytime you talk to the outside world, there may be implicit ordering that
you need to respect, so I really think that needs to be serialized.
Of course, there may be things in the IO monad that doesn't talk to the
outside world that could be commutative.



-- 
Sebastian Sylvan


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-14 Thread John A. De Goes

On Aug 14, 2009, at 8:21 PM, Sebastian Sylvan wrote:

But you can't! I can easily envisage a scenario where there's a link  
between two pieces of data in two different files, where it's okay  
if the data in file A is newer (in a versioning sense, not a  
timestamp sense) than the corresponding data in file B, but the  
opposite doesn't hold. So if you have another program writing that  
data it will write first to A, and then to B. The program reading  
this *must* then read the files in the correct order (B then A, to  
ensure the data from A is always newer or at the same version as  
the data in B).


That's nonsense. Because what happens if your program reads A while  
the other program is writing to A, or reads B just after the other  
program has written to A, but before it has written to B?


As I said before, you cannot make any guarantees in the presence of  
interference, _with or without_ commuting. So any sensible effects  
system will assume no interference (perhaps with unsafeXXX functions  
for when you're OK with shooting yourself in the foot).


Anytime you talk to the outside world, there may be implicit  
ordering that you need to respect, so I really think that needs to  
be serialized.
Of course, there may be things in the IO monad that doesn't talk to  
the outside world that could be commutative.


If you don't like the file system, consider mutable memory. An effect  
system will tell me I can safely update two pieces of non-overlapping,  
contiguous memory concurrently, even in different threads if the  
complexity so justifies it. The IO monad is a poor man's solution to  
the problem of effects.
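John's non-overlapping-memory point can be illustrated even in today's Haskell, though here the safety is argued by hand rather than guaranteed by an effect system (the function name is hypothetical): two threads each update their own IORef, so there is no ordering constraint between them and the result is deterministic.

```haskell
import Control.Concurrent (forkIO, newEmptyMVar, putMVar, takeMVar)
import Control.Monad (forM_)
import Data.IORef

-- Update two non-overlapping pieces of mutable state from two threads.
-- Each IORef has a single writer, so no interleaving can change the
-- final result; an effect system could verify this disjointness.
updateBoth :: IORef Int -> IORef Int -> IO ()
updateBoth ra rb = do
  doneA <- newEmptyMVar
  doneB <- newEmptyMVar
  _ <- forkIO $ do
         forM_ [1 .. 1000 :: Int] $ \_ -> modifyIORef' ra (+ 1)
         putMVar doneA ()
  _ <- forkIO $ do
         forM_ [1 .. 1000 :: Int] $ \_ -> modifyIORef' rb (+ 2)
         putMVar doneB ()
  takeMVar doneA                -- wait for both workers to finish
  takeMVar doneB
```

The point of the sketch: because the writes never alias, running the two loops concurrently is observationally the same as running them in either sequential order.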


Regards,

John A. De Goes
N-Brain, Inc.
The Evolution of Collaboration

http://www.n-brain.net|877-376-2724 x 101




Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-14 Thread Sebastian Sylvan
On Sat, Aug 15, 2009 at 3:55 AM, John A. De Goes j...@n-brain.net wrote:

 On Aug 14, 2009, at 8:21 PM, Sebastian Sylvan wrote:

  But you can't! I can easily envisage a scenario where there's a link
 between two pieces of data in two different files, where it's okay if the
 data in file A is newer (in a versioning sense, not a timestamp sense)
 than the corresponding data in file B, but the opposite doesn't hold. So if
 you have another program writing that data it will write first to A, and
 then to B. The program reading this *must* then read the files in the
 correct order (B then A, to ensure the data from A is always newer or at the
 same version as the data in B).


 That's nonsense. Because what happens if your program reads A while the
 other program is writing to A


Depends on the file system. For example, the file could be locked so you
would just block. Or the file system might be transactional. I used files as
an example, the point wasn't to get bogged down in exact semantics of
concurrent access - assume all reads/writes are atomic (or can be viewed as
such from the app's POV) for the purposes of discussion.


 , or reads B just after the other program has written to A, but before it
 has written to B?


This is precisely the scenario that would be legal in that case. You'd end
up with data from A that's newer than B, which we said was okay, but for
some app-specific reason the opposite is *not* okay. In order for you not to
crash you *have* to make sure you read from B first, because otherwise you
could read from A right before it's updated, and then read B right after
both A and B have been updated which means B is now newer than A and your
program goes boom.

 The point is that the ordering of reads is not arbitrary.


 As I said before, you cannot make any guarantees in the presence of
 interference, _with or without_ commuting.


That's a separate issue. The problem is that if you *do* depend on outside
interference, then the sequence of operations matters.


-- 
Sebastian Sylvan


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-14 Thread Sebastian Sylvan
On Sat, Aug 15, 2009 at 3:55 AM, John A. De Goes j...@n-brain.net wrote:

 If you don't like the file system, consider mutable memory. An effect
 system will tell me I can safely update two pieces of non-overlapping,
 contiguous memory concurrently, even in different threads if the complexity
 so justifies it.


I'd like to point out that this relaxation of sequencing for memory
operations is already in effect in C on many CPUs. Even though you write
things sequentially, it doesn't actually happen sequentially unless you
explicitly say so with memory barriers. This causes massive head-aches and
horrible bugs that are almost impossible to track down whenever you actually
do depend on the order (usually in multi-threading scenarios, e.g. lockless
data structures).

The point is, the safer option is to enforce a sequential model (like
Haskell does), since that way you can always rely on ordering even if you
don't even realise you need to, and there's plenty of practical experience
indicating that the other option (explicit barriers to indicate when
something isn't commutative) is sheer madness.


-- 
Sebastian Sylvan


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-13 Thread Sebastian Sylvan
On Thu, Aug 13, 2009 at 4:56 AM, John A. De Goes j...@n-brain.net wrote:


 The next step is to distinguish between reading file A and reading file B,

between reading file A and writing file A, between reading one part of file
 A and writing another part of file A, etc. When the effect system can carry
 that kind of information, and not just for files, but network, memory, etc.,
 then you'll be able to do some extremely powerful parallelization &
 optimization.


 What if you have another program, written in C or something, that
monitors a file for changes, and if so changes the contents of another file?
Surely to catch that you must mark *all* file system access as
interfering? Even worse, another program could monitor the state of a
file and conditionally disable the network driver, now file access
interferes with network access.


-- 
Sebastian Sylvan


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-13 Thread Conor McBride

Hi Dan

On 12 Aug 2009, at 22:28, Dan Doel wrote:


On Wednesday 12 August 2009 10:12:14 am John A. De Goes wrote:

I think the point is that a functional language with a built-
in effect system that captures the nature of effects is pretty damn
cool and eliminates a lot of boilerplate.


It's definitely an interesting direction (possibly even the right  
one in the
long run), but it's not without its faults currently (unless things  
have

changed since I looked at it).


From what I've seen, I think we should applaud Ben for
kicking the door open here.

Is Haskell over-committed to monads? Does Haskell make
a sufficient distinction between notions of value and
notions of computation in its type system?

For instance: what effects does disciple support? Mutation and IO?  
What if I
want non-determinism, or continuations, etc.? How do I as a user add  
those
effects to the effect system, and specify how they should interact  
with the
other effects? As far as I know, there aren't yet any provisions for  
this, so
presumably you'll end up with effect system for effects supported by  
the

compiler, and monads for effects you're writing yourself.

By contrast, monad transformers (for one) let you do the above  
defining of new
effects, and specifying how they interact (although they're  
certainly neither

perfect, nor completely satisfying theoretically).

Someone will probably eventually create (or already has, and I don't  
know
about it) an extensible effect system that would put this objection  
to rest.

Until then, you're dealing in trade offs.


It's still very much on the drawing board, but I once
flew a kite called Frank which tried to do something
of the sort (http://cs.ioc.ee/efftt/mcbride-slides.pdf).

Frank distinguishes value types from computation
types very much in the manner of Paul Blain Levy's
call-by-push-value. You make a computation type from
a value type v by attaching a capabilty to it (a
possibly empty set of interfaces which must be
enabled) [i_1,..i_n]v. You make a value type from a
computation type c by suspending it {c}. Expressions
are typed with value components matched up in the usual
way and capabilities checked for inclusion in the ambient
capability. That is, you don't need idiom-brackets
because you're always in them: it's just a question
of which idiom, as tracked by type.

There's a construct to extend the ambient idiom by
providing a listener for an interface (subtly
disguised, a homomorphism from the free monad on the
interface to the outer notion of computation).
Listeners can transform the value type of the
computations they interpret: a listener might offer
the throw capability to a computation of type t,
and deliver a pure computation of type Maybe t. But
[Throw]t and []Maybe t are distinguished, unlike
in Haskell. Moreover {[]t} and t are distinct: the
former is lazy, the latter strict, but there is no
reason why we should ever evaluate a pure thunk
more than once, even if it is forced repeatedly.
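Conor's "listener" idea, a homomorphism from the free monad on an interface into another notion of computation, has a direct Haskell rendering. This is a hedged sketch, not Frank itself: the interface is a single Throw operation, and the handler delivers a pure Maybe, mirroring the [Throw]t versus []Maybe t distinction.

```haskell
-- The free monad on a signature functor f.
data Free f a = Pure a | Op (f (Free f a))

instance Functor f => Functor (Free f) where
  fmap g (Pure a) = Pure (g a)
  fmap g (Op f)   = Op (fmap (fmap g) f)

instance Functor f => Applicative (Free f) where
  pure = Pure
  Pure g <*> x = fmap g x
  Op f   <*> x = Op (fmap (<*> x) f)

instance Functor f => Monad (Free f) where
  Pure a >>= k = k a
  Op f   >>= k = Op (fmap (>>= k) f)

-- The Throw interface: one operation that aborts the computation.
data Throw k = Throw

instance Functor Throw where
  fmap _ Throw = Throw

throwOp :: Free Throw a
throwOp = Op Throw

-- A "listener": interpret the free monad on Throw into pure Maybe.
handleThrow :: Free Throw a -> Maybe a
handleThrow (Pure a)   = Just a
handleThrow (Op Throw) = Nothing
```

`handleThrow` is a monad homomorphism: it sends `Pure` to `Just` and maps the single effect to `Nothing`, so a computation that throws anywhere yields `Nothing`.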

I agree with Oleg's remarks, elsewhere in this thread:
there is a middle ground to be found between ML and
Haskell. We should search with open minds.

All the best

Conor



Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-13 Thread Jason Dusek
2009/08/12 John A. De Goes j...@n-brain.net:
 The next step is to distinguish between reading file A and
 reading file B, between reading file A and writing file A,
 between reading one part of file A and writing another part of
 file A, etc. When the effect system can carry that kind of
 information, and not just for files, but network, memory,
 etc., then you'll be able to do some extremely powerful
 parallelization & optimization.

  I am challenged to imagine optimizations that would be safe in
  the case of File I/O.

--
Jason Dusek


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-13 Thread Alberto G. Corona
Another issue in DDC, in the course of writing a procedure, is that the
programmer either has too much information (the list of effects of all the
called procedures, if they are explicit), or too little, if they are
generated and managed by the compiler. How does he know for sure that a
variable to be used in the next statement is pure, or whether it would be
updated by the previous function call? If the list of effects of a
procedure is hidden, or worse, contains a lot of information, isn't this a
problem?

In contrast, the division of the world into pure/impure operations is
relaxing. OK, after the @ operator I'm sure that everything is pure, but
things are not clear outside. At least in Haskell, through monads, we have
a clear signature about the effects we are dealing with, if IO can be
considered an effect.

Maybe, in Haskell, the coarse IO monad can be divided into smaller monads,
in the same but reverse way that DDC can handle the whole of IO as a
single effect (I guess)?
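Alberto's "smaller monads" idea can be sketched today with newtypes over IO whose exported operations admit only one kind of effect. All names here (MemIO, newRef, runMemIO) are hypothetical, and the guarantee is only as strong as the module's export list:

```haskell
{-# LANGUAGE GeneralizedNewtypeDeriving #-}
import Data.IORef

-- A monad carved out of IO that can only touch mutable memory, not
-- files or the network (enforced by exporting only these operations).
newtype MemIO a = MemIO (IO a)
  deriving (Functor, Applicative, Monad)

newRef :: a -> MemIO (IORef a)
newRef = MemIO . newIORef

readRef :: IORef a -> MemIO a
readRef = MemIO . readIORef

writeRef :: IORef a -> a -> MemIO ()
writeRef r = MemIO . writeIORef r

-- The only way back into full IO, kept at the program boundary.
runMemIO :: MemIO a -> IO a
runMemIO (MemIO m) = m
```

A function of type `MemIO a` advertises that it mutates memory at most; the cost, as the thread discusses, is that combining several such restricted monads in one piece of code gets awkward fast.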

2009/8/13 Jason Dusek jason.du...@gmail.com

 2009/08/12 John A. De Goes j...@n-brain.net:
  The next step is to distinguish between reading file A and
  reading file B, between reading file A and writing file A,
  between reading one part of file A and writing another part of
  file A, etc. When the effect system can carry that kind of
  information, and not just for files, but network, memory,
  etc., then you'll be able to do some extremely powerful
  parallelization & optimization.

   I am challenged to imagine optimizations that would be safe in
  the case of File I/O.

 --
 Jason Dusek


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-13 Thread John A. De Goes
What if you have another program, written in C or something, that  
monitors a file for changes, and if so changes the contents of  
another file? Surely to catch that you must mark *all* file system  
access as interfering? Even worse, another program could monitor  
the state of a file and conditionally disable the network driver,  
now file access interferes with network access.



A compiler or runtime system can't know about these kinds of things --  
unless perhaps you push the effect system into the operating system  
(interesting idea). The best you can do is ensure the program itself  
is correct in the absence of interference from other programs, but  
there's no way to obtain a guarantee in the presence of interference.  
Either with an effect system, or without (think of all the sequential  
imperative code that gets broken when other programs concurrently  
tamper with the file system or networking, etc.).


Regards,

John A. De Goes
N-Brain, Inc.
The Evolution of Collaboration

http://www.n-brain.net|877-376-2724 x 101

On Aug 13, 2009, at 2:42 AM, Sebastian Sylvan wrote:




On Thu, Aug 13, 2009 at 4:56 AM, John A. De Goes j...@n-brain.net  
wrote:


The next step is to distinguish between reading file A and reading  
file B,
between reading file A and writing file A, between reading one part  
of file A and writing another part of file A, etc. When the effect  
system can carry that kind of information, and not just for files,  
but network, memory, etc., then you'll be able to do some extremely  
powerful parallelization & optimization.


What if you have another program, written in C or something, that  
monitors a file for changes, and if so changes the contents of  
another file? Surely to catch that you must mark *all* file system  
access as interfering? Even worse, another program could monitor  
the state of a file and conditionally disable the network driver,  
now file access interferes with network access.



--
Sebastian Sylvan




Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-13 Thread John A. De Goes


Hmmm, bad example. Assume memory instead. That said, reordering/ 
parallelization of *certain combinations of* writes/reads to  
independent files under whole program analysis is no less safe than  
sequential writes/reads. It just feels less safe, but the one thing  
that will screw both up is interference from outside programs.


Regards,

John A. De Goes
N-Brain, Inc.
The Evolution of Collaboration

http://www.n-brain.net|877-376-2724 x 101

On Aug 13, 2009, at 3:45 AM, Jason Dusek wrote:


2009/08/12 John A. De Goes j...@n-brain.net:

The next step is to distinguish between reading file A and
reading file B, between reading file A and writing file A,
between reading one part of file A and writing another part of
file A, etc. When the effect system can carry that kind of
information, and not just for files, but network, memory,
etc., then you'll be able to do some extremely powerful
parallelization & optimization.


 I am challenged to imagine optimizations that would be safe in
 the case of File I/O.

--
Jason Dusek




Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-13 Thread John A. De Goes

On Aug 13, 2009, at 4:09 AM, Alberto G. Corona wrote:

Maybe, in Haskell, the coarse IO monad can be divided in smaller  
monads as well



I don't even want to imagine how that would obfuscate otherwise  
straightforward looking monadic code.


The root problem is that monads don't capture enough information on  
the nature of effects, which limits composability. What's needed is  
something richer, which gives the compiler enough information to do  
all those things that make life easy for the developer, whilst  
maximizing the performance of the application. DDC is an interesting  
experiment in that direction.


Regards,

John A. De Goes
N-Brain, Inc.
The Evolution of Collaboration

http://www.n-brain.net|877-376-2724 x 101




Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-13 Thread Gregory Michael Travis

John A. De Goes j...@n-brain.net sed:
 Hmmm, bad example. Assume memory instead. That said, reordering/
 parallelization of *certain combinations of* writes/reads to
 independent files under whole program analysis is no less safe than
 sequential writes/reads. It just feels less safe, but the one thing
 that will screw both up is interference from outside programs.

This reminds me of something I've heard of: the commutative monad.  I
might have seen the phrase in some presentation by Simon Peyton-Jones.

I took it to mean some sort of monad that sequenced computations, but
was also capable of reordering them according to rules specified by
the implementer of the monad.  Thus, you could re-order operations on
separate files, because they can't possibly interfere with each other.

This strikes me as an important idea, something that would allow you
to incrementally optimize while keeping safety.

For example, let's say you implement a complex distributed system,
using a single distributed state monad to manage global state.  This
would be very slow, since you would no doubt have to have a central
node controlling the order of all side-effects.  But this node could
re-group and re-distribute the side-effects.  This would be done by
using the type system to carefully separate effects that couldn't
interfere.

I'm not expressing this as well as I'd like.  From the programmer's
perspective, though, it could have a nice property: you start with a
slow, but trivial state monad.  Then you add rules and clauses that
indicate when two effects can have their order switched; all the
while, the meaning of the program is unchanged.

This is not unlike nodes in a distributed database that have to receive
updates from other nodes and figure out how to apply all the updates
in a coherent order, or throw an exception if it cannot.
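The commutative monad Gregory mentions has a classic small instance: Writer over a commutative monoid. This sketch (a hand-rolled Writer, so the names shadow nothing from a library) shows that swapping two `tell`s never changes the result, which is exactly the reordering freedom he describes.

```haskell
import Data.Monoid (Sum (..))

-- A minimal Writer monad: a result paired with accumulated output.
newtype Writer w a = Writer { runWriter :: (a, w) }

instance Functor (Writer w) where
  fmap f (Writer (a, w)) = Writer (f a, w)

instance Monoid w => Applicative (Writer w) where
  pure a = Writer (a, mempty)
  Writer (f, w1) <*> Writer (a, w2) = Writer (f a, w1 <> w2)

instance Monoid w => Monad (Writer w) where
  Writer (a, w1) >>= k = let Writer (b, w2) = k a in Writer (b, w1 <> w2)

tell :: w -> Writer w ()
tell w = Writer ((), w)
```

When `w` is commutative (like `Sum Int`), `tell x >> tell y` equals `tell y >> tell x`, so a compiler or runtime would be free to reorder them; with a non-commutative monoid like `String`, the same monad is order-sensitive.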


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-13 Thread Sebastian Sylvan
On Thu, Aug 13, 2009 at 2:19 PM, John A. De Goes j...@n-brain.net wrote:

 What if you have another program, written in C or something, that
 monitors a file for changes, and if so changes the contents of another file?
 Surely to catch that you must mark *all* file system access as
 interefering? Even worse, another program could monitor the state of a
 file and conditionally disable thet network driver, now file access
 interferes with network access.


 A compiler or runtime system can't know about these kinds of things --
 unless perhaps you push the effect system into the operating system
 (interesting idea). The best you can do is ensure the program itself is
 correct in the absence of interference from other programs


I think the best you can do is make sure any code which is vulnerable to
such interference won't be subject to unsafe transformations (like changing
the order of evaluation).
So I do think pretty much anything that relies on the outside world needs to
go into one big effects category so the compiler/runtime will stay out
and let the programmer explicitly define the ordering of those operations,
precisely because the compiler has no way of knowing anything about what
kind of assumptions are in effect.


-- 
Sebastian Sylvan


DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Peter Verswyvelen
I knew about DDC for a while but never bothered to look at it deeply
since it was so young, and at first sight it just did what ML did
(silently allowing side effects).

But when looking again at it, it seems it actually does some very
clever things to allow side effects and still be safe, a bit what one
does explicitly with Haskell's ST and IO monads, but in DDC, the
compiler seems to analyze where side effects happen and automatically
annotates/enrich the types (with things like region, closure and
effect information) to indicate this. The example given by DDC is that
in Haskell, you need two versions of each higher order function if you
really want your library to be reuseable, e.g. you need map and mapM,
but in DDC you just need one function, map, which also is written in a
pure style. See e.g.
http://www.haskell.org/haskellwiki/DDC/EffectSystem

I kind of agree with the DDC authors here; in Haskell as soon as a
function has a side effect, and you want to pass that function to a
pure higher order function, you're stuck, you need to pick the monadic
version of the higher order function, if it exists. So Haskell doesn't
really solve the modularity problem, you need two versions of each
higher order function really, and you need to carefully plan ahead of
time if you want to write in a monadic or pure style, and it also
splits my brain mentally (side effects are ***evil***. But hang on,
how can I e.g. do a depth first search efficiently purely functional?
Damn, it seems I can't! Let's switch back to C#/C++!!! No ;-) This
seems to make Haskell not a good language for doing things like
extreme programming where code evolves, unless you have some insanely
clever refactoring tool ready that can convert pure into monadic
functions and vice versa.
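The map/mapM duplication Peter describes is easy to show concretely, along with the standard partial workaround via Identity (function names here are illustrative):

```haskell
import Data.Functor.Identity (Identity (..))

-- The pure version uses map.
pureDoubled :: [Int] -> [Int]
pureDoubled = map (* 2)

-- The effectful version needs mapM, even though the traversal is the same.
effectfulDoubled :: [Int] -> IO [Int]
effectfulDoubled = mapM (\x -> print x >> return (x * 2))

-- Identity recovers the pure version from the monadic one, at the cost
-- of threading a trivial monad through the code.
pureAgain :: [Int] -> [Int]
pureAgain = runIdentity . mapM (Identity . (* 2))
```

DDC's claim is that one `map` written in pure style would serve both callers, with the compiler inferring the effects; in Haskell the split between `map` and `mapM` is visible in every higher-order library function.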

Is it true that - as in DDC - side effects can be analyzed by the
compiler (in the same sense that say, strictness is analyzed), freeing
the library and user from writing/picking monadic versions??? If so,
would this indeed solve the modularity/reusablity issue? I feel you
can't have your cake and eat it here, so there must be a catch? :-)

Thanks for sharing any thoughts on this,
Peter Verswyvelen


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Pavel Perikov

unless you have some insanely
clever refactoring tool ready that can convert pure into monadic
functions and vice versa.


Not trying to attack the idea, just some thoughts:

I don't see much problem converting a pure function into an effectful  
form :) Having a pure function

myPureFunction :: a1 -> a2 -> a3 -> res

myPureFunction a1 a2 a3 -- pure call
myPureFunction <$> a1 <*> a2 <*> a3 -- call using Applicative

Probably, introducing some conventions and using some language  
extensions you can write a function taking pure function and  
converting it into monadic version (a-la SPJ' work on polyvariadic  
composition etc [1])


Have a monadic function but need to call it from pure code? Use  
Control.Monad.Identity.


Not a big deal to me but maybe I'm missing the point

-
[1] http://okmij.org/ftp/Haskell/types.html#polyvar-comp





Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Peter Verswyvelen
Well, the point is that you still have monadic and pure programming
styles. It's true that applicative style programming can help here,
but then you have these <$> and <*> operators everywhere, which also
feels like boilerplate code (as you mention, some extensions could
help here)

Monadic style even has the do syntax, which really looks imperative,
while of course, it isn't.

I know that for e.g. doing a DFS I can use the ST monad and still be
pure, however, my code *looks* imperative when I use the do syntax,
which mentally feels weird (maybe it's just me being mentally ill ;-)
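Peter's DFS example can be written with the ST monad as he says: mutable marks on the inside, a pure function on the outside. A hedged sketch (adjacency via Data.Map, linear `elem` for brevity; a real version would use a mutable array or set for the seen-check):

```haskell
import Control.Monad (forM_, unless)
import Control.Monad.ST
import qualified Data.Map as M
import Data.STRef

-- Depth-first traversal order from a start vertex.  The STRefs are
-- invisible to callers: dfs is an ordinary pure function.
dfs :: M.Map Int [Int] -> Int -> [Int]
dfs g start = runST $ do
  seen  <- newSTRef ([] :: [Int])
  order <- newSTRef []
  let go v = do
        s <- readSTRef seen
        unless (v `elem` s) $ do
          modifySTRef' seen (v :)
          modifySTRef' order (v :)
          forM_ (M.findWithDefault [] v g) go
  go start
  reverse <$> readSTRef order
```

The do-block does look imperative, which is exactly Peter's complaint, but `runST` guarantees the mutation cannot leak, so the result is as pure as any map or fold.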

It's even worse with the arrows syntax: without it I wouldn't be able
to write complex Yampa stuff, but the syntax makes me feel that I'm
drawing boxes and links using ASCII, and suddenly it *feels* different
than when doing nice pure / applicative style programming. Again,
maybe it's just my brain that is fucked up by 25 years of imperative
programming :)

So to me, keeping the syntax in a pure style (a bit like ML and F#)
plays nicer with my brain.

On Wed, Aug 12, 2009 at 11:13 AM, Pavel Perikov peri...@gmail.com wrote:
 unless you have some insanely
 clever refactoring tool ready that can convert pure into monadic
 functions and vice versa.

 Not trying to attack the idea, just some thoughts:

 I don't see much problem converting a pure function into an effectful form
 :) Having a pure function
 myPureFunction :: a1 -> a2 -> a3 -> res
 
 myPureFunction a1 a2 a3 -- pure call
 myPureFunction <$> a1 <*> a2 <*> a3 -- call using Applicative

 Probably, introducing some conventions and using some language extensions
 you can write a function taking pure function and converting it into monadic
 version (a-la SPJ' work on polyvariadic composition etc [1])

 Have monadic function but needs to call it from pure code? use
 Control.Monad.Identity.

 Not a big deal to me but maybe I'm missing the point

 -
 [1] http://okmij.org/ftp/Haskell/types.html#polyvar-comp





Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Pavel Perikov


On 12.08.2009, at 13:27, Peter Verswyvelen wrote:


Well, the point is that you still have monadic and pure programming
styles. It's true that applicative style programming can help here,
but then you have these <$> and <*> operators everywhere, which also
feels like boilerplate code (as you mention, some extensions could
help here)


And you have higher-order functions, you're right. My reply was a bit  
rushed ;)


P.

Monadic style even has the do syntax, which really looks imperative,
while of course, it isn't.

I know that for e.g. doing a DFS I can use the ST monad and still be
pure, however, my code *looks* imperative when I use the do syntax,
which mentally feels weird (maybe it's just me being mentally ill ;-)

It's even worse with the arrows syntax: without it I wouldn't be able
to write complex Yampa stuff, but the syntax makes me feel that I'm
drawing boxes and links using ASCII, and suddenly it *feels* different
than when doing nice pure / applicative style programming. Again,
maybe it's just my brain that is fucked up by 25 years of imperative
programming :)

So to me, keeping the syntax in a pure style (a bit like ML and F#)
plays nicer with my brain.

On Wed, Aug 12, 2009 at 11:13 AM, Pavel Perikov peri...@gmail.com  
wrote:

unless you have some insanely
clever refactoring tool ready that can convert pure into monadic
functions and vice versa.


Not trying to attack the idea, just some thoughts:

I don't see much problem converting a pure function into an  
effectful form
:) Having a pure function
myPureFunction :: a1 -> a2 -> a3 -> res

myPureFunction a1 a2 a3 -- pure call
myPureFunction <$> a1 <*> a2 <*> a3 -- call using Applicative

Probably, introducing some conventions and using some language
extensions you can write a function taking a pure function and
converting it into a monadic version (à la SPJ's work on polyvariadic
composition etc. [1])

Have a monadic function but need to call it from pure code? Use
Control.Monad.Identity.
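Both directions can be sketched in a few lines; `myPureFunction` below is just a stand-in for any pure function:

```haskell
import Data.Functor.Identity (Identity (..), runIdentity)

-- a stand-in pure function
myPureFunction :: Int -> Int -> Int -> Int
myPureFunction a1 a2 a3 = a1 + a2 + a3

-- a "monadic" computation run from pure code via the Identity monad
pureAgain :: Int
pureAgain = runIdentity (myPureFunction <$> Identity 1 <*> Identity 2 <*> Identity 3)

main :: IO ()
main = do
  -- the pure function lifted over effectful (here: IO) arguments
  r <- myPureFunction <$> return 1 <*> return 2 <*> return 3
  print r          -- 6
  print pureAgain  -- 6
```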

Not a big deal to me but maybe I'm missing the point

-
[1] http://okmij.org/ftp/Haskell/types.html#polyvar-comp





Re[2]: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Bulat Ziganshin
Hello Pavel,

Wednesday, August 12, 2009, 1:13:31 PM, you wrote:

 Have monadic function but needs to call it from pure code? use
 Control.Monad.Identity.

by monadic function he probably meant ST or IO one, not polymorphic by
any monad

-- 
Best regards,
 Bulatmailto:bulat.zigans...@gmail.com



Re: Re[2]: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Peter Verswyvelen
Yes, sorry.

But I think I already found the answer to my own question.

DDC functions that are lazy don't allow side effects:
http://www.haskell.org/haskellwiki/DDC/EvaluationOrder

Anyway it would be cool if the DDC EffectSystem would also work on
lazy functions :)

On Wed, Aug 12, 2009 at 11:28 AM, Bulat
Ziganshin bulat.zigans...@gmail.com wrote:
 Hello Pavel,

 Wednesday, August 12, 2009, 1:13:31 PM, you wrote:

 Have monadic function but needs to call it from pure code? use
 Control.Monad.Identity.

 by monadic function he probably meant ST or IO one, not polymorphic by
 any monad

 --
 Best regards,
  Bulat                            mailto:bulat.zigans...@gmail.com



Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Jules Bean

Peter Verswyvelen wrote:

I kind of agree with the DDC authors here; in Haskell as soon as a
function has a side effect, and you want to pass that function to a
pure higher order function, you're stuck, you need to pick the monadic
version of the higher order function, if it exists. 


I just want to point out that you could *always* use the monadic version.

That is, mapM subsumes map. Just use the Identity monad if you don't 
have anything really monadic going on.
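For instance (a minimal illustration, using nothing beyond base), `map` is recovered from `mapM` by instantiating the monad to `Identity`:

```haskell
import Data.Functor.Identity (Identity (..), runIdentity)

-- map as the Identity-monad special case of mapM
mapViaMapM :: (a -> b) -> [a] -> [b]
mapViaMapM f = runIdentity . mapM (Identity . f)

main :: IO ()
main = do
  print (mapViaMapM (* 2) [1, 2, 3 :: Int])
  print (map (* 2) [1, 2, 3 :: Int])  -- identical result
```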


The reason we don't is that that looks + feels ugly.

Jules



Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Robin Green
On Wed, 12 Aug 2009 11:37:02 +0200
Peter Verswyvelen bugf...@gmail.com wrote:

 Yes, sorry.
 
 But I think I already found the answer to my own question.
 
 DDC functions that are lazy don't allow side effects:
 http://www.haskell.org/haskellwiki/DDC/EvaluationOrder
 
 Anyway it would be cool if the DDC EffectSystem would also work on
 lazy functions :)

As was just pointed out in the unsafeDestructiveAssign thread from which
this thread was forked, effects are incompatible with non-strict
evaluation. The compiler is supposed to be able to reorder non-strict
evaluation to do optimisations, but that can't be done if effects
could happen. Also, effects would destroy modular reasoning.
-- 
Robin


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Peter Verswyvelen
Is this really the case? Or is it just hard to implement?

I mean, if...then...else is always kind of lazy in its 2nd and 3rd
arguments, but I think DDC handles this correctly even with the
presence of side effects (not sure, but it has a little presentation
about it: 
http://cs.anu.edu.au/people/Ben.Lippmeier/talks/poisoning-20090618.pdf)

And it's still possible to reason about a function with side effects
in DDC since the side effects are visible in the type signature (you
might need a good editor that shows you the inferred signatures, as
done with F#). I mean it's possible to reason about monadic ST or IO
no? I agree it's hard to reason about code that is not honest about
side effects in its type signature.

So couldn't this be generalized? I might completely miss the point
here since I'm not at all academic; I'm an old-school imperative
hacker.








On Tue, Aug 11, 2009 at 10:51 PM, Robin Green gree...@greenrd.org wrote:
 As was just pointed out in the unsafeDestructiveAssign thread from which
 this thread was forked, effects are incompatible with non-strict
 evaluation. The compiler is supposed to be able to reorder non-strict
 evaluation to do optimisations, but that can't be done if effects
 could happen. Also, effects would destroy modular reasoning.
 --
 Robin


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Tom Davies


On 12/08/2009, at 9:09 PM, Peter Verswyvelen wrote:


Is this really the case? Or is it just hard to implement?

I mean, if...then...else is always kind of lazy in its 2nd and 3rd
arguments, but I think DDC handles this correctly even with the
presence of side effects (not sure, but it has a little presentation
about it: 
http://cs.anu.edu.au/people/Ben.Lippmeier/talks/poisoning-20090618.pdf)



As I haven't seen it mentioned on this thread: Ben Lippmeier's PhD  
thesis is available:

http://cs.anu.edu.au/~Ben.Lippmeier/project/thesis/thesis-lippmeier-sub.pdf



Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Derek Elkins
On Tue, Aug 11, 2009 at 3:51 PM, Robin Green gree...@greenrd.org wrote:
 On Wed, 12 Aug 2009 11:37:02 +0200
 Peter Verswyvelen bugf...@gmail.com wrote:

 Yes, sorry.

 But I think I already found the answer to my own question.

 DDC functions that are lazy don't allow side effects:
 http://www.haskell.org/haskellwiki/DDC/EvaluationOrder

 Anyway it would be cool if the DDC EffectSystem would also work on
 lazy functions :)

 As was just pointed out in the unsafeDestructiveAssign thread from which
 this thread was forked, effects are incompatible with non-strict
 evaluation.

No, they aren't.  At least, they aren't in any technical way.  There
have been more than a few languages supporting both laziness and
mutation starting with Algol.

 The compiler is supposed to be able to reorder non-strict
 evaluation to do optimisations, but that can't be done if effects
 could happen.

There's nothing special about non-strict evaluation that makes the
antecedent true.  Replacing non-strict with strict gives just as
much of a valid statement.  It is purity that allows (some) reordering
of evaluation.

 Also, effects would destroy modular reasoning.

Again, it is purity, not laziness, that allows compositional
reasoning.  Effects destroy compositional reasoning in a strict
language just as much.


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread John A. De Goes

On Aug 12, 2009, at 7:34 AM, Derek Elkins wrote:


Again, it is purity, not laziness, that allows compositional
reasoning.  Effects destroy compositional reasoning in a strict
language just as much.



Yes, but that's just as much true in the IO monad as in effectful code  
in DDC. I think the point is that a functional language with a built- 
in effect system that captures the nature of effects is pretty damn  
cool and eliminates a lot of boilerplate.


Regards,

John A. De Goes
N-Brain, Inc.
The Evolution of Collaboration

http://www.n-brain.net|877-376-2724 x 101



Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Jason Dagit
On Wed, Aug 12, 2009 at 6:34 AM, Derek Elkins derek.a.elk...@gmail.com wrote:


 Again, it is purity, not laziness, that allows compositional
 reasoning.  Effects destroy compositional reasoning in a strict
 language just as much.


Totality also matters, but for some reason we take that for granted :)

Jason


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Robin Green
On Wed, 12 Aug 2009 08:34:28 -0500
Derek Elkins derek.a.elk...@gmail.com wrote:

 On Tue, Aug 11, 2009 at 3:51 PM, Robin Green gree...@greenrd.org
 wrote:
  On Wed, 12 Aug 2009 11:37:02 +0200
  Peter Verswyvelen bugf...@gmail.com wrote:
 
  Yes, sorry.
 
  But I think I already found the answer to my own question.
 
  DDC functions that are lazy don't allow side effects:
  http://www.haskell.org/haskellwiki/DDC/EvaluationOrder
 
  Anyway it would be cool if the DDC EffectSystem would also work on
  lazy functions :)
 
  As was just pointed out in the unsafeDestructiveAssign thread from
  which this thread was forked, effects are incompatible with
  non-strict evaluation.
 
 No, they aren't.  At least, they aren't in any technical way.  There
 have been more than a few languages supporting both laziness and
 mutation starting with Algol.

OK, explicitly creating thunks, like in Algol, LISP and CAL, can work,
because you can either prevent the programmer from using mutation
inside a thunk (as in the CAL approach), or rely on the programmer to
ensure that mutations are only done in a safe order (as in the
callback as thunk approach, which can be done in almost any impure
language).

But if almost *every* expression is a thunk, as in a non-strict
language like Haskell, and if moreover evaluation order can differ
depending on the compiler/interpreter, or compiler version, or compiler
flags, it becomes impractical to combine them. So yes, I agree, it's
not an absolute technical incompatibility, it's a practical one.

  Also, effects would destroy modular reasoning.
 
 Again, it is purity, not laziness, that allows compositional
 reasoning.  Effects destroy compositional reasoning in a strict
 language just as much.

Not really - in a strict language, if in method M action A always
happens before action B, then that fact will remain true in whichever
context M is called (ignoring threading issues). In a non-strict, impure
language, assuming monads are not used, if you have a function f which
performs actions A and B, and two other functions g and h which both
call f, the actions could happen in one order in g, and in the opposite
order - or not at all - in h, because of a difference in data demanded
by each function. Indeed, unless g and h are top-level functions (in
the sense that they are main functions or invoked from some
ghci-equivalent) the notion of the order the actions are performed in
is ill-defined, because it could vary depending on which function g or
h is being called from.
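This can be observed directly with `Debug.Trace` (an illustrative sketch; `trace` is exactly the kind of impurity smuggled into pure code that's being described — which effects fire depends entirely on demand):

```haskell
import Debug.Trace (trace)

-- f "performs" actions A and B as traced thunks
f :: (Int, Int)
f = (trace "action A" 1, trace "action B" 2)

g :: Int
g = let (a, b) = f in a + b  -- demands both; their relative order is the compiler's choice

h :: Int
h = fst f  -- demands only the first component; "action B" need never run for h

main :: IO ()
main = print (g, h)
```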

Now, you could say, the strict order of evaluation can be simulated in
a non-strict language on a case-by-case basis by using monads or
whatever. Well, yes, but that would be missing the point, because the
original point of this discussion was to avoid having to use monads.
-- 
Robin


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread David Menendez
On Wed, Aug 12, 2009 at 9:34 AM, Derek Elkins derek.a.elk...@gmail.com wrote:
 On Tue, Aug 11, 2009 at 3:51 PM, Robin Green gree...@greenrd.org wrote:
 On Wed, 12 Aug 2009 11:37:02 +0200
 Peter Verswyvelen bugf...@gmail.com wrote:

 Yes, sorry.

 But I think I already found the answer to my own question.

 DDC functions that are lazy don't allow side effects:
 http://www.haskell.org/haskellwiki/DDC/EvaluationOrder

 Anyway it would be cool if the DDC EffectSystem would also work on
 lazy functions :)

 As was just pointed out in the unsafeDestructiveAssign thread from which
 this thread was forked, effects are incompatible with non-strict
 evaluation.

 No, they aren't.  At least, they aren't in any technical way.  There
 have been more than a few languages supporting both laziness and
 mutation starting with Algol.

As far as I know, Algol had call-by-name, not call-by-need.

-- 
Dave Menendez d...@zednenem.com
http://www.eyrie.org/~zednenem/


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Felipe Lessa
On Wed, Aug 12, 2009 at 08:34:28AM -0500, Derek Elkins wrote:
  As was just pointed out in the unsafeDestructiveAssign thread from which
  this thread was forked, effects are incompatible with non-strict
  evaluation.

 No, they aren't.  At least, they aren't in any technical way.  There
 have been more than a few languages supporting both laziness and
 mutation starting with Algol.

But laziness is just one way of implementing non-strictness,
right?  What about other methods?

--
Felipe.


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Don Stewart
bugfact:
 Well, the point is that you still have monadic and pure programming
 styles. It's true that applicative style programming can help here,
 but then you have these <$> and <*> operators everywhere, which also
 feel like boilerplate code (as you mention, some extensions could
 help here)

Overloading whitespace as application in the Identity idiom! :)

-- Don


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Conor McBride


On 12 Aug 2009, at 20:40, Don Stewart wrote:


bugfact:

Well, the point is that you still have monadic and pure programming
styles. It's true that applicative style programming can help here,
but then you have these <$> and <*> operators everywhere, which also
feel like boilerplate code (as you mention, some extensions could
help here)


Overloading whitespace as application in the Identity idiom! :)


[cough!]

Conor

psst: http://personal.cis.strath.ac.uk/~conor/pub/she/idiom.html



Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Dan Doel
On Wednesday 12 August 2009 10:12:14 am John A. De Goes wrote:
 I think the point is that a functional language with a built-
 in effect system that captures the nature of effects is pretty damn
 cool and eliminates a lot of boilerplate.

It's definitely an interesting direction (possibly even the right one in the 
long run), but it's not without its faults currently (unless things have 
changed since I looked at it).

For instance: what effects does disciple support? Mutation and IO? What if I 
want non-determinism, or continuations, etc.? How do I as a user add those 
effects to the effect system, and specify how they should interact with the 
other effects? As far as I know, there aren't yet any provisions for this, so 
presumably you'll end up with effect system for effects supported by the 
compiler, and monads for effects you're writing yourself.

By contrast, monad transformers (for one) let you do the above defining of new 
effects, and specifying how they interact (although they're certainly neither 
perfect, nor completely satisfying theoretically).

Someone will probably eventually create (or already has, and I don't know 
about it) an extensible effect system that would put this objection to rest. 
Until then, you're dealing in trade offs.
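For contrast, a small sketch of the transformer approach Dan mentions: two user-chosen effects (state and failure) stacked over IO with the transformers package. The names `App` and `step` are illustrative:

```haskell
import Control.Monad.Trans.Class (lift)
import Control.Monad.Trans.Except (ExceptT, runExceptT, throwE)
import Control.Monad.Trans.State (StateT, get, put, runStateT)

-- a user-defined effect stack: state + failure over IO
type App = StateT Int (ExceptT String IO)

step :: App ()
step = do
  n <- get
  if n > 2
    then lift (throwE "counter overflow")       -- failure effect
    else do
      lift (lift (putStrLn ("n = " ++ show n))) -- IO effect
      put (n + 1)                               -- state effect

main :: IO ()
main = do
  r <- runExceptT (runStateT (step >> step >> step) 0)
  print r  -- Right ((),3)
```

How the effects interact is fixed by the order of the layers: state over failure here means a failure discards the state, whereas the opposite nesting would preserve it.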

-- Dan


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Jason Dusek
2009/08/12 Dan Doel dan.d...@gmail.com:
 On Wednesday 12 August 2009 10:12:14 am John A. De Goes wrote:
  I think the point is that a functional language with a
  built- in effect system that captures the nature of effects
  is pretty damn cool and eliminates a lot of boilerplate.

 It's definitely an interesting direction (possibly even the
 right one in the long run), but it's not without its faults
 currently (unless things have changed since I looked at it).

 For instance: what effects does disciple support? Mutation and
 IO? What if I want non-determinism, or continuations, etc.?
 How do I as a user add those effects to the effect system, and
 specify how they should interact with the other effects? As
 far as I know, there aren't yet any provisions for this, so
 presumably you'll end up with effect system for effects
 supported by the compiler, and monads for effects you're
 writing yourself.

  +1

--
Jason Dusek


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread John A. De Goes


So what, because effect systems might not eliminate *all* boilerplate,  
you'd rather use boilerplate 100% of the time? :-)


Regards,

John A. De Goes
N-Brain, Inc.
The Evolution of Collaboration

http://www.n-brain.net|877-376-2724 x 101

On Aug 12, 2009, at 3:28 PM, Dan Doel wrote:


On Wednesday 12 August 2009 10:12:14 am John A. De Goes wrote:

I think the point is that a functional language with a built-
in effect system that captures the nature of effects is pretty damn
cool and eliminates a lot of boilerplate.


It's definitely an interesting direction (possibly even the right  
one in the
long run), but it's not without its faults currently (unless things  
have

changed since I looked at it).

For instance: what effects does disciple support? Mutation and IO?  
What if I
want non-determinism, or continuations, etc.? How do I as a user add  
those
effects to the effect system, and specify how they should interact  
with the
other effects? As far as I know, there aren't yet any provisions for  
this, so
presumably you'll end up with effect system for effects supported by  
the

compiler, and monads for effects you're writing yourself.

By contrast, monad transformers (for one) let you do the above  
defining of new
effects, and specifying how they interact (although they're  
certainly neither

perfect, nor completely satisfying theoretically).

Someone will probably eventually create (or already has, and I don't  
know
about it) an extensible effect system that would put this objection  
to rest.

Until then, you're dealing in trade offs.

-- Dan




Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Dan Doel
On Wednesday 12 August 2009 9:27:30 pm John A. De Goes wrote:
 So what, because effect systems might not eliminate *all* boilerplate,
 you'd rather use boilerplate 100% of the time? :-)

For most of my Haskell programs, the majority of the program is not made up of 
straight IO or ST functions, so how much boilerplate is it really eliminating? 
And since all the fooM functions have to exist for all the other monads, how 
much more boilerplate is it really to use them for IO and ST as well?

Off hand, I'd say I don't write foo and fooM versions of functions much in 
actual programs, either. Such duplication goes into libraries, and that would 
be the case for Disciple as well (until there's a way to extend the effect 
system outside the compiler).

So it's more of "effect systems eliminate 2% of my boilerplate; who 
(currently) cares?" And is it worth having situations like 'you use monads for 
these effects, and the effect system for these other effects' in the language 
to do so?

But, as I said, it's not that I don't think it's a fruitful area of research, 
I just don't think it's going to yield significantly nicer code than Haskell 
_yet_.

-- Dan


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Ben Lippmeier

Derek Elkins wrote:

The compiler is supposed to be able to reorder non-strict
evaluation to do optimisations, but that can't be done if effects
could happen.



There's nothing special about non-strict evaluation that makes the
antecedent true.  Replacing non-strict with strict gives just as
much of a valid statement.  It is purity that allows (some) reordering
of evaluation.
  

Here are two effectful statements that can safely be reordered.

 print foo
 x := 5


here are two more

 y := 2
 z := 3

(provided y and z don't alias)


Purity allows some reordering of evaluation, so does knowing that
two effectful computations won't interfere.
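In Haskell terms, Ben's non-aliasing example can be rendered with IORefs (a sketch; here the knowledge that the writes commute lives with the programmer, not the type system):

```haskell
import Data.IORef (newIORef, readIORef, writeIORef)

main :: IO ()
main = do
  y <- newIORef (0 :: Int)
  z <- newIORef (0 :: Int)
  -- y and z don't alias, so these two writes commute:
  writeIORef y 2
  writeIORef z 3
  -- swapping the two writes above yields the same final state
  result <- (,) <$> readIORef y <*> readIORef z
  print result
```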


Ben.





Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Ben Lippmeier

Dan Doel wrote:
For instance: what effects does disciple support? Mutation and IO? 

You can create your own top-level effects which interfere
with all others, for example:

effect !Network;
effect !File;

readFile :: String -(!e)> String
         :- !e >= !File

Now any function that calls readFile will also have a !File effect.

What if I 
want non-determinism, or continuations, etc.? How do I as a user add those 
effects to the effect system, and specify how they should interact with the 
other effects? As far as I know, there aren't yet any provisions for this, so 
presumably you'll end up with effect system for effects supported by the 
compiler, and monads for effects you're writing yourself.
  

Yep.

In Disciple, a computation has an effect if its evaluation cannot
safely be reordered with others having the same effect. That is,
computations have effects if they might interfere with others.

One of the goals of the work has been to perform compiler
optimisations without having to use IO-monad style state threading.
IO is very coarse grained, and using the IO monad for everything
tends to introduce more data-dependencies than strictly needed, which
limits what optimisations you can do.

Non-determinism and continuations are trickier things than the simple
notion of effects-as-interference, which I haven't got a good
solution for.

Ben.




Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Ben Lippmeier

Dan Doel wrote:
Off hand, I'd say I don't write foo and fooM versions of functions much in 
actual programs, either. Such duplication goes into libraries...

It would be ok if the duplication /was/ actually in the libraries,
but often it's not.

Note the lack of Data.Map.mapM and Data.Map.foldM. Want to apply a monadic
computation to all the elements of a Data.Map? Convert it to a list and 
back..
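The round trip Ben describes looks like this in practice (a sketch; `mapMViaList` is a hypothetical helper name):

```haskell
import qualified Data.Map as M

-- the "convert it to a list and back" workaround for a monadic map
mapMViaList :: (Monad m, Ord k) => (a -> m b) -> M.Map k a -> m (M.Map k b)
mapMViaList f m =
  M.fromList <$> mapM (\(k, v) -> (,) k <$> f v) (M.toList m)

main :: IO ()
main = do
  m' <- mapMViaList (\v -> do putStrLn ("visiting " ++ v)
                              return (v ++ "!"))
                    (M.fromList [(1 :: Int, "a"), (2, "b")])
  print (M.toList m')
```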


Ben.



Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread John A. De Goes


Fair enough, but keep in mind an effect system does more than just  
eliminate boilerplate: it provides richer information on the precise  
nature of the interaction of a function with the real world. With the  
right insight, you can reorder and parallelize all kinds of effectful  
computations. Something that dumb monads can't provide.


I haven't played with DDC, but I do believe some new FPL with a  
powerful effect system is going to take off in the next 5 years.


Regards,

John A. De Goes
N-Brain, Inc.
The Evolution of Collaboration

http://www.n-brain.net|877-376-2724 x 101

On Aug 12, 2009, at 8:13 PM, Dan Doel wrote:


On Wednesday 12 August 2009 9:27:30 pm John A. De Goes wrote:
So what, because effect systems might not eliminate *all*  
boilerplate,

you'd rather use boilerplate 100% of the time? :-)


For most of my Haskell programs, the majority of the program is not  
made up of
straight IO or ST functions, so how much boilerplate is it really  
eliminating?
And since all the fooM functions have to exist for all the other  
monads, how

much more boilerplate is it really to use them for IO and ST as well?

Off hand, I'd say I don't write foo and fooM versions of functions  
much in
actual programs, either. Such duplication goes into libraries, and  
that would
be the case for Disciple as well (until there's a way to extend the  
effect

system outside the compiler).

So it's more of effect systems eliminate 2% of my boilerplate; who
(currently) cares? And is it worth having situations like 'you use  
monads for
these effects, and the effect system for these other effects' in the  
language

to do so?

But, as I said, it's not that I don't think it's a fruitful area of  
research,
I just don't think it's going to yield significantly nicer code than  
Haskell

_yet_.

-- Dan


___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread John A. De Goes


The next step is to distinguish between reading file A and reading  
file B, between reading file A and writing file A, between reading one  
part of file A and writing another part of file A, etc. When the  
effect system can carry that kind of information, and not just for  
files, but network, memory, etc., then you'll be able to do some  
extremely powerful parallelization and optimization.


But for now, providing coarse-grained information on the class to which  
an effect belongs is pretty interesting in its own right.


Regards,

John A. De Goes
N-Brain, Inc.
The Evolution of Collaboration

http://www.n-brain.net|877-376-2724 x 101

On Aug 12, 2009, at 9:41 PM, Ben Lippmeier wrote:


Dan Doel wrote:

For instance: what effects does disciple support? Mutation and IO?

You can create your own top-level effects which interfere
with all others, for example:

effect !Network;
effect !File;

readFile :: String -(!e)> String
    :- !e >= !File

Now any function that calls readFile will also have a !File effect.

What if I want non-determinism, or continuations, etc.? How do I as  
a user add those effects to the effect system, and specify how they  
should interact with the other effects? As far as I know, there  
aren't yet any provisions for this, so presumably you'll end up  
with effect system for effects supported by the compiler, and  
monads for effects you're writing yourself.



Yep.

In Disciple, a computation has an effect if its evaluation cannot
safely be reordered with others having the same effect. That is,
computations have effects if they might interfere with others.

One of the goals of the work has been to perform compiler
optimisations without having to use IO-monad style state threading.
IO is very coarse grained, and using the IO monad for everything
tends to introduce more data-dependencies than strictly needed, which
limits what optimisations you can do.

Non-determinism and continuations are trickier things than the simple
notion of effects-as-interference, which I haven't got a good
solution for.

Ben.




Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Dan Doel
On Wednesday 12 August 2009 11:46:29 pm Ben Lippmeier wrote:
 Dan Doel wrote:
  Off hand, I'd say I don't write foo and fooM versions of functions much
  in actual programs, either. Such duplication goes into libraries...

 It would be ok if the duplication /was/ actually in the libraries,
 but often it's not.

 Note the lack of Data.Map.mapM and Data.Map.foldM. Want to apply a monadic
 computation to all the elements of a Data.Map? Convert it to a list and
 back..

Or use Data.Traversable.mapM and Data.Foldable.foldM.
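For example (assuming only the containers package that ships with GHC):

```haskell
import qualified Data.Foldable as F
import qualified Data.Map as M
import qualified Data.Traversable as T

main :: IO ()
main = do
  let m = M.fromList [(1 :: Int, "a"), (2, "b")]
  -- monadic map over the values, no toList/fromList round trip
  m' <- T.mapM (\v -> return (v ++ "!")) m
  print (M.toList m')
  -- monadic fold over the values
  total <- F.foldlM (\acc v -> return (acc + length v)) (0 :: Int) m
  print total
```

(In current base the general fold is spelled `Data.Foldable.foldlM`/`foldrM` rather than `foldM`.)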


Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Ben Lippmeier

Dan Doel wrote:

On Wednesday 12 August 2009 11:46:29 pm Ben Lippmeier wrote:
  

Dan Doel wrote:


Off hand, I'd say I don't write foo and fooM versions of functions much
in actual programs, either. Such duplication goes into libraries...
  

It would be ok if the duplication /was/ actually in the libraries,
but often it's not.

Note the lack of Data.Map.mapM and Data.Map.foldM. Want to apply a monadic
computation to all the elements of a Data.Map? Convert it to a list and
back..



Or use Data.Traversable.mapM and Data.Foldable.foldM.

  

Ah thanks, I didn't notice the Traversable instance. There are
other higher-order functions in Data.Map that don't seem to have
monadic counterparts though, like insertWith, unionsWith, updateAt ...
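Editorial note: later versions of containers (>= 0.5.8) do provide a general hook, `Data.Map.alterF`, from which monadic counterparts can be built. A sketch (the name `insertWithM` is hypothetical):

```haskell
import qualified Data.Map as M

-- a monadic insertWith, built from Data.Map.alterF (containers >= 0.5.8);
-- the combining function is applied as (f new old) and may perform effects
insertWithM :: (Monad m, Ord k)
            => (a -> a -> m a) -> k -> a -> M.Map k a -> m (M.Map k a)
insertWithM f k new = M.alterF g k
  where
    g Nothing    = return (Just new)
    g (Just old) = Just <$> f new old

main :: IO ()
main = do
  let m = M.fromList [(1 :: Int, 10 :: Int)]
  m1 <- insertWithM (\a b -> do print (a, b); return (a + b)) 1 5 m
  m2 <- insertWithM (\a b -> return (a + b)) 2 7 m1
  print (M.toList m2)
```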

Ben.





Re: DDC compiler and effects; better than Haskell? (was Re: [Haskell-cafe] unsafeDestructiveAssign?)

2009-08-12 Thread Brandon S. Allbery KF8NH

On Aug 12, 2009, at 23:46 , Ben Lippmeier wrote:

Dan Doel wrote:
Off hand, I'd say I don't write foo and fooM versions of functions  
much in actual programs, either. Such duplication goes into  
libraries...


Note the lack of Data.Map.mapM and Data.Map.foldM. Want to apply a  
monadic
computation to all the elements of a Data.Map? Convert it to a list  
and back..



If Data.Map were a Monad, then the Monad mapM and foldM would work.   
It's not (can't represent Ord constraint on keys), so you have to  
translate into a monad to use monad functions.  I don't know if Oleg's  
restricted monads, which *can* represent Data.Map, provide something  
similar.


--
brandon s. allbery [solaris,freebsd,perl,pugs,haskell] allb...@kf8nh.com
system administrator [openafs,heimdal,too many hats] allb...@ece.cmu.edu
electrical and computer engineering, carnegie mellon university    KF8NH



