[Haskell-cafe] Re: Guidance on using asynchronous exceptions

2007-11-16 Thread Simon Marlow

Yang wrote:

To follow up on my previous post (Asynchronous Exceptions and the
RealWorld), I've decided to put together something more concrete in
the hopes of eliciting response.

I'm trying to write a library of higher-level concurrency
abstractions, in particular for asynchronous systems programming. The 
principal goal here is composability and safety. Ideally, one can apply 
combinators on any existing (IO a), not just procedures written for this 
library. But that seems like a pipe dream at this point.


It's quite hard to write composable combinators using threads and 
asynchronous exceptions, and this is certainly a weakness of the design. 
See for example the timeout combinator we added recently:


http://darcs.haskell.org/packages/base/System/Timeout.hs

There we did just about manage to make timeout composable, but it was tricky.
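
The heart of it, in a condensed sketch (using today's Control.Exception API rather than the module as it stands; see the link above for the real code): each call tags its exception with a fresh Unique, so that nested timeouts cannot catch one another's exceptions, which is one of the tricky parts.

import Control.Concurrent
import Control.Exception
import Data.Unique

-- One private exception per timeout call, distinguished by a Unique,
-- so that nested timeouts stay independent.
newtype Timeout = Timeout Unique deriving Eq

instance Show Timeout where
  show _ = "<<timeout>>"

instance Exception Timeout

timeout :: Int -> IO a -> IO (Maybe a)
timeout usecs action = do
  pid <- myThreadId
  ex  <- fmap Timeout newUnique
  handleJust
    (\e -> if e == ex then Just () else Nothing)  -- catch only *our* tag
    (\_ -> return Nothing)
    (bracket (forkIO (threadDelay usecs >> throwTo pid ex))
             killThread                           -- cancel the watchdog
             (\_ -> fmap Just action))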

In the code below, the running theme is process orchestration. (I've put 
TODOs at places where I'm blocked - no pun intended.)


I'm currently worried that what I'm trying to do is simply impossible in
Concurrent Haskell. I'm bewildered by the design decisions in the
asynchronous exceptions paper. I'm also wondering if there are any
efforts under way to reform this situation. I found some relevant
posts below hinting at this, but I'm not sure what the status is
today.


We haven't made any changes to block/unblock, although that's something I'd 
like to investigate at some point.  If you have any suggestions, I'd be 
interested to hear them.


The problem your code seems to be having is that waitForProcess is 
implemented as a C call, and C calls cannot be interrupted by asynchronous 
exceptions - there's just no way to implement that in general.  One 
workaround would be to fork a thread to call waitForProcess, and 
communicate with the thread using an MVar (takeMVar *is* interruptible). 
You could encapsulate this idiom as a combinator interruptible, perhaps. 
 But note that interrupting the thread waiting on the MVar won't then 
terminate the foreign call: the call will run to completion as normal.
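
Such a combinator might look like this (a sketch of the idiom only, in today's Control.Exception API; the name 'interruptible' and the details are just one way to cut it):

{-# LANGUAGE ScopedTypeVariables #-}
import Control.Concurrent
import Control.Exception

-- Run a possibly uninterruptible action (e.g. a blocking foreign call)
-- in its own thread and wait on an MVar instead; takeMVar *is*
-- interruptible.  If we are interrupted, the forked action still runs
-- to completion in the background, as noted above.
interruptible :: forall a. IO a -> IO a
interruptible act = do
  mv <- newEmptyMVar
  _  <- forkIO (try act >>= putMVar mv)
  r  <- takeMVar mv :: IO (Either SomeException a)
  either throwIO return r

The waitForProcess case then becomes interruptible (waitForProcess ph),
where ph is the ProcessHandle in question, with the caveat above still
applying.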


The fact that some operations which block indefinitely cannot be 
interrupted is a problem.  We should document which those are, but the fact 
that the audit has to be done by hand means it's both tedious and 
error-prone, which is why it hasn't been done.


The only example that I know of where asynchronous exceptions and 
block/unblock are really used in anger is darcs, which tries to do 
something reasonable in response to a keyboard interrupt.


Cheers,
Simon
___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Re: Guidance on using asynchronous exceptions

2007-11-16 Thread Yang

oleg-at-pobox.com |haskell-cafe| wrote:

Yang wrote:

(Something like this is straightforward to build if I abandon
Concurrent Haskell and use cooperative threading, and if the
operations I wanted to perform could be done asynchronously.)

All operations could be done asynchronously, at least on Linux and
many Unixes:

http://www.usenix.org/events/usenix04/tech/general/full_papers/elmeleegy/elmeleegy_html/index.html


Thanks for this pointer.




(Something like this is straightforward to build if I abandon
Concurrent Haskell and use cooperative threading, and if the
operations I wanted to perform could be done asynchronously.)


That seems like a very good idea. You might be aware that well-respected
and well-experienced systems researchers call for *abandoning*
threads. Threads are just a very bad model.

The Problem with Threads
Edward A. Lee
http://www.eecs.berkeley.edu/Pubs/TechRpts/2006/EECS-2006-1.html
Also, IEEE computer, May 2006, pp. 33-42.

From the abstract:
``Although threads seem to be a small step from sequential computation,
in fact, they represent a huge step. They discard the most essential
and appealing properties of sequential computation: understandability,
predictability, and determinism. Threads, as a model of computation,
are wildly nondeterministic, and the job of the programmer becomes one
of pruning that nondeterminism. Although many research techniques
improve the model by offering more effective pruning, I argue that
this is approaching the problem backwards.  Rather than pruning
nondeterminism, we should build from essentially deterministic,
composable components. Nondeterminism should be explicitly and
judiciously introduced where needed, rather than removed where not
needed. The consequences of this principle are profound. I argue for
the development of concurrent coordination languages based on sound,
composable formalisms. I believe that such languages will yield much
more reliable, and more concurrent programs.''


I had read this not long ago. While the bulk of the paper argues for 
determinism, my understanding is that he ultimately doesn't actually 
advocate tossing out threads per se; he approves of their use for data 
flow (message-passing) and with state isolation.


This style of concurrency is, of course, not new. Component architectures
where data flows through components (rather than control) have been called
“actor-oriented” [35]. These can take many forms. Unix pipes resemble PN,
although they are more limited in that they do not support cyclic graphs.
Message passing packages like MPI and OpenMP include facilities for
implementing rendezvous and PN, but in a less structured context that
emphasizes expressiveness rather than determinacy. A naive user of such
packages can easily be bitten by unexpected nondeterminacy. Languages such
as Erlang [4] make message passing concurrency an integral part of a
general-purpose language. Languages such as Ada make rendezvous an
integral part. Functional languages [30] and single-assignment languages
also emphasize deterministic computations, but they are less explicitly
concurrent, so controlling and exploiting concurrency can be more
challenging. Data parallel languages also emphasize determinate
interactions, but they require low-level rewrites of software.




I believe that delimited continuations are a good way to build
coordination languages, because delimited continuations let us build a
sound model of computation's interaction with its context.



Aren't safepoints (+ no shared state) enough to tame this issue? What 
visible difference is there between threads with safepoints and 
delimited continuations?


Another reason for separate threads is that they can run on separate OS 
threads (cores), thus exploiting parallelism.

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Re: Guidance on using asynchronous exceptions

2007-11-16 Thread oleg

Yang wrote:
> Furthermore, is there any way to embed this information [about async
> exceptions] in the type system, so that Haskellers don't produce
> async-exception-unaware code?  (Effectively, introducing checked
> interrupts?)

Yes, it is possible to make the information about exceptions and
interrupts part of an operation's type:
  http://www.haskell.org/pipermail/haskell/2004-June/014271.html

Here is a simple illustration for your specific case:

> {-# OPTIONS -fglasgow-exts #-}

> module B where
> import Control.Exception

> idarn _ msg = error msg

> myPutStrLn _ | False = idarn ?e_async "myPutStrLn is interruptible"
> myPutStrLn x = putStrLn x

*B> :t myPutStrLn
myPutStrLn :: (?e_async::()) => String -> IO ()

Now it is obvious that myPutStrLn is subject to async interruptions

> test1 x = do myPutStrLn "String"
>              myPutStrLn x

*B> :t test1
test1 :: (?e_async::()) => String -> IO ()

and so is test1. The compiler figured that out.


> myblock :: ((?e_async::()) => IO a) -> IO a
> myblock x = let ?e_async = undefined -- meaning, we `handle' the exception
>             in block x

If we try to run test1 as it is:

*B> test1 "here"

<interactive>:1:0:
    Unbound implicit parameter (?e_async::())
      arising from use of `test1' at <interactive>:1:0-11
    In the expression: test1 "here"
    In the definition of `it': it = test1 "here"

meaning that we ought to `handle' that exceptional condition. The
typechecker will not let us overlook the test.

> -- ?e_async::t causes a problem: we really have to `handle' it
> -- main = test1 "here"

> test3 = myblock (test1 "here")

*B> :t test3
test3 :: IO ()

The type of test3 shows that the condition is `handled'. And so test3
may now be run.
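
The same trick extends to any blocking primitive. For instance, a sketch
along the same lines (this one additionally needs
import Control.Concurrent.MVar at the top of the module):

> -- Tag an MVar wait the same way: never-taken guard forces the constraint
> myTakeMVar :: (?e_async::()) => MVar a -> IO a
> myTakeMVar _ | False = idarn ?e_async "takeMVar is interruptible"
> myTakeMVar v = takeMVar v

Any code that waits with myTakeMVar then carries the ?e_async constraint
until a surrounding myblock discharges it.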

> (Something like this is straightforward to build if I abandon
> Concurrent Haskell and use cooperative threading, and if the
> operations I wanted to perform could be done asynchronously.)
All operations could be done asynchronously, at least on Linux and
many Unixes:

http://www.usenix.org/events/usenix04/tech/general/full_papers/elmeleegy/elmeleegy_html/index.html

> (Something like this is straightforward to build if I abandon
> Concurrent Haskell and use cooperative threading, and if the
> operations I wanted to perform could be done asynchronously.)

That seems like a very good idea. You might be aware that well-respected
and well-experienced systems researchers call for *abandoning*
threads. Threads are just a very bad model.

The Problem with Threads
Edward A. Lee
http://www.eecs.berkeley.edu/Pubs/TechRpts/2006/EECS-2006-1.html
Also, IEEE computer, May 2006, pp. 33-42.

From the abstract:
``Although threads seem to be a small step from sequential computation,
in fact, they represent a huge step. They discard the most essential
and appealing properties of sequential computation: understandability,
predictability, and determinism. Threads, as a model of computation,
are wildly nondeterministic, and the job of the programmer becomes one
of pruning that nondeterminism. Although many research techniques
improve the model by offering more effective pruning, I argue that
this is approaching the problem backwards.  Rather than pruning
nondeterminism, we should build from essentially deterministic,
composable components. Nondeterminism should be explicitly and
judiciously introduced where needed, rather than removed where not
needed. The consequences of this principle are profound. I argue for
the development of concurrent coordination languages based on sound,
composable formalisms. I believe that such languages will yield much
more reliable, and more concurrent programs.''


I believe that delimited continuations are a good way to build
coordination languages, because delimited continuations let us build a
sound model of computation's interaction with its context.

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe


[Haskell-cafe] Re: Guidance on using asynchronous exceptions

2007-11-16 Thread Aaron Denney
On 2007-11-16, Yang [EMAIL PROTECTED] wrote (quoting a paper):
> This style of concurrency is, of course, not new. Component
> architectures where data flows through components (rather than
> control) have been called 'actor-oriented' [35]. These can take many
> forms. Unix pipes resemble PN, although they are more limited in that
> they do not support cyclic graphs.

This isn't quite true.  Unix pipes support cyclic graphs just fine.
Many programs can't handle this due to buffering (on both input and
output).  Further, most Unix shells can't set them up.  C programs,
or anything else that exposes the underlying calls, can set them up
easily enough.
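
The same wiring is straightforward from Haskell, too. A sketch, assuming
a Unix-ish system with gawk on the PATH and createPipe from a recent
process package (>= 1.2.1): the pipeline's standard output is fed back
into its own standard input, a cycle no ordinary shell pipeline
expresses, and 'head -n 5' bounds the loop so the program terminates.

import System.IO
import System.Process

main :: IO ()
main = do
  (readEnd, writeEnd) <- createPipe          -- the feedback edge
  hSetBuffering writeEnd LineBuffering
  hPutStrLn writeEnd "1"                     -- seed the cycle
  -- awk echoes each number to stderr (so we can watch the loop run)
  -- and writes its successor back into the cycle; head bounds it.
  let cmd = "head -n 5 | awk '{ print $1 > \"/dev/stderr\"; print $1 + 1; fflush() }'"
  (_, _, _, ph) <- createProcess (shell cmd)
      { std_in  = UseHandle readEnd          -- read from the cycle
      , std_out = UseHandle writeEnd }       -- write back into it
  _ <- waitForProcess ph
  return ()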

-- 
Aaron Denney

___
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe