Michele Simionato wrote:
> I have a question about the 'send' function:
>
> fun send (stream, x) =
>     let
>         val new = Promise.promise ()
>         val old = Ref.exchange (stream, new)
>     in
>         Promise.fulfill (old, x :: Promise.future new)
>         handle Promise.Promise =>
>             (Promise.fulfill (new, nil); raise Closed)
>         (* FIXME: this is not thread-safe! *)
>     end
>
> What does it mean "this is not thread-safe"?
The handling of closed streams is incorrect. If you look at the `close'
function, it simply swaps in a promise pre-fulfilled with nil. Now, if
`send' encounters such a stream, it will
1. replace the promise with a fresh one,
2. try to fulfill the old promise with a cons cell, which fails,
3. conclude that the old promise held nil, and hence fulfill the new one with nil as well.
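For reference, the stream representation and `close' in question essentially look like this (a sketch reconstructed from the description above, so the concrete names are guesses):

    type 'a stream = 'a list Promise.promise ref

    fun close stream =
        let
            val p   = Promise.promise ()
            val _   = Promise.fulfill (p, nil)   (* pre-fulfilled with nil *)
            val old = Ref.exchange (stream, p)
        in
            Promise.fulfill (old, nil)           (* terminate the output list *)
        end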
Unfortunately, things can go wrong when another concurrent send is
initiated between steps 1 and 3. The other thread will briefly see an
unfulfilled promise, successfully fulfill it and put in another fresh
promise. This has two bogus consequences:
1. The fulfill(new, nil) in step 3 of the first thread will fail with
a(nother) Promise exception, which is not caught.
2. Worse, consecutive send operations will complete successfully
(because they see fresh promises), but the sent values will just
disappear (because the actual output list already was terminated with nil).
It is slightly embarrassing that this code is in the code base. :-) The
reason it is in there is that somebody added closing as an afterthought
without thoroughly thinking through the consequences. As you can see,
stateful concurrent programming is hard.
One way to fix it is to flag closed streams with a separate boolean
reference, basically serving as an explicit lock. A more subtle version
is to use a promise of a promise, as follows (untested):
fun send (stream, x) =
    let
        val p   = promise () : 'a list promise promise
        val new = promise ()
        val old = Ref.exchange (stream, future p)  (*) lock
    in
        (fulfill (old, x :: future new)  (*) blocks as long as old is locked
         handle Promise =>
             (fulfill (new, nil); raise Closed)
        ) finally fulfill (p, new)  (*) unlock
    end
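For comparison, the boolean-flag variant mentioned above could look roughly like this (untested; a deliberately naive spin lock built on the atomic Ref.exchange, and the names are mine):

    val lock   = ref false   (* true while a send or close is in progress *)
    val closed = ref false   (* set once by close, only read under the lock *)

    fun acquire () = if Ref.exchange (lock, true) then acquire () else ()
    fun release () = lock := false

    fun send (stream, x) =
        (acquire ();
         if !closed then (release (); raise Closed)
         else
             let
                 val new = promise ()
                 val old = Ref.exchange (stream, new)
             in
                 fulfill (old, x :: future new); release ()
             end)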
Alternatively, just remove the close operation.
> In general, is there a rule of thumb to
> know which operations are
> thread safe and which are not (the rule could be everything is unsafe
> unless stated otherwise,
> of course, but I hope not).
In general, thread safety means that the intended invariants/properties
of an abstraction cannot be undermined by any combination of concurrent
activity. It thus depends on the individual abstraction or data
structure and its intention. Hence there cannot be a general rule of
thumb, AFAICS.
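For instance, even though each individual operation on a ref cell is atomic, a counter bumped with

    fun inc counter = counter := !counter + 1

is not thread-safe: two concurrent calls may both read the same old value, so one increment is lost. Whether that breaks anything depends entirely on what the counter was supposed to guarantee.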
> Also, could you suggest an understandable
> reference describing Alice concurrency
> model (I mean something for programmers, not a research paper)?
Mh, the "Alice through the looking glass" paper should provide a
relatively readable informal description. Then there are the manual
pages and the Tour. For some more examples of concrete data structures,
see Gert Smolka's lecture slides for ISSCCL'99. I'm afraid there isn't
any more comprehensive material on Alice ML concurrency. But it has its
roots in Concurrent Logic Programming, and many examples are easy to
translate, see e.g. Chris Rathmann's translation of code from van
Roy/Haridi's CTM book:
http://www.codepoetics.com/wiki/index.php?title=Topics:CTM_in_other_languages:Alice_ML
>
> BTW, I have also written my first lazy-list-valued procedure:
>
> (*converts a binary file into a lazy list of chunks*)
> fun lazy binfile (fname, chunksize) =
>     let
>         val inp = BinIO.openIn fname
>         fun loop acc =
>             let
>                 val vec = BinIO.inputN (inp, chunksize)
>             in
>                 if Word8Vector.length vec = 0
>                 then (BinIO.closeIn inp; rev acc)
>                 else loop (vec :: acc)
>             end
>     in
>         loop []
>     end
>
> val s = app (print o Byte.bytesToString) (binfile (myfile, 1024))
>
> It seems fine, but I just wanted to make sure that my expectations are
> correct.
No, this doesn't do what you want. The only laziness it provides is
that the list is not built until you first request it. But once you do,
it is computed as a whole, because the inner loop isn't lazy at all.
Also note that the list could not be consumed lazily anyway, since it
has to be reversed before the first chunk is available.
Here is a more appropriate version:
fun lazyRead (name, chunkSize) =
    let
        val file = BinIO.openIn name
        fun lazy loop () =
            let
                val chunk = BinIO.inputN (file, chunkSize)
            in
                if Word8Vector.length chunk = 0
                then (BinIO.closeIn file; nil)
                else chunk :: loop ()
            end
    in
        loop ()
    end
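With this version, your original example works as intended and touches only one chunk at a time:

    val () = app (print o Byte.bytesToString) (lazyRead (myfile, 1024))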
> I expect "app" to loop
> on the lazy list without first building it entirely in memory, right?
> Idem for" fold", correct?
Yes (but not foldr, appr, or mapr, of course — those have to walk to
the end of the list first, forcing it entirely).
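For instance, given chunks = lazyRead (myfile, 1024), summing the chunk sizes with

    val total = foldl (fn (chunk, n) => n + Word8Vector.length chunk) 0 chunks

forces only one cons cell at a time, whereas foldr would first have to walk to the end of the list, forcing it completely before doing any work.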
> On the other hand, "map" would return a regular list, unless I define
> a lazy version of it, as
> described in the tutorial. I assume lazy versions of "map" and
> possibly other utilities are
> somewhere in the Alice library, right? Where should I look?
Actually, they aren't, although admittedly, they should be.
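Defining one yourself is easy with the lazy keyword, though (untested):

    fun lazy mapz f nil = nil
      | mapz f (x::xs) = f x :: mapz f xs

Applied to a lazy list, mapz forces one cell of its argument per cell demanded of its result.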
Cheers,
- Andreas
_______________________________________________
alice-users mailing list
[email protected]
http://www.ps.uni-sb.de/mailman/listinfo/alice-users