On Sat, 20 Jun 2009 11:29:44 +0100
Jon Harrop <j...@ffconsultancy.com> wrote:
> 
> On Saturday 20 June 2009 08:34:39 Konrad Hinsen wrote:
> > What's TPL?
> 
> The Task Parallel Library. It uses concurrent wait-free work-stealing
> queues to provide an efficient implementation of "work items" that can
> spawn other work items, with automatic load balancing on shared-memory
> machines. Cilk uses the same technology (well worth a look if you
> haven't already seen it!). That makes it easy to write efficient
> parallel algorithms in any .NET language. In particular, it can
> sustain billions of work items (as opposed to thousands of threads or
> processes), and spawning a work item is ~30,000x faster than forking a
> process. Extremely useful stuff!
> 

Interesting. It strikes me that it's called the "Task" Parallel Library,
yet it sounds a lot like Intel Threading Building Blocks, which is a
sort of STL-style, quasi-functional template library for *data*-parallel
algorithms; the kind of thing people like to write with Fortran, OpenMP
and friends.
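
For anyone who hasn't run into that style of API, here's roughly what
spawning lightweight work items looks like in TBB; task_group is the
closest C++ analogue to the work items Jon describes, and the naive
fib with no serial cutoff is purely illustrative:

#include <tbb/task_group.h>
#include <cstdio>

// Toy example: each call spawns two lightweight child tasks. The
// work-stealing scheduler balances them across the thread pool.
// Real code would fall back to a serial fib below some threshold.
long fib(long n) {
    if (n < 2) return n;
    long a = 0, b = 0;
    tbb::task_group g;
    g.run([&] { a = fib(n - 1); });  // enqueue a task, don't block
    g.run([&] { b = fib(n - 2); });
    g.wait();                        // join both children
    return a + b;
}

int main() {
    std::printf("%ld\n", fib(30));
}

Each run() pushes a task onto the calling thread's deque; idle workers
steal from the far end, which is where the automatic load balancing
comes from.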

TBB uses a work-stealing thread-pool scheduler as well, atop which
things like parallel maps and reductions are implemented as templates.
You can create your own work items and hand them to the scheduler
yourself, but the useful bits are really the prefab algorithms, IMO.
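
For instance, a parallel sum with the prefab parallel_reduce looks
something like this (the million-element vector is just a stand-in):

#include <tbb/parallel_reduce.h>
#include <tbb/blocked_range.h>
#include <vector>
#include <cstdio>

int main() {
    std::vector<double> v(1000000, 1.0);
    // parallel_reduce splits the range, sums each chunk with the
    // first lambda, and combines partial sums with the second.
    double total = tbb::parallel_reduce(
        tbb::blocked_range<size_t>(0, v.size()),
        0.0,
        [&](const tbb::blocked_range<size_t>& r, double acc) {
            for (size_t i = r.begin(); i != r.end(); ++i) acc += v[i];
            return acc;
        },
        [](double x, double y) { return x + y; });
    std::printf("%f\n", total);
}

All of the splitting and joining runs on the same work-stealing pool
underneath; you only supply the per-chunk body and the combiner.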

The tunable/pluggable "slicing" strategies, built on the standard
iterator concepts, are a particularly interesting way to give you full
control over work-unit granularity without having to know too much
about the innards of the algorithms themselves.
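
Concretely, the granularity knob is just the grain size on the range
plus the partitioner you pass in. A rough sketch (the 4096 grain size
is an arbitrary number, not a recommendation):

#include <tbb/parallel_for.h>
#include <tbb/blocked_range.h>
#include <tbb/partitioner.h>
#include <vector>

int main() {
    std::vector<float> v(1 << 20, 2.0f);
    // simple_partitioner splits the range all the way down to the
    // grain size (4096 elements here), so the caller decides the
    // work-unit granularity without touching the loop body.
    tbb::parallel_for(
        tbb::blocked_range<size_t>(0, v.size(), 4096),
        [&](const tbb::blocked_range<size_t>& r) {
            for (size_t i = r.begin(); i != r.end(); ++i) v[i] *= v[i];
        },
        tbb::simple_partitioner());
}

Swap in auto_partitioner (and drop the grain size) and the scheduler
picks chunk sizes for you; the loop body never changes.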

-Kyle
