Quoting Cameron Simpson :
> | And as it happens I have an IterableQueue class right here which does
> | _exactly_ what was just described. You're welcome to it if you like.
> | Added bonus is that, as the name suggests, you can use the class as
> | an iterator:
> |     for item in iterq:
> |         ...
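Cameron's IterableQueue isn't shown anywhere in the thread; a minimal sketch of what such a class might look like (the `_SENTINEL` marker and the `close()` method are my guesses from the description, not his actual code):

```python
import queue

_SENTINEL = object()  # assumed internal end-of-stream marker

class IterableQueue(queue.Queue):
    """A Queue that can be iterated over; iteration ends after close()."""

    def close(self):
        # Signal iterating consumers that no more items are coming.
        self.put(_SENTINEL)

    def __iter__(self):
        while True:
            item = self.get()
            if item is _SENTINEL:
                # Put the marker back so other iterators also terminate.
                self.put(_SENTINEL)
                return
            yield item

iterq = IterableQueue()
for n in (1, 2, 3):
    iterq.put(n)
iterq.close()
collected = [item for item in iterq]
print(collected)  # [1, 2, 3]
```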
On Monday 04 May 2009 04:01:23 am Hendrik van Rooyen wrote:
> This will form a virtual (or real if you have different machines)
> systolic array with producers feeding consumers that feed
> the summary process, all running concurrently.
Nah, I can't do that. The summary process is expensive, but
"Luis Alberto Zarrabeitia Gomez" wrote:
>Quoting Hendrik van Rooyen :
>> In fact I happen to believe that anything that does any work needs
>> one and only one input queue and nothing else, but I am peculiar
>> that way.
>
>Well, I also need some output. In my case, the outputs are files with th
> You may have to write the consumer loop by hand, rather than using
> 'for'. In the same-process case, you can do this.
>
> producer:
>     sentinel = object()
>
> consumer:
>     while True:
>         item = queue.get()
>         if item is sentinel:
>             break
>         etc.
>
> Then, each consumer is guaranteed to con
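Fleshed out into a runnable single-producer, single-consumer sketch (thread-based here; the names and the doubling "work" are illustrative, not from the thread):

```python
import queue
import threading

q = queue.Queue()
sentinel = object()      # unique end-of-stream marker
results = []

def producer():
    for item in range(5):
        q.put(item)
    q.put(sentinel)      # no more items coming

def consumer():
    while True:
        item = q.get()
        if item is sentinel:
            break        # do any clean-up work here, then exit
        results.append(item * 2)   # stand-in for the real processing

threads = [threading.Thread(target=producer),
           threading.Thread(target=consumer)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)  # [0, 2, 4, 6, 8]
```

Because `sentinel` is a fresh `object()`, it can never collide with real data the way a value like `None` might.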
Quoting Dennis Lee Bieber :
> I'm not familiar with the multiprocessing module and its queues but,
> presuming it behaves similar to the threading module AND that you have
> design control over the consumers (as you did in the sample) make a
> minor change.
>
> queue.put(None) ONCE i
Quoting Hendrik van Rooyen :
> "Luis Zarrabeitia" wrote:
>
> 8< ---explanation and example of one producer,
> 8< ---more consumers and one queue
>
> >As you can see, I'm sending one 'None' per consumer, and hoping that no
> >consumer will read more than
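Luis's "one None per consumer" scheme can be sketched like this (threads stand in for the multiprocessing workers to keep the example self-contained; the queue semantics are the same):

```python
import queue
import threading

NUM_CONSUMERS = 3
q = queue.Queue()
consumed = []
lock = threading.Lock()

def consumer():
    while True:
        item = q.get()
        if item is None:
            break              # this consumer's own sentinel
        with lock:
            consumed.append(item)

workers = [threading.Thread(target=consumer) for _ in range(NUM_CONSUMERS)]
for w in workers:
    w.start()

for item in range(10):         # produce the actual data
    q.put(item)
for _ in range(NUM_CONSUMERS): # then one None per consumer
    q.put(None)

for w in workers:
    w.join()
print(sorted(consumed))  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```

Because the Nones are enqueued only after all real items, each consumer breaks on the first None it sees, so no consumer can swallow two.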
Hendrik van Rooyen wrote:
: "Roel Schroeven" wrote:
Hendrik van Rooyen schreef:
I have always wondered why people do the one queue many getters thing.
Because IMO it's the simplest and most elegant solution.
That is fair enough...
Given that the stuff you pass is hom
Hendrik van Rooyen schreef:
> : "Roel Schroeven" wrote:
>> ...
> This is all true in the case of a job that starts, runs and finishes.
> I am not so sure it applies to something that has a long life.
It's true that I'm talking about work units with relatively short
lifetimes, mostly a few secon
: "Roel Schroeven" wrote:
> Hendrik van Rooyen schreef:
> > I have always wondered why people do the one queue many getters thing.
>
> Because IMO it's the simplest and most elegant solution.
That is fair enough...
> >
> > Given that the stuff you pass is homogenous in that it will require a
Hendrik van Rooyen schreef:
> I have always wondered why people do the one queue many getters thing.
Because IMO it's the simplest and most elegant solution.
>
> Given that the stuff you pass is homogenous in that it will require a
> similar amount of effort to process, is there not a case to be
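The alternative being hinted at, one dedicated input queue per worker with the producer dispatching round-robin, might look like this thread-based sketch; as the caveat above says, it only balances the load well when items cost a similar amount of effort:

```python
import itertools
import queue
import threading

NUM_WORKERS = 3
queues = [queue.Queue() for _ in range(NUM_WORKERS)]
results = [[] for _ in range(NUM_WORKERS)]

def worker(i):
    while True:
        item = queues[i].get()
        if item is None:
            break                  # per-worker sentinel
        results[i].append(item)    # stand-in for the real work

threads = [threading.Thread(target=worker, args=(i,))
           for i in range(NUM_WORKERS)]
for t in threads:
    t.start()

dispatch = itertools.cycle(range(NUM_WORKERS))  # round-robin dispatch
for item in range(9):
    queues[next(dispatch)].put(item)
for q in queues:
    q.put(None)

for t in threads:
    t.join()
print(results)  # [[0, 3, 6], [1, 4, 7], [2, 5, 8]]
```

With a single shared queue, by contrast, a fast worker automatically picks up the slack for a slow one.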
On Apr 30, 3:49 pm, Luis Zarrabeitia wrote:
> Hi. I'm building a script that closely follows a producer-consumer model. In
> this case, the producer is disk-bound and the consumer is cpu-bound, so I'm
> using the multiprocessing module (python2.5 with the multiprocessing backport
> from google.cod
"Luis Zarrabeitia" wrote:
8< ---explanation and example of one producer,
8< ---more consumers and one queue
>As you can see, I'm sending one 'None' per consumer, and hoping that no
>consumer will read more than one None. While this particular implementatio
On 01May2009 08:37, I wrote:
| On 30Apr2009 22:57, MRAB wrote:
| > The producer could send just one None to indicate that it has finished
| > producing.
| > Each consumer could get the data from the queue, but if it's None then
| > put it back in the queue for the other consumer, then clean up and
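MRAB's put-it-back variant, sketched with threads (the multiprocessing version is analogous): the producer sends a single None, and whichever consumer receives it re-enqueues it for the others before exiting:

```python
import queue
import threading

NUM_CONSUMERS = 3
q = queue.Queue()
consumed = []
lock = threading.Lock()

def consumer():
    while True:
        item = q.get()
        if item is None:
            q.put(None)        # pass the signal on to the other consumers
            break              # clean up and exit
        with lock:
            consumed.append(item)

workers = [threading.Thread(target=consumer) for _ in range(NUM_CONSUMERS)]
for w in workers:
    w.start()

for item in range(10):
    q.put(item)
q.put(None)                    # just one None from the producer

for w in workers:
    w.join()
print(sorted(consumed))  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```

One None is left sitting in the queue afterwards, which is harmless here but worth remembering if the queue is reused.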
On 30Apr2009 22:57, MRAB wrote:
> Luis Zarrabeitia wrote:
>> The problem: when there is no more data to process, how can I signal
>> the consumers to consume until the queue is empty and then stop
>> consuming? I need them to do some clean-up work after they finish (and
>> then I need the main
Luis Zarrabeitia wrote:
Hi. I'm building a script that closely follows a producer-consumer model. In
this case, the producer is disk-bound and the consumer is cpu-bound, so I'm
using the multiprocessing module (python2.5 with the multiprocessing backport
from google.code) to speed up the proces
Hi. I'm building a script that closely follows a producer-consumer model. In
this case, the producer is disk-bound and the consumer is cpu-bound, so I'm
using the multiprocessing module (python2.5 with the multiprocessing backport
from google.code) to speed up the processing (two consumers, one