On Wed, Jul 16, 2014 at 6:32 AM, Charles Hixson wrote:
> from queue import Empty, Full
Not sure what this is for, you never use those names (and I don't have
a 'queue' module to import from). Dropped that line. In any case, I
don't think it's your problem...
> if __name__ == "__main__":
> db
I don't think I can reduce it much beyond this. I'm trying to run
Sqlite in a separate process, but I'm running into problems.
*The code:*
from collections import namedtuple
from multiprocessing import Process, Queue, current_process
from queue import Empty, Full
Msg = namedtuple(
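Charles's snippet cuts off mid-definition, but the intent (a dedicated process that owns the SQLite connection, fed through queues) can be sketched as below. The Msg fields, the db_worker name, and the "stop" command are my own assumptions, not from his post:

```python
import sqlite3
from collections import namedtuple
from multiprocessing import Process, Queue

# Hypothetical message format; the original Msg fields were cut off.
Msg = namedtuple("Msg", ["cmd", "sql", "params"])

def db_worker(inqueue, outqueue):
    # The sqlite3 connection must be created inside the worker process;
    # connections cannot be shared across processes.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE kv (k TEXT, v TEXT)")
    while True:
        msg = inqueue.get()           # block until a request arrives
        if msg.cmd == "stop":
            break
        cur = conn.execute(msg.sql, msg.params)
        outqueue.put(cur.fetchall())  # send row list back to the parent
    conn.close()

if __name__ == "__main__":
    requests, results = Queue(), Queue()
    p = Process(target=db_worker, args=(requests, results))
    p.start()
    requests.put(Msg("query", "INSERT INTO kv VALUES (?, ?)", ("a", "1")))
    requests.put(Msg("query", "SELECT v FROM kv WHERE k = ?", ("a",)))
    results.get()                     # rows from the INSERT (empty list)
    rows = results.get()              # rows from the SELECT
    requests.put(Msg("stop", None, None))
    p.join()
    print(rows)                       # [('1',)]
```

The key point is that only picklable Msg tuples cross the process boundary; the connection itself never does.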
Hello Matt
I think the problem is here:
for n in xrange(10):
    outqueue.put(str(n))    # <-- fill the queue with 10 elements
try:
    r = inqueue.get_nowait()    # <-- queue is still empty because the processes need some time to start
    r
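That startup race can be sidestepped with a blocking get instead of get_nowait(). A minimal sketch (the worker body and queue names are illustrative, not from the thread):

```python
from multiprocessing import Process, Queue

def worker(inqueue, outqueue):
    # Echo one item back, uppercased; a real worker would loop.
    outqueue.put(inqueue.get().upper())

if __name__ == "__main__":
    outq, inq = Queue(), Queue()
    p = Process(target=worker, args=(outq, inq))
    p.start()
    outq.put("hello")
    # inq.get_nowait() here would usually raise queue.Empty: the child
    # process has not started working yet.  A blocking get with a
    # timeout waits for the worker instead of racing it.
    r = inq.get(timeout=10)
    p.join()
    print(r)   # HELLO
```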
If the main process doesn't get the results from the queue until the
worker processes terminate, and the worker processes don't terminate
until they've put their results in the queue, and the pipe consequently
fills up, then deadlock can result.
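One way out of that deadlock is to drain the results queue before joining the workers. A sketch under the assumption that each worker puts exactly one (possibly large) result; the payload size and worker count are placeholders of mine:

```python
from multiprocessing import Process, Queue

def worker(n, outqueue):
    # A large result can fill the underlying pipe; the worker then blocks
    # in put() and never exits, so join() in the parent would deadlock.
    outqueue.put((n, "x" * 100_000))

if __name__ == "__main__":
    results = Queue()
    procs = [Process(target=worker, args=(i, results)) for i in range(4)]
    for p in procs:
        p.start()
    # Drain the queue *before* joining: one get() per expected result.
    collected = [results.get() for _ in procs]
    for p in procs:
        p.join()
    print(sorted(n for n, _ in collected))   # [0, 1, 2, 3]
```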
The queue never fills up... on platforms with qsiz
On 3/2/2010 3:59 PM, Matt Chaput wrote:
Hi,
I'm having a problem with the multiprocessing package.
I'm trying to use a simple pattern where a supervisor object starts a
bunch of worker processes, instantiating them with two queues (a job
queue for tasks to complete and a results queue for the results). The
supervisor puts all the jobs in the "job
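The supervisor/worker pattern Matt describes can be sketched with sentinel values to shut the workers down cleanly. The squaring job, the worker count, and the None sentinel are placeholders of mine, not his code:

```python
from multiprocessing import Process, Queue

SENTINEL = None   # tells a worker there is no more work

def worker(jobqueue, resultqueue):
    # iter(get, SENTINEL) keeps pulling jobs until the sentinel arrives.
    for job in iter(jobqueue.get, SENTINEL):
        resultqueue.put(job * job)   # stand-in for real work

if __name__ == "__main__":
    jobs, results = Queue(), Queue()
    workers = [Process(target=worker, args=(jobs, results)) for _ in range(3)]
    for w in workers:
        w.start()
    for job in range(10):
        jobs.put(job)
    for _ in workers:                # one sentinel per worker
        jobs.put(SENTINEL)
    # Collect results before join() to avoid the full-pipe deadlock.
    out = sorted(results.get() for _ in range(10))
    for w in workers:
        w.join()
    print(out)   # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```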
Wu Zhe wrote:
I am writing a server program with one producer and multiple consumers.
What confuses me is that only the first task the producer puts into the
queue gets consumed; tasks enqueued after that never get consumed and
remain in the queue forever.
from multiprocessing import Process, Pool, Queue, cpu_count
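A consumer that returns after a single get() would show exactly that symptom; one that loops until a sentinel arrives keeps draining the queue. A sketch of that shape (the doubling task and STOP marker are illustrative, not Wu Zhe's code):

```python
from multiprocessing import Process, Queue, cpu_count

STOP = "STOP"

def consumer(tasks, done):
    # Loop until the sentinel arrives; a consumer that returns after a
    # single get() would leave later tasks in the queue forever.
    for task in iter(tasks.get, STOP):
        done.put((task, task * 2))

if __name__ == "__main__":
    tasks, done = Queue(), Queue()
    consumers = [Process(target=consumer, args=(tasks, done))
                 for _ in range(cpu_count())]
    for c in consumers:
        c.start()
    for t in range(8):               # the producer
        tasks.put(t)
    for _ in consumers:              # one sentinel per consumer
        tasks.put(STOP)
    results = sorted(done.get() for _ in range(8))
    for c in consumers:
        c.join()
    print(len(results))   # 8
```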