[issue8426] multiprocessing.Queue fails to get() very large objects

2015-11-26 Thread Serhiy Storchaka
Changes by Serhiy Storchaka: -- resolution: -> not a bug stage: -> resolved status: pending -> closed ___ Python tracker ___


2015-11-12 Thread Serhiy Storchaka
Serhiy Storchaka added the comment: I agree with Charles-François and think this issue should be closed. There is no bug, and the behavior is documented. -- nosy: +serhiy.storchaka status: open -> pending


2011-08-28 Thread Charles-François Natali
Changes by Charles-François Natali neolo...@free.fr: -- components: +Documentation -Library (Lib) nosy: +docs@python priority: normal -> low


2011-08-27 Thread Vinay Sajip
Vinay Sajip vinay_sa...@yahoo.co.uk added the comment: I think it's just a documentation issue. The problem with documenting limits is that they are system-specific and, even if the current limits that Charles-François has mentioned are documented, these could become outdated. Perhaps a


2011-08-27 Thread Charles-François Natali
Charles-François Natali neolo...@free.fr added the comment: Avoid sending very large amounts of data via queues, as you could come up against system-dependent limits according to the operating system and whether pipes or sockets are used. You could consider an alternative strategy, such
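One such alternative strategy can be sketched as follows (a hypothetical illustration, not code from the issue; the worker function and payload are invented): spill the large result to a temporary file and pass only the short file path through the queue, so the queue's feeder thread never has to push a huge pickle through the pipe.

```python
import multiprocessing as mp
import os
import pickle
import tempfile

def worker(q):
    result = list(range(1_000_000))  # stand-in for a large result object
    # Write the payload to disk and send only the (small) path through the
    # queue, so the pipe-buffer limit is never an issue.
    fd, path = tempfile.mkstemp()
    with os.fdopen(fd, "wb") as f:
        pickle.dump(result, f)
    q.put(path)

if __name__ == "__main__":
    q = mp.Queue()
    p = mp.Process(target=worker, args=(q,))
    p.start()
    path = q.get()
    p.join()
    with open(path, "rb") as f:
        result = pickle.load(f)
    os.unlink(path)
    print(len(result))  # 1000000
```

Shared memory or a memory-mapped file would serve the same purpose when disk round-trips are too slow.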


2011-05-09 Thread Philip Semanchuk
Changes by Philip Semanchuk osvens...@users.sourceforge.net: -- nosy: +osvenskan


2011-05-05 Thread Charles-François Natali
Charles-François Natali neolo...@free.fr added the comment: > You can not pickle individual objects larger than 2**31. Indeed, but that's not what's happening here: the failure occurs with much smaller objects (also, note that the OP's cPickle is perfectly capable of pickling these objects). The


2011-04-24 Thread Matt Goodman
Matt Goodman meawo...@gmail.com added the comment: You can not pickle individual objects larger than 2**31. This failure is not handled cleanly in the core module, and I suspect it is masked by the above processes. Try piping a*(2**31) through your pipe, or pickling it to disk . . . -- nosy:


2011-04-19 Thread Charles-Francois Natali
Charles-Francois Natali neolo...@free.fr added the comment: > IMO, it would be nice if I could ask my queue, Just what is your capacity (in bytes, not entries) anyways? I want to know how much I can put in here without worrying about whether the remote side is dequeueing. I guess I'd settle for
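For what it's worth, there is no such query in the Queue API, but on Linux the underlying pipe's capacity can be asked of the OS directly via the F_GETPIPE_SZ fcntl (a Linux-specific sketch; the numeric fallback constant is an assumption for Python versions before 3.10, which do not expose the name):

```python
import fcntl
import os

# F_GETPIPE_SZ is Linux-specific (kernel >= 2.6.35); the fcntl module only
# exposes the name in Python 3.10+, so fall back to the raw constant.
F_GETPIPE_SZ = getattr(fcntl, "F_GETPIPE_SZ", 1032)

r, w = os.pipe()
capacity = fcntl.fcntl(w, F_GETPIPE_SZ)  # pipe capacity in bytes
os.close(r)
os.close(w)
print(capacity)  # 65536 on a default Linux configuration
```

This is the system-dependent limit the thread keeps running into; a queue entry whose pickle exceeds it forces the feeder thread to block until a reader drains the pipe.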


2011-04-18 Thread Brian Cain
Brian Cain brian.c...@gmail.com added the comment: Please don't close the issue. Joining aside, the basic point (But when size = 7279, the data submitted reaches 64k, so the writing thread blocks on the write syscall.) is not clear from the docs, right? IMO, it would be nice if I could ask my


2011-04-18 Thread Terry J. Reedy
Changes by Terry J. Reedy tjre...@udel.edu: Removed file: http://bugs.python.org/file21709/unnamed


2011-04-18 Thread Terry J. Reedy
Terry J. Reedy tjre...@udel.edu added the comment: Please do not send html responses, as they result in a spurious 'unnamed' file being attached. Please do suggest a specific change. Should this be changed to a doc issue? --


2011-04-13 Thread Charles-Francois Natali
Charles-Francois Natali neolo...@free.fr added the comment: It's documented in http://docs.python.org/library/multiprocessing.html#multiprocessing-programming, under "Joining processes that use queues": Bear in mind that a process that has put items in a queue will wait before terminating until all


2011-02-23 Thread Charles-Francois Natali
Charles-Francois Natali neolo...@free.fr added the comment: Alright, it's normal behaviour, but since it doesn't seem to be documented, it can be quite surprising. A queue works like this: - when you call queue.put(data), the data is added to a deque, which can grow and shrink forever - then a
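The mechanics described above can be sketched with a minimal example (a hypothetical illustration, not code from the issue): because put() only appends to the internal deque while a background feeder thread does the actual pipe write, the parent must drain the queue before joining, otherwise the feeder thread sits blocked in write() and join() deadlocks.

```python
import multiprocessing as mp

def producer(q):
    # put() appends to an internal deque and returns immediately; a
    # background "feeder" thread then writes the pickled data to the pipe.
    q.put(b"x" * (1 << 20))  # 1 MiB, far beyond a typical 64 KiB pipe buffer

if __name__ == "__main__":
    q = mp.Queue()
    p = mp.Process(target=producer, args=(q,))
    p.start()
    data = q.get()  # drain the queue BEFORE joining...
    p.join()        # ...joining first can deadlock: the feeder thread blocks
                    # in write() until the parent reads from the pipe
    print(len(data))  # 1048576
```

Swapping the q.get() and p.join() lines reproduces the hang the thread describes.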


2010-12-08 Thread Brian Cain
Brian Cain brian.c...@gmail.com added the comment: I don't think the problem is limited to when hundreds of megabytes are being transmitted. I believe I am experiencing a problem with the same root cause whose symptoms are slightly different. It seems like there's a threshold which causes


2010-12-08 Thread Terry J. Reedy
Terry J. Reedy tjre...@udel.edu added the comment: 2.6.6 was the last bugfix release -- type: crash -> behavior versions: -Python 2.6


2010-12-08 Thread Brian Cain
Brian Cain brian.c...@gmail.com added the comment: I was able to reproduce the problem on a more recent release. 7279 entries fails, 7278 entries succeeds. $ ./multiproc3.py on 3.1.2 (r312:79147, Apr 15 2010, 12:35:07) [GCC 4.4.3] - Linux mini 2.6.32-26-generic #47-Ubuntu SMP Wed Nov 17
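The roughly 64 KiB threshold Brian is hitting can be measured empirically (a POSIX-only sketch, not from the thread): put a pipe in non-blocking mode and count how many bytes it accepts before write() would block.

```python
import os

def pipe_capacity():
    """Fill a pipe with non-blocking writes and report how many bytes it
    accepted before write() would have blocked (POSIX-only sketch)."""
    r, w = os.pipe()
    os.set_blocking(w, False)  # make writes raise instead of blocking
    total = 0
    chunk = b"x" * 4096
    try:
        while True:
            total += os.write(w, chunk)
    except BlockingIOError:
        pass  # the pipe is full: this is the system-dependent capacity
    finally:
        os.close(r)
        os.close(w)
    return total

if __name__ == "__main__":
    print(pipe_capacity())  # commonly 65536: the limit behind the 7278/7279 cliff
```

A queue entry whose pickled size crosses this capacity makes the feeder thread block, which is why the failure appears at a sharp, size-dependent boundary rather than only with huge objects.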


2010-12-08 Thread Brian Cain
Brian Cain brian.c...@gmail.com added the comment: Detailed stack trace when the failure occurs (gdb_stack_trace.txt) -- Added file: http://bugs.python.org/file19983/gdb_stack_trace.txt


2010-08-05 Thread Terry J. Reedy
Terry J. Reedy tjre...@udel.edu added the comment: By 'crash', do you actually mean 'hang'? Jesse, is it reasonable to stuff pickles of 100s of megabytes through the connections? -- nosy: +terry.reedy versions: -Python 2.6


2010-08-05 Thread Jesse Noller
Jesse Noller jnol...@gmail.com added the comment: I don't know that it's unreasonable to send that much data, but it would certainly be slow, and I would not recommend it. Therefore, this is still on the list for when I have time --


2010-04-18 Thread Antoine Pitrou
Changes by Antoine Pitrou pit...@free.fr: -- assignee: -> jnoller nosy: +jnoller priority: -> normal versions: +Python 2.7, Python 3.1, Python 3.2 -Python 2.5


2010-04-16 Thread Ian Davis
New submission from Ian Davis ian.w.da...@gmail.com: I'm trying to parallelize some scientific computing jobs using multiprocessing.Pool. I've also tried rolling my own Pool equivalent using Queues. In trying to return very large result objects from Pool.map()/imap() or via Queue.put(),
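The reporter's setup can be reduced to a minimal sketch (the compute function and payload sizes are invented for illustration; with results this small the program completes, while results of hundreds of megabytes trigger the behavior discussed elsewhere in the thread):

```python
import multiprocessing as mp

def compute(n):
    # hypothetical stand-in for a scientific job returning a sizeable result
    return bytes(1024 * n)

if __name__ == "__main__":
    with mp.Pool(processes=2) as pool:
        results = pool.map(compute, [1, 2, 3])
    print([len(r) for r in results])  # [1024, 2048, 3072]
```

Pool.map() collects each worker's pickled return value over the same pipe/queue machinery, so very large return objects run into the same system-dependent transport limits as Queue.put().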