I have been using the 'threading' library and decided to try swapping it out for 'processing'... while it's awesome that processing so closely mirrors the threading interface, I've been having trouble getting my processes to share an object in a similar way.
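To make the symptom concrete, here is a minimal sketch (my own reduction, using the stdlib multiprocessing module, which mirrors the processing API) of what I mean by objects not really being shared:

```python
from multiprocessing import Process

data = {'count': 0}  # a plain object, not a shared one

def worker():
    # Each child process works on its own copy of the parent's
    # memory, so this mutates the child's dict only. Under fork,
    # id(data) can even match the parent's, which is misleading.
    data['count'] += 1

if __name__ == '__main__':
    p = Process(target=worker)
    p.start()
    p.join()
    print(data['count'])  # still 0 in the parent
```

The matching id() values just mean the copies occupy the same virtual address in each process; they are not the same object.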
Using the 'with' keyword didn't work, and using normal locks doesn't result in the expected behavior: I can get an object to be accessible in more than one process, and Python indicates that the instances are living at the same address in memory, but changes in one process are not reflected in the other(s). I'm sure this is because my expectations are incorrect. :)

The problem, as briefly as possible: I have three processes which need to safely read and update two objects. I've been working with processing, multiprocessing, and parallel python, trying to get this working. I suspect it can be accomplished with managers and/or queues, but if there's an elegant way to handle it, I have thus far failed to understand it.

I don't particularly care which library I use; if someone has done this or can recommend a good method they're aware of, it'd be incredibly helpful. Thank you!

--
http://mail.python.org/mailman/listinfo/python-list
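For what it's worth, here is the kind of manager-based approach I've been suspecting might work: a sketch only, assuming the documented multiprocessing.Manager API (the dict, the lock, and the three-worker structure are my own illustration, not code I have running):

```python
from multiprocessing import Process, Manager

def worker(shared, lock):
    # All access goes through the manager's proxy objects, so
    # updates made under the lock are visible to every process.
    with lock:
        shared['count'] += 1

if __name__ == '__main__':
    with Manager() as manager:
        shared = manager.dict(count=0)  # proxied, genuinely shared
        lock = manager.Lock()
        procs = [Process(target=worker, args=(shared, lock))
                 for _ in range(3)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        print(shared['count'])  # 3 -- all three updates were seen
```

Presumably the same pattern extends to two shared objects and three processes, at the cost of every access being a round trip to the manager process.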