Re: Sharing objects between processes

2009-03-09 Thread ET
 Message: 1
 Date: Sun, 08 Mar 2009 18:47:09 +
 From: Tim Golden m...@timgolden.me.uk
 Subject: Re: Sharing objects between processes
 Cc: python-list@python.org
 Message-ID: 49b412ad.1030...@timgolden.me.uk
 Content-Type: text/plain; charset=ISO-8859-1; format=flowed
 
 ET wrote:
  Using the 'with' keyword didn't work...
 
 Just an aside here for any multiprocessing maintainers
 watching ;) . I expect that the "didn't work" here
 refers to this bug:
 
   http://bugs.python.org/issue5261
 
 Altho' if the OP cares to clarify, it might be something
 else.
 
 TJG

Sorry; unfortunately, it's been a while since I ran into this, and I
don't recall what the particular problem was (there's been too much
rewriting with different libraries since then to reproduce it quickly).
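
If the bug Tim points at is what bit the OP, a version-independent
fallback is to skip 'with' on the lock entirely and pair acquire() and
release() by hand.  A minimal sketch, using nothing beyond the stock
multiprocessing API:

import multiprocessing

def demo(lock):
    # Explicit acquire/release works even where the context-manager
    # protocol on multiprocessing locks is broken or unavailable.
    lock.acquire()
    try:
        print('got the lock in %s' % multiprocessing.current_process().name)
    finally:
        lock.release()

if __name__ == '__main__':
    lock = multiprocessing.Lock()
    p = multiprocessing.Process(target=demo, args=(lock,))
    p.start()
    p.join()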

--
http://mail.python.org/mailman/listinfo/python-list


Re: Sharing objects between processes

2009-03-09 Thread ET
 Message: 2
 Date: Sun, 8 Mar 2009 12:00:40 -0700 (PDT)
 From: Aaron Brady castiro...@gmail.com
 Subject: Re: Sharing objects between processes
 To: python-list@python.org
 Message-ID:
   5514c3df-d74e-47d8-93fc-34dd5119e...@c11g2000yqj.googlegroups.com
 Content-Type: text/plain; charset=ISO-8859-1
 
 On Mar 8, 1:36 pm, ET p...@2drpg.org wrote:
  I have been using the 'threading' library and decided to try swapping it
  out for 'processing'... while it's awesome that processing so closely
  mirrors the threading interface, I've been having trouble getting my
  processes to share an object in a similar way.
 
  Using the 'with' keyword didn't work, and using normal locks doesn't
  result in the expected behavior (I can get an object to be accessible in
  more than one process, and Python indicates that the instances are
  living at the same address in memory, but changes in one process are not
  reflected in the other[s]).  I'm sure this is because my expectations
  are incorrect. :)
 
  The problem, as briefly as possible:
  I have three processes which need to safely read and update two objects.
 
  I've been working with processing, multiprocessing, and parallel python,
  trying to get this working... I suspect it can be accomplished with
  managers and/or queues, but if there's an elegant way to handle it, I
  have thus far failed to understand it.
 
  I don't particularly care which library I use; if someone has done this
  or can recommend a good method they're aware of, it'd be incredibly
  helpful.
 
  Thank you!
 
 There is POSH: Python Object Sharing, which I learned about a while
 ago, but never used much.
 
 http://poshmodule.sourceforge.net/
 
 It's UNIX only.

Thanks, I'll definitely keep that link handy... unfortunately, this
particular project needs to run on Windows as well as Linux-based
systems.

--
http://mail.python.org/mailman/listinfo/python-list


Re: Sharing objects between processes

2009-03-09 Thread Aaron Brady
On Mar 9, 12:47 pm, ET p...@2drpg.org wrote:
  Message: 2
  Date: Sun, 8 Mar 2009 12:00:40 -0700 (PDT)
  From: Aaron Brady castiro...@gmail.com
  Subject: Re: Sharing objects between processes
  To: python-l...@python.org
  Message-ID:
     5514c3df-d74e-47d8-93fc-34dd5119e...@c11g2000yqj.googlegroups.com
  Content-Type: text/plain; charset=ISO-8859-1

  On Mar 8, 1:36?pm, ET p...@2drpg.org wrote:
   I have been using the 'threading' library and decided to try swapping it
   out for 'processing'... while it's awesome that processing so closely
   mirrors the threading interface, I've been having trouble getting my
   processes to share an object in a similar way.

   Using the 'with' keyword didn't work, and using normal locks doesn't
   result in the expected behavior (I can get an object to be accessible in
   more than one process, and Python indicates that the instances are
   living at the same address in memory, but changes in one process are not
   reflected in the other[s]). ?I'm sure this is because my expectations
   are incorrect. :)

   The problem, as briefly as possible:
   I have three processes which need to safely read and update two objects.

   I've been working with processing, multiprocessing, and parallel python,
   trying to get this working... I suspect it can be accomplished with
   managers and/or queues, but if there's an elegant way to handle it, I
   have thus far failed to understand it.

   I don't particularly care which library I use; if someone has done this
   or can recommend a good method they're aware of, it'd be incredibly
   helpful.

   Thank you!

  There is POSH: Python Object Sharing, which I learned about a while
  ago, but never used much.

 http://poshmodule.sourceforge.net/

  It's UNIX only.

 Thanks, I'll definitely keep that link handy... unfortunately, this
 particular project needs to run on Windows as well as Linux-based
 systems.

I don't recall whether there was anything in the source that's a deal-
breaker on Windows.  The source is open, as you could see.

Other possibilities are 'shelve' and any database... fixed-length
pickles, a directory of pickles, etc.  Maybe 'multiprocessing' would
work for your synchronization, while you use a more custom technique
for data exchange.
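
To make the 'shelve' route concrete, a minimal sketch: one shelf file
as the shared store, with a multiprocessing.Lock serializing access.
The file name and keys are illustrative, and it assumes a
multiprocessing where 'with' works on its locks:

import shelve
from multiprocessing import Lock, Process

def update(lock, key, value):
    # Serialize access to the shelf; shelve itself is not safe
    # for concurrent writers.
    with lock:
        db = shelve.open('shared_state')  # illustrative file name
        try:
            db[key] = value
        finally:
            db.close()

if __name__ == '__main__':
    lock = Lock()
    workers = [Process(target=update, args=(lock, str(i), i))
               for i in range(3)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    db = shelve.open('shared_state')
    print(dict(db))
    db.close()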

The only other thing I can do is bring to your attention an idea of
mine for sharing primitives.  It's in the drawing board stage if you
want to help.

Of course, that's only once you've ruled out 'multiprocessing' and
'POSH'.  It does things they don't, and vice versa.
--
http://mail.python.org/mailman/listinfo/python-list


Re: Sharing objects between processes

2009-03-09 Thread ET
On Mon, 2009-03-09 at 11:04 -0700, Aaron Brady wrote:
 On Mar 9, 12:47 pm, ET p...@2drpg.org wrote:
  snip
 
 I don't recall whether there was anything in the source that's a deal-
 breaker on Windows.  The source is open, as you could see.
 
 Other possibilities are 'shelve' and any database... fixed-length
 pickles, a directory of pickles, etc.  Maybe 'multiprocessing' would
 work for your synchronization, while you use a more custom technique
 for data exchange.
 
 The only other thing I can do is bring to your attention an idea of
 mine for sharing primitives.  It's in the drawing board stage if you
 want to help.
 
 Of course, that's only once you've ruled out 'multiprocessing' and
 'POSH'.  It does things they don't, and vice versa.
 --
 http://mail.python.org/mailman/listinfo/python-list

I assumed it wouldn't work in Windows as you mentioned it was UNIX-only;
the readme also states that it's for POSIX systems only.

I'd be more than happy to use multiprocessing; I've attempted to do so.
My question is largely how to implement it, as I have not managed to get
it working despite several attempts from different angles.

Unfortunately, I do need to handle more than primitives, otherwise I'd
have attempted to use the shared ctypes present in at least one of
processing/multiprocessing/parallel python.

--
http://mail.python.org/mailman/listinfo/python-list


Re: Sharing objects between processes

2009-03-09 Thread Aaron Brady
On Mar 9, 2:17 pm, ET p...@2drpg.org wrote:
 On Mon, 2009-03-09 at 11:04 -0700, Aaron Brady wrote:
  On Mar 9, 12:47 pm, ET p...@2drpg.org wrote:
   snip

 I assumed it wouldn't work in Windows as you mentioned it was UNIX-only;
 the readme also states that it's for POSIX systems only.

 I'd be more than happy to use multiprocessing; I've attempted to do so.
 My question is largely how to implement it, as I have not managed to get
 it working despite several attempts from different angles.

 Unfortunately, I do need to handle more than primitives, otherwise I'd
 have attempted to use the shared ctypes present in at least one of
 processing/multiprocessing/parallel python.

Here's what we have to work with from you:

 The problem, as briefly as possible:
  I have three processes which need to safely read and update two objects.

Can they be subprocesses of each other?  That is, can one master spawn
the others as you desire?  Can you have one process running, and
connect to it with sockets, pipes, (mailslots,) etc., and just pass
information to and from it?  Then, synch. is a lot easier.

Do you need MROW multiple-reader one-writer synchro., or can they all
go one at a time?  Is deadlock a concern?  Can you use OBL(OE) one big
lock over everything, or do you need individual locks on elements of
the data structure?

Can you use fire-and-forget access, or do you need return values from
your calls?  Do you need to wait for completion of anything?

'xmlrpc': remote procedure calls might pertain.
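
To illustrate the single-owner design these questions point toward, a
minimal sketch: one process owns the dictionary, and the others talk to
it over a Pipe.  The command tuples are invented for illustration:

from multiprocessing import Pipe, Process

def owner(conn):
    # One process owns the data; others send commands to it.
    data = {}
    while True:
        msg = conn.recv()
        if msg is None:              # fire-and-forget shutdown
            break
        op, key, value = msg
        if op == 'set':              # a call with no return value
            data[key] = value
        elif op == 'get':            # a call that sends a reply back
            conn.send(data.get(key))

if __name__ == '__main__':
    parent, child = Pipe()
    p = Process(target=owner, args=(child,))
    p.start()
    parent.send(('set', 'name', 'ET'))
    parent.send(('get', 'name', None))
    print(parent.recv())             # prints: ET
    parent.send(None)
    p.join()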
--
http://mail.python.org/mailman/listinfo/python-list


Re: Sharing objects between processes

2009-03-09 Thread ET
On Mon, 2009-03-09 at 13:58 -0700, Aaron Brady wrote:
 On Mar 9, 2:17 pm, ET p...@2drpg.org wrote:
  On Mon, 2009-03-09 at 11:04 -0700, Aaron Brady wrote:
   On Mar 9, 12:47 pm, ET p...@2drpg.org wrote:
snip
 
 Here's what we have to work with from you:
 
  The problem, as briefly as possible:
   I have three processes which need to safely read and update two objects.
 
 Can they be subprocesses of each other?  That is, can one master spawn
 the others as you desire?  Can you have one process running, and
 connect to it with sockets, pipes, (mailslots,) etc., and just pass
 information to and from it?  Then, synch. is a lot easier.
 
 Do you need MROW multiple-reader one-writer synchro., or can they all
 go one at a time?  Is deadlock a concern?  Can you use OBL(OE) one big
 lock over everything, or do you need individual locks on elements of
 the data structure?
 
 Can you use fire-and-forget access, or do you need return values from
 your calls?  Do you need to wait for completion of anything?
 
 'xmlrpc': remote procedure calls might pertain.
 --
 http://mail.python.org/mailman/listinfo/python-list

My intention is to have one object which is updated by two threads, and
read by a third... one updating thread takes user input, and the other
talks to a remote server.  Which are children of the others really
doesn't matter to me; if there's a way to handle that to make this work
more efficiently, excellent.

Yes, I can use a single lock over the entire object; finer control isn't
necessary.  I also don't need to avoid blocking on reads; reading during
a write is something I also want to avoid.

I was hoping to end up with something simple and readable (hence trying
'with' and then regular locks first), but if pipes are the best way to
do this, I'll have to start figuring those out.  If I have to resort to
XMLRPC I'll probably ditch the process idea and leave it at threads,
though I'd rather avoid this.

Is there, perhaps, a sensible way to apply queues and/or managers to
this?  Namespaces also seemed promising, but having no experience with
these things, the doc did not get me far.
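
A minimal sketch of the manager route asked about here: a Manager-owned
dict shared by all three processes, under the one-big-lock scheme
described above.  The worker bodies are placeholders, not the actual
application logic:

from multiprocessing import Manager, Process

def user_input(shared, lock):
    # Placeholder for the worker that takes user input.
    with lock:
        shared['user'] = 'typed text'

def server_talk(shared, lock):
    # Placeholder for the worker that talks to the remote server.
    with lock:
        shared['server'] = 'server reply'

def reader(shared, lock):
    # Placeholder for the worker that only reads.
    with lock:
        print(shared.copy())

if __name__ == '__main__':
    manager = Manager()
    shared = manager.dict()   # one shared object, visible to all workers
    lock = manager.Lock()     # the single big lock described above
    procs = [Process(target=f, args=(shared, lock))
             for f in (user_input, server_talk, reader)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()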

Re: Sharing objects between processes

2009-03-09 Thread Aaron Brady
On Mar 9, 4:21 pm, ET p...@2drpg.org wrote:
 On Mon, 2009-03-09 at 13:58 -0700, Aaron Brady wrote:
  On Mar 9, 2:17 pm, ET p...@2drpg.org wrote:
   On Mon, 2009-03-09 at 11:04 -0700, Aaron Brady wrote:
snip
  Here's what we have to work with from you:

   The problem, as briefly as possible:
   I have three processes which need to safely read and update two objects.

  Can they be subprocesses of each other?  That is, can one master spawn
  the others as you desire?  Can you have one process running, and
  connect to it with sockets, pipes, (mailslots,) etc., and just pass
  information to and from it?  Then, synch. is a lot easier.

  Do you need MROW multiple-reader one-writer synchro., or can they all
  go one at a time?  Is deadlock a concern?  Can you use OBL(OE) one big
  lock over everything, or do you need individual locks on elements of
  the data structure?

  Can you use fire-and-forget access, or do you need return values from
  your calls?  Do you need to wait for completion of anything?

  'xmlrpc': remote procedure calls might pertain.
  --
 http://mail.python.org/mailman/listinfo/python-list

 My intention is to have one object which is updated by two threads, and
 read by a third... one updating thread takes user input, and the other
 talks to a remote server.  Which are children of the others really
 doesn't matter to me; if there's a way to handle that to make this work
 more efficiently, excellent.

 Yes, I can use a single lock over the entire object; finer control isn't
 necessary.  I also don't need to avoid blocking on reads; reading during
 a write is something I also want to avoid.

 I was hoping to end up with something simple and readable (hence trying
 'with' and then regular locks first), but if pipes are the best way to
 do this, I'll have to start figuring those out.  If I have to resort to
 XMLRPC I'll probably ditch the process idea and leave it at threads,
 though I'd rather avoid this.

 Is there, perhaps, a sensible way to apply queues and/or managers to
 this?  Namespaces also seemed promising, but having no experience with
 these things, the doc did not get me far.

I will hack at your requirements for a while.
--
http://mail.python.org/mailman/listinfo/python-list


Re: Sharing objects between processes

2009-03-09 Thread Aaron Brady
On Mar 9, 4:21 pm, ET p...@2drpg.org wrote:
 snip

Here is some code and the output.  There's a risk that the 'read'
thread drops the last input from the writers.  It's not tested well
(only once).

One thread writes a random key in the range 1-100 with a random value
in the range 'a'-'j' to a dictionary every second.  Another thread
waits for user input and writes it as a value under successively
higher keys, starting at 101, to the same dictionary.  The reader
thread obtains a representation of the dictionary and appends it to a
file, unsorted, every second.

The output is this:

{}
{87: 'd'}
{76: 'd', 87: 'd'}
{80: 'e', 76: 'd', 87: 'd'}
{80: 'e', 12: 'e', 76: 'd', 87: 'd'}
{80: 'e', 12: 'e', 71: 'h', 76: 'd', 87: 'd'}
{99: 'f', 101: 'abcd', 71: 'h', 76: 'd', 12: 'e', 80: 'e', 87: 'd'}
{99: 'f', 101: 'abcd', 102: 'efgh', 71: 'h', 76: 'd', 10: 'd', 12: 'e', 80: 'e', 87: 'd'}
{99: 'f', 101: 'abcd', 102: 'efgh', 71: 'h', 76: 'd', 10: 'd', 11: 'g', 12: 'e', 80: 'e', 87: 'd', 103: 'ijklm'}

Which is highly uninteresting.  You can see my inputs at 101, 102, and
103.  The code is this.

import threading
import time


class Globals:
    cont= True
    content= { }
    lock_content= threading.Lock( )

def read( ):
    out= open( 'temp.txt', 'w' )
    while Globals.cont:
        with Globals.lock_content:
            rep= repr( Globals.content )
        out.write( rep )
        out.write( '\n' )
        time.sleep( 1 )

def write_random( ):
    import random
    while Globals.cont:
        num= random.randint( 1, 100 )
        letter= random.choice( 'abcedfghij' )
        with Globals.lock_content:
            Globals.content[ num ]= letter
        time.sleep( 1 )

def write_user_inp( ):
    next_key= 101
    while Globals.cont:
        us_in= input( '%i-- '% next_key )
        if not us_in:
            Globals.cont= False
            return
        us_in= us_in[ :10 ]
        print( 'You entered: %s'% us_in )
        with Globals.lock_content:
            Globals.content[ next_key ]= us_in
        next_key+= 1

read_thr= threading.Thread( target= read )
read_thr.start( )
wri_rand_thr= threading.Thread( target= write_random )
wri_rand_thr.start( )
wri_user_thr= threading.Thread( target= write_user_inp )
wri_user_thr.start( )

read_thr.join( )
wri_rand_thr.join( )
wri_user_thr.join( )

Which is about the complexity of what you asked for.  It wasn't tested
with multiprocessing.  Ver 3.0.1.
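
A guess at the minimal change a multiprocessing port would need:
module-level globals like the Globals class are copied into each child
process, so the shared dict and lock have to come from a Manager and be
passed in explicitly.  Untested as well; names are illustrative:

from multiprocessing import Manager, Process
import random

def write_random(content, lock):
    # 'content' is a manager proxy; unlike a module-level dict,
    # updates made here are visible to the other processes.
    for _ in range(5):
        with lock:
            content[random.randint(1, 100)] = random.choice('abcdefghij')

if __name__ == '__main__':
    manager = Manager()
    content = manager.dict()
    lock = manager.Lock()
    writers = [Process(target=write_random, args=(content, lock))
               for _ in range(2)]
    for w in writers:
        w.start()
    for w in writers:
        w.join()
    print(content.copy())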
--
http://mail.python.org/mailman/listinfo/python-list


Re: Sharing objects between processes

2009-03-09 Thread ET
On Mon, 2009-03-09 at 14:57 -0700, Aaron Brady wrote:
 
 snip

Wow, thanks for taking the time to put that together!  Unfortunately,
I've attempted to modify it to use multiprocessing without success.
This works when the threading import is used, falls through with
multiprocessing, and runs with processing, though modifications to the
Globals class do not stick:

from __future__ import with_statement

#from threading import Thread as ControlType, Lock
#from multiprocessing import Process as ControlType, Lock
from processing import Process as ControlType, Lock

import time


class Globals:
    cont= True
    content= { }
    lock_content= Lock( )
    inputs = ['itext', 'text input', 'testing text', 'test']

def read( ):
    #out= open( 'temp.txt', 'w' )
    while Globals.cont:
        with Globals.lock_content:
            rep= repr( Globals.content )
        print rep
        #out.write( rep )
        #out.write( '\n' )
        time.sleep( 1 )

def write_random( ):
    import random
    while Globals.cont:
        num= random.randint( 1, 100 )
        letter= random.choice( 'abcedfghij' )
        with Globals.lock_content:
            Globals.content[ num ]= letter
        time.sleep( 1 )

def write_user_inp( ):
    import random
    next_key= 101
    while Globals.cont:
        if len(Globals.inputs):
            us_in = Globals.inputs.pop()
        else:
            Globals.cont= False
            return
        us_in= us_in[ :10 ]
        print( 'You entered: %s'% us_in )
        with Globals.lock_content:
            Globals.content[ next_key ]= us_in
        next_key+= 1
        time.sleep( 1 )

read_thr= ControlType( target= read )
read_thr.start( )

wri_rand_thr= ControlType( target= write_random )
wri_rand_thr.start( )

wri_user_thr= ControlType( target= write_user_inp )
wri_user_thr.start( )

read_thr.join( )
wri_rand_thr.join( )
wri_user_thr.join( )


However, I think I (freaking finally) managed to get this working after
reading the syncmanager documentation for the umpteenth time; it
involves extending the SyncManager class to allow an arbitrary class as
a member (this might be possible just by calling register(...) on
SyncManager itself, but this is how the doc said to handle it).  It also
provides its own Lock implementations, which I used within the new
managed class for 'with'-based locking:


from __future__ import with_statement

from multiprocessing import Process, Lock, current_process
from multiprocessing.managers import SyncManager

from random import choice, randint
from time import sleep

class TestClass(object):
    name = 'testclassname'
    lock = None

    def setval(self, value):
        with self.lock:
            self.name = value

    def getval(self):
        with self.lock:
            return self.name

    def __init__(self, lock):
        self.lock = lock

class TestManager(SyncManager):
    pass

TestManager.register('TC', TestClass)

manager = TestManager()
manager.start()

class TestProcess(Process):
    obj = None
    new_name = None

    def __init__(self, obj):
        self.obj = obj

        Process.__init__(self)

    def run(self):
        for i in
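
ET's message is cut off in the archive at this point.  For reference, a
minimal self-contained sketch of the same register-a-class-with-a-manager
pattern; class and method names here are illustrative, not ET's.  Note
that a plain threading.Lock inside the class suffices, because proxied
method calls all execute in the manager's server process:

from multiprocessing import Process
from multiprocessing.managers import BaseManager
import threading

class SharedState(object):
    # Lives in the manager's server process; proxies forward
    # method calls here.
    def __init__(self):
        self._lock = threading.Lock()
        self._name = 'initial'

    def setval(self, value):
        with self._lock:
            self._name = value

    def getval(self):
        with self._lock:
            return self._name

class StateManager(BaseManager):
    pass

StateManager.register('SharedState', SharedState)

def worker(state, value):
    state.setval(value)            # executes in the manager process

if __name__ == '__main__':
    manager = StateManager()
    manager.start()
    state = manager.SharedState()  # returns a proxy
    p = Process(target=worker, args=(state, 'set by child'))
    p.start()
    p.join()
    print(state.getval())          # prints: set by child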

Sharing objects between processes

2009-03-08 Thread ET
I have been using the 'threading' library and decided to try swapping it
out for 'processing'... while it's awesome that processing so closely
mirrors the threading interface, I've been having trouble getting my
processes to share an object in a similar way.

Using the 'with' keyword didn't work, and using normal locks doesn't
result in the expected behavior (I can get an object to be accessible in
more than one process, and Python indicates that the instances are
living at the same address in memory, but changes in one process are not
reflected in the other[s]).  I'm sure this is because my expectations
are incorrect. :)

The problem, as briefly as possible:
I have three processes which need to safely read and update two objects.

I've been working with processing, multiprocessing, and parallel python,
trying to get this working... I suspect it can be accomplished with
managers and/or queues, but if there's an elegant way to handle it, I
have thus far failed to understand it.

I don't particularly care which library I use; if someone has done this
or can recommend a good method they're aware of, it'd be incredibly
helpful.

Thank you!

--
http://mail.python.org/mailman/listinfo/python-list


Re: Sharing objects between processes

2009-03-08 Thread Tim Golden

ET wrote:

Using the 'with' keyword didn't work...


Just an aside here for any multiprocessing maintainers
watching ;) . I expect that the "didn't work" here
refers to this bug:

 http://bugs.python.org/issue5261

Altho' if the OP cares to clarify, it might be something
else.

TJG
--
http://mail.python.org/mailman/listinfo/python-list


Re: Sharing objects between processes

2009-03-08 Thread Aaron Brady
On Mar 8, 1:36 pm, ET p...@2drpg.org wrote:
 I have been using the 'threading' library and decided to try swapping it
 out for 'processing'... while it's awesome that processing so closely
 mirrors the threading interface, I've been having trouble getting my
 processes to share an object in a similar way.

 Using the 'with' keyword didn't work, and using normal locks doesn't
 result in the expected behavior (I can get an object to be accessible in
 more than one process, and Python indicates that the instances are
 living at the same address in memory, but changes in one process are not
 reflected in the other[s]).  I'm sure this is because my expectations
 are incorrect. :)

 The problem, as briefly as possible:
 I have three processes which need to safely read and update two objects.

 I've been working with processing, multiprocessing, and parallel python,
 trying to get this working... I suspect it can be accomplished with
 managers and/or queues, but if there's an elegant way to handle it, I
 have thus far failed to understand it.

 I don't particularly care which library I use; if someone has done this
 or can recommend a good method they're aware of, it'd be incredibly
 helpful.

 Thank you!

There is POSH: Python Object Sharing, which I learned about a while
ago, but never used much.

http://poshmodule.sourceforge.net/

It's UNIX only.
--
http://mail.python.org/mailman/listinfo/python-list