[issue9897] multiprocessing problems

2010-09-19 Thread hume

New submission from hume hume...@gmail.com:

When using multiprocessing managers with a socket to communicate between the 
server process and the client process, enabling the global socket timeout 
feature (no matter how large the value is) always makes the client fail with:

  File "c:\python27\lib\multiprocessing\connection.py", line 149, in Client
    answer_challenge(c, authkey)
  File "c:\python27\lib\multiprocessing\connection.py", line 383, in answer_challenge
    message = connection.recv_bytes(256)         # reject large message
IOError: [Errno 10035]

This is not reasonable, because this behaviour makes it impossible for a 
process that uses managers to also use the socket module's global timeout 
feature.
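
A minimal sketch of the setup that triggers this (the manager class, typeid, 
address, port and authkey below are just placeholders, and a manager server 
is assumed to already be listening at that address):

import socket
from multiprocessing.managers import BaseManager

# The global default timeout puts every new socket into timeout mode,
# including the sockets multiprocessing creates internally for the
# manager connection.
socket.setdefaulttimeout(300)

class RemoteManager(BaseManager):
    pass

# Only the typeid is declared here; the callable lives on the server side.
RemoteManager.register('get_queue')

if __name__ == '__main__':
    mgr = RemoteManager(address=('127.0.0.1', 50000), authkey='secret')
    # The authentication handshake inside connection.Client() is where
    # IOError: [Errno 10035] shows up on Windows.
    mgr.connect()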

Another issue concerns line 138 in managers.py:
# do authentication later
self.listener = Listener(address=address, backlog=5)
self.address = self.listener.address

backlog=5 limits the listen queue to only 5 pending connections, which is not 
very user friendly; it would be better to make this an argument that the user 
can specify (see the sketch below).
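
For reference, Listener itself already accepts a backlog argument; it is only 
the hard-coded value inside managers.py that hides it. A rough sketch (the 
manager-level backlog keyword shown in the comment is only a suggestion and 
does not exist today):

from multiprocessing.connection import Listener

# Listener already exposes backlog; port 0 lets the OS pick a free port.
listener = Listener(address=('127.0.0.1', 0), backlog=128)
print listener.address      # the address that was actually bound
listener.close()

# What the requested manager-level option might look like (illustrative only):
# manager = SyncManager(address=('', 50000), authkey='secret', backlog=128)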

--
components: Library (Lib)
messages: 116854
nosy: hume
priority: normal
severity: normal
status: open
title: multiprocessing problems
versions: Python 2.7

___
Python tracker rep...@bugs.python.org
http://bugs.python.org/issue9897
___

[issue9897] multiprocessing problems

2010-09-19 Thread hume

hume hume...@gmail.com added the comment:

OK, I have refiled this to fix the problem with the unpleasant wording.

--

___
Python tracker rep...@bugs.python.org
http://bugs.python.org/issue9897
___

[issue9851] multiprocessing socket timeout will break client

2010-09-16 Thread hume

hume hume...@gmail.com added the comment:

Oh, it's obvious that you've found a stupid bug in my description; if that 
frustrated you, I'd like to say sorry.

So I would restate: this is not so user friendly; would you be kind enough 
to fix it?

--

___
Python tracker rep...@bugs.python.org
http://bugs.python.org/issue9851
___

[issue9851] multiprocessing socket timeout will break client

2010-09-14 Thread hume

New submission from hume hume...@gmail.com:

When using multiprocessing managers with a socket to communicate between the 
server process and the client process, enabling the global socket timeout 
feature (no matter how large the value is) always makes the client fail with:

  File "c:\python27\lib\multiprocessing\connection.py", line 149, in Client
    answer_challenge(c, authkey)
  File "c:\python27\lib\multiprocessing\connection.py", line 383, in answer_challenge
    message = connection.recv_bytes(256)         # reject large message
IOError: [Errno 10035]

This is not reasonable, because this behaviour makes it impossible for a 
process that uses managers to also use the socket module's global timeout 
feature.
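
A possible workaround, sketched below (the manager class, typeid, address, 
port and authkey are placeholders, and a manager server is assumed to be 
running at that address), is to clear the global default timeout just around 
the manager handshake and restore it afterwards:

import socket
from multiprocessing.managers import BaseManager

class RemoteManager(BaseManager):
    pass

RemoteManager.register('get_queue')

if __name__ == '__main__':
    mgr = RemoteManager(address=('127.0.0.1', 50000), authkey='secret')

    # Clear the global default timeout only while the manager connects, so
    # the sockets it creates internally are blocking, then restore it.
    saved = socket.getdefaulttimeout()
    socket.setdefaulttimeout(None)
    try:
        mgr.connect()
    finally:
        socket.setdefaulttimeout(saved)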

Another issue concerns line 138 in managers.py:
# do authentication later
self.listener = Listener(address=address, backlog=5)
self.address = self.listener.address

backlog=5 limits the listen queue to only 5 pending connections, which is 
stupid; you'd better make this an argument that the user can specify.

--
components: Library (Lib)
messages: 116375
nosy: hume
priority: normal
severity: normal
status: open
title: multiprocessing socket timeout will break client
versions: Python 2.7

___
Python tracker rep...@bugs.python.org
http://bugs.python.org/issue9851
___