I am trying to rewrite a Perl automation that started a "monitoring" application on many machines via RSH, and then multiplexed their collective outputs to stdout.
In production there are lots of these subprocesses, but here is a
simplified example of what I have so far (python n00b alert!)

- SNIP ---------
import subprocess, select, sys

speakers = []
lProc = []

for machine in ['box1', 'box2', 'box3']:
    p = subprocess.Popen(
        'echo ' + machine + '; sleep 2; echo goodbye; sleep 2; echo cruel; sleep 2; echo world',
        shell=True,  # the command string uses ';', so it must go through the shell
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        stdin=None,
        universal_newlines=True,
    )
    lProc.append(p)
    speakers.append(p.stdout)

while speakers:
    speaking = select.select(speakers, [], [], 1000)[0]
    for speaker in speaking:
        speech = speaker.readlines()
        if speech:
            for sentence in speech:
                print sentence.rstrip('\n')
            sys.stdout.flush()  # sanity check
        else:  # EOF
            speakers.remove(speaker)
- SNIP ---------

The problem with the above is that the subprocess buffers all its output
when used like this and, hence, this automation is not informing me of
much :)

In Perl, "realtime" feedback was provided by setting the following:

    $p->stdout->blocking(0);

How do I achieve this in Python?

This topic seems to have come up more than once. I am hoping that things
have moved on from posts like this:

http://groups.google.com/group/comp.lang.python/browse_thread/thread/5472ce95eb430002/434fa9b471009ab2?q=blocking&rnum=4#434fa9b471009ab2

as I don't really want to have to write all that ugly
fork/dup/fcntl/exec code to achieve this when high-level libraries like
"subprocess" really should have corresponding methods.

If it makes anything simpler, I only *need* this on Linux/Unix (Windows
would be a nice extra though).

thanks for reading,

Marc
--
http://mail.python.org/mailman/listinfo/python-list
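[For the record, here is a sketch of the sort of low-level fcntl approach I am hoping not to have to hand-roll: flip O_NONBLOCK on the pipe's file descriptor (the Linux/Unix equivalent of Perl's blocking(0)) and then read with os.read() instead of readlines(), so each read returns whatever is available instead of waiting for EOF. The single echo/sleep child is just a stand-in for the real monitoring command.]

```python
import fcntl
import os
import select
import subprocess

# A child whose output trickles out over time (Unix only).
p = subprocess.Popen(
    'echo hello; sleep 1; echo world',
    shell=True,
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
)

# Equivalent of Perl's $p->stdout->blocking(0): set O_NONBLOCK on the fd.
fd = p.stdout.fileno()
flags = fcntl.fcntl(fd, fcntl.F_GETFL)
fcntl.fcntl(fd, fcntl.F_SETFL, flags | os.O_NONBLOCK)

chunks = []
while True:
    ready, _, _ = select.select([fd], [], [], 10)
    if not ready:
        break                    # timeout: nothing spoke for 10s
    chunk = os.read(fd, 4096)    # returns whatever is available right now
    if not chunk:                # empty read means EOF
        break
    chunks.append(chunk)

output = b''.join(chunks)
```

With several subprocesses this generalises in the obvious way: pass all the descriptors to select() and os.read() from whichever come back ready, exactly as the readlines() loop above does, but without blocking until the child exits.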