This is a continuation of the thread 'reading an input stream'. I had to walk away from it for a few days over the holidays and then for other work, and I figured it best to break my confusion into separate chunks; I hope that's appropriate.

In short, my script needs to read a stream of XML data from a socket (port 2008). The data comes in from as many as 30 different machines, though usually 4 or fewer, at up to 3 messages per second from each machine at times. Messages arrive in a block format delimited by STX (\x02) and ETX (\x03). The script must send the data in those blocks to a parser (already built using lxml and an XSLT file) and then send the result out to Splunk using a native 'event writer'.
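To make the framing concrete, here is a minimal sketch of how the STX/ETX blocks described above might be split out of a raw byte buffer. The function name `extract_messages` and its return shape are my own illustration, not anything from an existing library; the key point is that a `recv()` can end mid-block, so leftover bytes must be carried over to the next read.

```python
# Hypothetical helper: split \x02 ... \x03 framed messages out of a byte buffer.
STX, ETX = b"\x02", b"\x03"

def extract_messages(buffer: bytes):
    """Return (complete_messages, leftover_bytes) from a raw buffer.

    Complete messages are returned without their STX/ETX delimiters;
    a partial trailing frame is returned as leftover to prepend to the
    next recv() so nothing is lost across reads.
    """
    messages = []
    while True:
        start = buffer.find(STX)
        if start == -1:
            return messages, b""             # no frame start: drop stray bytes
        end = buffer.find(ETX, start + 1)
        if end == -1:
            return messages, buffer[start:]  # partial frame: keep for next read
        messages.append(buffer[start + 1:end])
        buffer = buffer[end + 1:]
```

Each complete payload could then be handed to the existing lxml/XSLT parser.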
And finally, for now: I suspect I'll need to run each 'connection' (that is, the data from each sending machine) on a separate thread. I'm not well versed in threading, but I have read some of the documentation and it seems pretty straightforward; if I have questions I will of course ask. The preliminary question is: do I in fact need to do this? Is threading the way to go, or is running all the data through the same 'pipe' feasible/Pythonic/efficient?

regards,
Richard
_______________________________________________
Tutor maillist - [email protected]
To unsubscribe or change subscription options:
https://mail.python.org/mailman/listinfo/tutor
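For what it's worth, the standard library already packages the thread-per-connection pattern asked about above: `socketserver.ThreadingTCPServer` spawns one handler thread per connected machine. The sketch below is only an illustration under that assumption; `FrameHandler.process` is a hypothetical placeholder standing in for the real lxml/XSLT parse and Splunk write (here it just echoes the payload so the behaviour can be observed).

```python
import socket
import socketserver
import threading

STX, ETX = b"\x02", b"\x03"

class FrameHandler(socketserver.StreamRequestHandler):
    """Handles one sending machine; ThreadingTCPServer runs it in its own thread."""

    def handle(self):
        buf = b""
        while True:
            chunk = self.request.recv(4096)
            if not chunk:                    # sender closed the connection
                break
            buf += chunk
            # Pull every complete \x02 ... \x03 frame out of the buffer.
            while True:
                start = buf.find(STX)
                end = buf.find(ETX, start + 1)
                if start == -1 or end == -1:
                    break                    # no complete frame yet
                payload = buf[start + 1:end]
                buf = buf[end + 1:]
                self.process(payload)

    def process(self, payload: bytes):
        # Placeholder: in the real script this would go to the lxml/XSLT
        # parser and then to Splunk; echoing back is for demonstration only.
        self.request.sendall(payload)

def serve(port: int = 2008):
    """Start the threaded server in the background and return it."""
    server = socketserver.ThreadingTCPServer(("", port), FrameHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

With only a handful of senders at a few messages per second, threads (or a single-threaded `selectors`-based loop) should both cope easily; the threaded version is usually the simpler one to write.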
