Let's say there is a new zip file with updated information posted every 30 minutes on a remote website. I want to connect to this website every 30 minutes, download the file, extract the information, and then have the program search the file for certain items.
Would it be better to use threads to break this up? I'd have one thread download the data and another actually process it. Or would it be better to use fork?
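
For reference, here is a minimal sketch of the thread-based approach I'm describing: one thread polls and downloads, a second thread pulls each archive off a queue and searches it. The URL, search terms, and member handling are placeholders, not the real feed.

import io
import queue
import threading
import time
import urllib.request
import zipfile

FEED_URL = "http://example.com/updates.zip"   # placeholder URL for the remote site
SEARCH_TERMS = ["item1", "item2"]             # placeholder items to look for
POLL_INTERVAL = 30 * 60                       # 30 minutes, in seconds

work_queue = queue.Queue()

def downloader():
    """Fetch the zip file every POLL_INTERVAL seconds and hand its bytes to the processor."""
    while True:
        try:
            with urllib.request.urlopen(FEED_URL) as resp:
                work_queue.put(resp.read())
        except OSError as exc:
            print("download failed:", exc)
        time.sleep(POLL_INTERVAL)

def processor():
    """Open each downloaded archive and scan its members for the wanted items."""
    while True:
        data = work_queue.get()
        with zipfile.ZipFile(io.BytesIO(data)) as archive:
            for name in archive.namelist():
                text = archive.read(name).decode("utf-8", errors="replace")
                for term in SEARCH_TERMS:
                    if term in text:
                        print("found %r in %s" % (term, name))
        work_queue.task_done()

if __name__ == "__main__":
    threading.Thread(target=downloader, daemon=True).start()
    threading.Thread(target=processor, daemon=True).start()
    # Keep the main thread alive; the daemon threads exit when it does.
    while True:
        time.sleep(60)

A fork/multiprocessing version would look much the same, with a multiprocessing.Queue carrying the downloaded bytes between processes instead of a queue.Queue between threads.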