On Wed, Jan 6, 2010 at 6:24 AM, Philip Semanchuk <phi...@semanchuk.com> wrote:

>
> On Jan 6, 2010, at 12:45 AM, Brian J Mingus wrote:
>
>> On Tue, Jan 5, 2010 at 9:36 PM, Philip Semanchuk <phi...@semanchuk.com> wrote:
>>
>>
>>> On Jan 5, 2010, at 11:26 PM, aditya shukla wrote:
>>>
>>>> Hello people,
>>>>
>>>> I have 5 directories corresponding to 5 different URLs. I want to
>>>> download images from those URLs and place them in the respective
>>>> directories. I have to extract the contents and download them
>>>> simultaneously. I can extract the contents and do them one by one.
>>>> My question is: to do it simultaneously, do I have to use threads?
>>>>
>>>>
>>> No. You could spawn 5 copies of wget (or curl or a Python program that
>>> you've written). Whether or not that will perform better or be easier to
>>> code, debug and maintain depends on the other aspects of your program(s).
>>>
>>> bye
>>> Philip
>>>
>>
>>
>> Obviously, spawning 5 copies of wget is equivalent to starting 5 threads.
>> The answer is 'yes'.
>>
>
> ???
>
> Process != thread


Just like the other nitpicker, it is up to you to explain why the
differences, and not the similarities, are relevant to this problem.
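For completeness, both routes from the thread can be sketched in a few lines of (modern) Python. This is only a minimal illustration: the URL-to-directory mapping is hypothetical, and the process-based variant assumes `wget` is on the PATH, as Philip's suggestion implies.

```python
import os
import subprocess
import threading
from urllib.request import urlretrieve


def fetch(url, dest):
    """Download one URL to its destination path (runs in its own thread)."""
    os.makedirs(os.path.dirname(dest) or ".", exist_ok=True)
    urlretrieve(url, dest)


def fetch_all_threaded(mapping):
    """Thread route: start one thread per URL, then wait for all of them."""
    threads = [threading.Thread(target=fetch, args=(url, dest))
               for url, dest in mapping.items()]
    for t in threads:
        t.start()
    for t in threads:
        t.join()


def fetch_all_processes(mapping):
    """Process route (Philip's suggestion): spawn one wget per URL, then wait.
    Assumes wget is installed; each child process downloads independently."""
    procs = [subprocess.Popen(["wget", "-q", "-O", dest, url])
             for url, dest in mapping.items()]
    for p in procs:
        p.wait()


# Hypothetical usage -- the real five URLs/directories would go here:
# fetch_all_threaded({"http://example.com/a.jpg": "dir1/a.jpg", ...})
```

Either way the downloads overlap in time; the difference is whether the concurrency lives in threads inside one interpreter or in separate OS processes.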
-- 
http://mail.python.org/mailman/listinfo/python-list
