> … images from those urls and place them in the respective
> directories. I have to extract the contents and download them
> simultaneously. I can extract the contents and do them one by one. My
> question is, for doing it simultaneously do I have to use threads?
>
> Please point me in the right direction.
>
> Thanks
>
> Aditya
See Twisted,
http://twistedmatrix.com/
in particular, Twisted Web's asynchronous HTTP client,
http://twistedmatrix.com/documents/current/web/
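Twisted may not be installed everywhere, but the same single-threaded, event-driven idea can be sketched with the standard library's asyncio (which did not exist when this thread was written). The URLs are made up and fetch() is a stand-in for a real asynchronous HTTP request, with asyncio.sleep simulating time spent waiting on the network:

```python
import asyncio

# Hypothetical stand-in for an asynchronous HTTP fetch; asyncio.sleep
# simulates network latency instead of doing real I/O.
async def fetch(url):
    await asyncio.sleep(0.01)
    return f"contents of {url}"

async def main():
    urls = [f"http://example.com/page{i}" for i in range(5)]
    # gather() runs all five fetches concurrently in a single thread.
    return await asyncio.gather(*(fetch(u) for u in urls))

pages = asyncio.run(main())
print(len(pages))  # → 5
```

No threads or extra processes are involved; the event loop interleaves the five waits, which is exactly the trade Twisted offers.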
You've been given some bad advice here.
First -- threads are lighter-weight than processes, so threads are
probably *more* efficient. However, with only five t…
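For what it's worth, a minimal sketch of the threaded approach for five concurrent downloads could look like this. The URLs are invented and the body of download() is a placeholder for a real fetch such as urllib.request.urlopen(url).read(); because such threads spend most of their time blocked on network I/O, the GIL is not a bottleneck for this workload:

```python
import threading

# Hypothetical URLs; download() is a placeholder for a real fetch.
urls = [f"http://example.com/image{i}.jpg" for i in range(5)]
results = {}
lock = threading.Lock()

def download(url):
    data = f"bytes of {url}"   # pretend download
    with lock:                 # guard the shared dict
        results[url] = data

threads = [threading.Thread(target=download, args=(u,)) for u in urls]
for t in threads:
    t.start()                  # all five now run concurrently
for t in threads:
    t.join()                   # wait for every download to finish
print(len(results))  # → 5
```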
Thanks. I will look into multiprocessing.
Aditya
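Since the original poster settled on multiprocessing, here is a minimal sketch of that route. The URLs are invented and download() is a placeholder for a real fetch; note it must be a module-level function so the pool can pickle it and send it to the worker processes:

```python
import multiprocessing

def download(url):
    # Placeholder for a real fetch; module-level so it is picklable.
    return url, f"bytes of {url}"

def download_all(urls):
    # One worker process per url, up to five at a time.
    with multiprocessing.Pool(processes=5) as pool:
        return dict(pool.map(download, urls))

if __name__ == "__main__":
    urls = [f"http://example.com/image{i}.jpg" for i in range(5)]
    results = download_all(urls)
    print(len(results))  # → 5
```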
--
http://mail.python.org/mailman/listinfo/python-list
> … My question is, for doing it simultaneously do I have to use threads?
>
> Please point me in the right direction.

Threads in Python are very easy to work with, but they are not very
efficient, and in most cases slower than running …
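One way to settle the threads-versus-processes debate empirically is concurrent.futures (added to the standard library well after this thread, in Python 3.2), since it lets you swap the worker model without touching the rest of the code. The URLs are invented and download() stands in for a real network fetch:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical URLs; download() is a placeholder for a real fetch.
urls = [f"http://example.com/image{i}.jpg" for i in range(5)]

def download(url):
    return f"bytes of {url}"

# With ThreadPoolExecutor the five downloads run in threads; replacing
# it with ProcessPoolExecutor switches to worker processes with no
# other changes, which makes timing the two approaches easy.
with ThreadPoolExecutor(max_workers=5) as pool:
    results = list(pool.map(download, urls))
print(len(results))  # → 5
```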
> … contents and do them one by one. My question is, for doing it
> simultaneously do I have to use threads?

No. You could spawn 5 copies of wget (or curl or a Python program that
you've written). Whether or not that will perform better or be easier
to code, debug and maintain depends on …
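The spawn-five-copies suggestion above can be sketched with subprocess. The URLs are invented, and with real downloads each command would be something like ["wget", "-P", directory, url]; here a stand-in child process that merely echoes its url keeps the sketch runnable without network access:

```python
import subprocess
import sys

# Hypothetical URLs extracted from the page.
urls = [f"http://example.com/image{i}.jpg" for i in range(5)]

# Popen returns immediately, so all five children start concurrently.
# Each child is a stand-in for a real ["wget", "-P", directory, url].
procs = [
    subprocess.Popen([sys.executable, "-c", f"print({url!r})"])
    for url in urls
]

# Wait for every child and collect its exit status.
exit_codes = [p.wait() for p in procs]
print(exit_codes)  # → [0, 0, 0, 0, 0]
```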