Hi Daniel,

Thanks for the quick reply.

On 2018-08-07 16:56, Daniel Stenberg wrote:
> On Tue, 7 Aug 2018, Rajvardhan Deshmukh via curl-library wrote:

>> 1. I have written code [1] to download 2 files at a time (multiplexed over 2 streams), and I intend to download 100 files in total. Can you verify whether I'm reusing the connection properly? (My concern is that I call curl_easy_init() for each URL.) This doc [2] says that as long as I use the same multi handle I should be fine.

> It looks fine, and you can verify it yourself by reading the verbose
> output, as it states clearly when it reuses connections.

It reuses some connections, but most of the time it says "* Connection 75 seems to be dead!" / "* Closing connection 75" and opens a new one. Should I use some additional option to keep the connections alive in all cases?
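I'm not sure it is the right fix, but would enabling TCP keep-alive on the easy handles help keep an idle connection from being declared dead? A minimal sketch of what I mean (the URL and the 60/30 second values are just placeholders I made up):

    CURL *eh = curl_easy_init();
    curl_easy_setopt(eh, CURLOPT_URL, "https://example.com/file1"); /* placeholder URL */
    curl_easy_setopt(eh, CURLOPT_TCP_KEEPALIVE, 1L);  /* send TCP keep-alive probes */
    curl_easy_setopt(eh, CURLOPT_TCP_KEEPIDLE, 60L);  /* idle seconds before the first probe */
    curl_easy_setopt(eh, CURLOPT_TCP_KEEPINTVL, 30L); /* seconds between probes */
    curl_multi_add_handle(multi_handle, eh);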

> But I can also tell you that your program will busy-loop and use 100%
> CPU for as long as there are transfers going, and that is wasteful and
> ineffective. Don't do curl_multi_perform() in a loop like that.

To get the responses back to back (i.e. request the next file as soon as one finishes downloading), should I do something other than calling curl_multi_perform() in a loop? Or do you mean I should add a sleep so that it doesn't use 100% of the CPU?
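Or is the idea to wait on the multi handle instead of spinning? Here is a rough sketch of what I think you mean, using curl_multi_wait() with a 1000 ms timeout and starting the next download whenever one finishes; add_next_transfer() is a hypothetical helper of mine that adds an easy handle for the next URL and returns 1, or 0 when no URLs are left:

    CURLMsg *msg;
    int msgs_left;
    int still_running = 0;

    curl_multi_perform(multi_handle, &still_running);
    while (still_running) {
      /* block for up to 1000 ms until a connection has activity,
         instead of spinning on curl_multi_perform() */
      curl_multi_wait(multi_handle, NULL, 0, 1000, NULL);
      curl_multi_perform(multi_handle, &still_running);

      /* harvest finished transfers and start the next download for each one */
      while ((msg = curl_multi_info_read(multi_handle, &msgs_left))) {
        if (msg->msg == CURLMSG_DONE) {
          curl_multi_remove_handle(multi_handle, msg->easy_handle);
          curl_easy_cleanup(msg->easy_handle);
          if (add_next_transfer(multi_handle))
            still_running++;  /* keep looping so the new transfer gets driven */
        }
      }
    }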

>> 2. I also need to identify the API that requests chunks of data (so that the response data is not all written to memory at once, but delivered as multiple chunks).

> CURLOPT_WRITEFUNCTION is the *only* API you can receive data with if
> you don't want to store it directly to a file. libcurl can't deliver
> data in any other way than piece by piece.

Thanks for this. I'll look through the examples.
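Just to check my understanding: the write callback is invoked once for each piece of data as it arrives, roughly like this minimal sketch (write_chunk and the placeholder URL are my own names):

    #include <stdio.h>
    #include <curl/curl.h>

    /* Called by libcurl once per received chunk; size * nmemb bytes are at ptr. */
    static size_t write_chunk(char *ptr, size_t size, size_t nmemb, void *userdata)
    {
      size_t bytes = size * nmemb;
      size_t *total = (size_t *)userdata;
      *total += bytes;
      fprintf(stderr, "got a chunk of %zu bytes (%zu so far)\n", bytes, *total);
      (void)ptr;    /* a real callback would process/append ptr here */
      return bytes; /* returning less than 'bytes' aborts the transfer */
    }

    int main(void)
    {
      size_t total = 0;
      curl_global_init(CURL_GLOBAL_DEFAULT);
      CURL *eh = curl_easy_init();
      curl_easy_setopt(eh, CURLOPT_URL, "https://example.com/"); /* placeholder URL */
      curl_easy_setopt(eh, CURLOPT_WRITEFUNCTION, write_chunk);
      curl_easy_setopt(eh, CURLOPT_WRITEDATA, &total);
      curl_easy_perform(eh);
      curl_easy_cleanup(eh);
      curl_global_cleanup();
      return 0;
    }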

Thanks,
Raj
-------------------------------------------------------------------
Unsubscribe: https://cool.haxx.se/list/listinfo/curl-library
Etiquette:   https://curl.haxx.se/mail/etiquette.html
