On Fri, 18 Feb 2011, Felix E. Klee wrote:

Sorry if this is off topic; it may be an OS issue.

I wrote a daemon for downloading data from web-services using cURL
Multi. The daemon runs on a 32bit Amazon EC2 instance with the Amazon
Linux AMI (CentOS).

Now, I wonder: How many downloads can be run in parallel? What are the limiting factors?

libcurl itself has no hard limits apart from memory constraints.

If you use the "plain" multi interface, it will run into problems with more than 1024 sockets if libcurl was built to use select() instead of poll(), and since that API is very select()-centric, your own application may well hit the same limit.

If you use that many connections (really, even if you only use them in the hundreds) you should use the multi_socket API anyway to reach proper performance: http://daniel.haxx.se/docs/poll-vs-select.html

If you use the multi_socket API, libcurl has no limitation at all.

If you do multiple connections to the same server(s), many servers will of course reject too many connections coming from the same client, and in general a server rejects new connections once it reaches its own limit.

--

 / daniel.haxx.se
-------------------------------------------------------------------
List admin: http://cool.haxx.se/list/listinfo/curl-library
Etiquette:  http://curl.haxx.se/mail/etiquette.html
