On 1 Dec 2011, at 12:57, Matthew Ford wrote:

> On 1 Dec 2011, at 12:21, Michael Wood wrote:
>
>> On 1 December 2011 13:02, Matthew Ford <[email protected]> wrote:
>>> Hi,
>>>
>>> I'm trying to build on the 10-at-a-time.c example code
>>> (http://curl.haxx.se/libcurl/c/10-at-a-time.html) to write a small utility
>>> that reads in a bunch of hostnames from a file and tries to connect to them
>>> using the multi interface.
>>>
>>> I'm finding that this works fine up to a little over 1000 hostnames, and
>>> thereafter I get output, but there is no actual connection taking place
>>> behind the scenes.
>>
>> Check your resource limits for the number of open file descriptors. e.g.:
>>
>> $ ulimit -n
>> 1024
>>
>>> Is there an obvious reason for this limit either in the example code, or
>>> some aspect of the way libcurl is built?
>>
>> I suspect this is your OS (resource limits) rather than libcurl or the
>> example code.
>
> Bingo!
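Raising the descriptor limit got things connecting again. For anyone else
who hits this, here is a minimal sketch of raising the limit from inside
the program rather than via the shell (untested as pasted; the 65535 cap
just mirrors my new ulimit -n, and raising the hard limit would still
need the right privileges):

#include <stdio.h>
#include <sys/resource.h>

int main(void)
{
  struct rlimit rl;

  /* read the current soft/hard caps on open file descriptors */
  if(getrlimit(RLIMIT_NOFILE, &rl)) {
    perror("getrlimit");
    return 1;
  }
  printf("soft: %llu hard: %llu\n",
         (unsigned long long)rl.rlim_cur,
         (unsigned long long)rl.rlim_max);

  /* raise the soft limit as far as the hard limit allows,
     capped at 65535 */
  rl.rlim_cur = (rl.rlim_max < 65535) ? rl.rlim_max : 65535;
  if(setrlimit(RLIMIT_NOFILE, &rl)) {
    perror("setrlimit");
    return 1;
  }
  return 0;
}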
Actually, I think I spoke too soon. I'm now getting results, but the
performance is much slower once we reach 1024 hostnames. I have
ulimit -n = 65535, and if I grep FDSize /proc/<PID>/status I get 2048
returned after we reach 1024 hostnames queried.

Are there other limits I might be running into?

Mat
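P.S. One candidate I'm now looking at is FD_SETSIZE: as far as I can
tell, 10-at-a-time.c drives its transfers with curl_multi_fdset() and
select(), and select() can't usefully watch descriptors numbered
FD_SETSIZE (commonly 1024) or higher, which would match a slowdown that
starts right at 1024. A rough sketch of that select() step with a guard
added (the wait_on_multi() wrapper is hypothetical; the libcurl calls
are real):

#include <curl/curl.h>
#include <sys/select.h>

/* hypothetical wrapper: one select() pass over a multi handle,
   returning -1 if the fd_sets cannot represent the descriptors */
static int wait_on_multi(CURLM *cm)
{
  fd_set R, W, E;
  int M = -1;           /* highest descriptor in the sets */
  long L = -1;          /* libcurl's suggested timeout, in ms */
  struct timeval T;

  FD_ZERO(&R);
  FD_ZERO(&W);
  FD_ZERO(&E);

  if(curl_multi_fdset(cm, &R, &W, &E, &M))
    return -1;

  /* select() cannot watch fds at or above FD_SETSIZE (often 1024);
     past that point the sets are no longer trustworthy */
  if(M >= FD_SETSIZE)
    return -1;

  curl_multi_timeout(cm, &L);
  if(L < 0)
    L = 100;
  T.tv_sec = L / 1000;
  T.tv_usec = (L % 1000) * 1000;

  return select(M + 1, &R, &W, &E, &T);
}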
