On Wed, 19 Aug 2009, Ankur Saxena wrote:
Ok. I am currently using wget with the --mirror option to retrieve any newly added files from the FTP server. But I want to switch to libcurl so I can maintain a persistent connection, which should be more efficient. However, I am now not sure whether this whole process -- getting a list of files from the FTP server, parsing it, comparing it against the local files, and then downloading the files with a different timestamp (or size) -- will actually save me time over wget. Any comments?
Uhm, that's exactly what wget does, so if you do that over a persistent connection instead of disconnecting/reconnecting all the time, you should get better performance. Of course, if you're transferring large (or few) files the difference might not be that noticeable.
-- / daniel.haxx.se
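
[Editor's note: the mirroring step discussed above -- compare the remote listing against local files and fetch only what changed -- can be sketched as a small pure function. This is an illustrative sketch, not the actual code from the thread; the `files_to_fetch` name and the `(size, mtime)` tuples are hypothetical, and in a real libcurl or ftplib client you would keep one connection open and call this once per directory listing.]

```python
def files_to_fetch(remote, local):
    """Return names of remote files that are missing locally or whose
    size or modification time differs (the wget --mirror criterion).

    remote, local: dicts mapping filename -> (size, mtime).
    Hypothetical helper for illustration; a persistent-connection
    client would build `remote` from one LIST/MLSD over an open
    session instead of reconnecting per file.
    """
    stale = []
    for name, (size, mtime) in remote.items():
        if name not in local or local[name] != (size, mtime):
            stale.append(name)
    return stale
```

Only the files returned by the function need to be downloaded, and all of them can then be retrieved over the single open session, which is where the savings over wget's reconnect-per-file behavior would come from.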
