Excellent! Will be waiting ;)

I think a fast solution could be: scan all URLs passed on the command line, "group"
them by hostname, and then download each group of URLs over the
same ftp connection.
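A rough sketch of that grouping step, in Python just for illustration (wget itself is written in C; the function name and example URLs here are made up):

```python
# Sketch: bucket command-line URLs by hostname, so that each bucket
# could in principle be fetched over a single FTP connection.
from collections import defaultdict
from urllib.parse import urlparse

def group_by_host(urls):
    """Return a dict mapping hostname -> list of URLs on that host."""
    groups = defaultdict(list)
    for url in urls:
        groups[urlparse(url).hostname].append(url)
    return dict(groups)

urls = [
    "ftp://ftp.example.com/pub/file1.txt",
    "ftp://ftp.example.com/pub/another-file.txt",
    "ftp://ftp.other.org/pub/readme.txt",
]
print(group_by_host(urls))
```

With the two example.com URLs landing in one bucket, only two logins would be needed instead of three.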

Regards,
Oliver

Mauro Tortonesi wrote:

Oliver Schulze L. wrote:

Hi,
I wonder if it is possible to make wget reuse an open ftp connection when
you pass 2 URLs as parameters and both are on the same server?

For example:
wget -c ftp://ftp.example.com/pub/file1.txt ftp://ftp.example.com/pub/another-file.txt
will make 2 connections, 2 logins, etc.

The problem I have is that since wget does not support regex, I have to pass nearly 300 parameters to wget, so the ftp server I download from gets overloaded.
Not to mention that doing login/logout is a time consuming task.


I am currently implementing a connection cache mechanism, which will be included in the next release of wget.


--
Oliver Schulze L.
<[EMAIL PROTECTED]>
