"Oliver Schulze L." <[EMAIL PROTECTED]> writes:

> I think a fast solution could be: scanning all command-line passed
> URLs, and then "group" them based on the same hostname. Then
> download the URLs in groups using the same ftp connection.

That would be wrong because Wget is supposed to download the URLs in
the order presented, not in a random order.

It would also not work without the changes Mauro is referring to,
because the current code closes the connection after downloading each
file and forgets its state (which is mostly kept on the stack).  That
means grouping same-host URLs without rewriting the connection
handling accomplishes nothing.

Another solution would be to add a kludge for FTP analogous to the
way HTTP persistent connections are handled now.  That approach might
work for Wget 1.11 if someone is willing to work on it.
