Hrvoje Niksic wrote:
> That would be wrong because Wget is supposed to download the URLs in
> the order presented, not in a random order.
It's not in a random order; it's grouped by host and ordered by the user.
> It would also not work without the changes Mauro is referring to
> because the current code closes the connection after downloading the
> file and forgets its state (which is mostly kept on the stack). Which
> means that grouping the same-host URLs without rewriting connection
> handling accomplishes nothing.
> Another solution would be to add a kludge for FTP the way HTTP
> persistent connections are dealt with now. That approach might work
> for a Wget 1.11 if someone is willing to work on it.
I suggested this approach because it would be far easier to add a new
option enabling a "reuse FTP connection" feature and to group all URLs
from the same host than to implement a full cache system.
This will work because it's much like doing:
wget -r -l 0 ftp://ftp.example.com/pub/*
This command will download all the ftp.example.com files using just
one connection.
Oliver
--
Oliver Schulze L.
<[EMAIL PROTECTED]>